docs/native-media-presentation-contract-v1.md

Native Media Presentation Contract v1

Scope

AuraJS now ships a native-only authored media-presentation path that coordinates:

  • a video asset
  • a separate audio asset
  • JS-owned cues and subtitles
  • an in-world or screen-space presentation surface

The contract is intentionally narrow:

  • native desktop only
  • separate authored audio asset orchestration
  • JS-first control through @auraindustry/aurajs/cutscene
  • runtime-backed aura.video.createPresentation(...)
  • runtime-backed aura.audio.createSpatialEmitter(...)

This contract does not claim:

  • browser/video parity
  • muxed container audio playback guarantees
  • engine-owned subtitle rendering
  • editor/timeline authoring
  • perfect audio/video lockstep owned by the engine

Canonical Authored Path

For authored game code, the canonical path is:

  • createMediaPresentationController(runtime, source, options?)
  • createVideoBillboardSurface(videoRuntime, source, options?)
  • createSpatialAudioEmitter(audioRuntime, path, options?)

The runtime still exposes:

  • aura.video.createPresentation(source, options?)
  • aura.video.createBillboardSurface(source, options?)
  • aura.audio.createSpatialEmitter(path, options?)

The JS helper layer is the preferred product surface because it owns:

  • cue sequencing
  • checkpoint export/restore
  • authored seek behavior
  • separate-audio restart semantics
  • combined presentation state summaries
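The cue-sequencing responsibility above can be sketched in plain JS, independent of the runtime. This is an illustrative sketch of the JS-owned pattern, not the shipped helper; createCueSequencer and its shape are assumptions:

```javascript
// Minimal illustrative cue sequencer: fires each authored cue once as
// playback time advances, and supports deterministic seek/restore.
// Not the shipped helper; names here are hypothetical.
function createCueSequencer(cues, onCue) {
  // Sort by time and track the next cue that has not yet fired.
  const pending = [...cues].sort((a, b) => a.time - b.time);
  let nextIndex = 0;

  return {
    // Advance to the current playback time, firing any due cues in order.
    update(timeSecs) {
      while (nextIndex < pending.length && pending[nextIndex].time <= timeSecs) {
        onCue(pending[nextIndex]);
        nextIndex += 1;
      }
    },
    // Deterministic seek: cues at or before the target count as already fired.
    seek(timeSecs) {
      nextIndex = pending.findIndex((cue) => cue.time > timeSecs);
      if (nextIndex === -1) nextIndex = pending.length;
    },
  };
}
```

Because firing order depends only on cue times and the sequence of update/seek calls, the same authored inputs replay identically, which is what makes checkpoint export/restore tractable at the JS layer.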

Accepted Inputs

Video:

  • native desktop .mp4
  • frame-sequence directories
  • sprite-sheet playback sources already accepted by aura.video

Audio:

  • the existing committed aura.audio.play(...) asset formats
  • native media presentation is normally authored with a separate .wav, .ogg, .flac, or .mp3 file

Authored Semantics

The authored presentation path supports:

  • play()
  • pause()
  • resume()
  • stop()
  • seek(timeSecs, options?)
  • attach(target, options?)
  • update(dt?)
  • draw(options?)
  • unload()
  • getState()

The current honest behavior is:

  1. Video playback is runtime-owned through aura.video.
  2. Separate audio playback is runtime-owned through aura.audio.
  3. Coordination is JS-authored and deterministic.
  4. Seek behavior may restart or ignore audio depending on authored options.
  5. Checkpoint restore replays authored logic without pretending muxed A/V sync exists underneath.
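The seek semantics in point 4 can be sketched as a small decision function. The 'restart' value follows the documented audioSeekMode; the 'ignore' branch, the handle shapes, and the reason strings are illustrative assumptions, not the shipped API:

```javascript
// Illustrative seek coordination: video seek is delegated to the runtime,
// while the separate audio track is either restarted or deliberately left
// alone. 'restart' mirrors the documented audioSeekMode; 'ignore' and the
// reason codes are hypothetical.
function authoredSeek(video, audio, timeSecs, audioSeekMode) {
  video.seek(timeSecs); // video seek stays runtime-owned

  if (audioSeekMode === 'restart') {
    // Honest resync: restart the separate audio track from the top.
    audio.stop();
    audio.play();
    return { ok: true, reason: 'audio-restarted' };
  }

  // Otherwise the audio keeps playing; drift is accepted and surfaced
  // in state rather than hidden.
  return { ok: true, reason: 'audio-ignored' };
}
```

The point of returning a reason code is that the authored layer never pretends sync happened: whichever branch ran is visible to the caller and to state inspection.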

Inspection Contract

The improved native surface must keep these state families inspectable:

  • presentation state
  • billboard-surface state
  • spatial-emitter state
  • listener state
  • combined authored helper state

The shipped authored helper now exposes a combined view through createMediaPresentationController(...).getState() that includes:

  • presentation
  • surface
  • emitter
  • media
  • cutscene
  • audioSeekMode
  • lastAction
  • lastResult
  • lastAudioResync
  • unloaded

The native audio surface now also truthfully exposes:

  • aura.audio.getListenerState()
  • aura.audio.getSpatialState(handle?)

These are for deterministic debugging, proof scenes, and runtime observability. They are not a promise of editor tooling.
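For proof scenes and log-friendly debugging, the combined getState() view can be reduced to a one-line summary. The field names below follow the list above; summarizeMediaState itself is an illustrative helper, not part of the shipped surface:

```javascript
// Illustrative debug reducer over the documented combined-state fields
// (lastAction, audioSeekMode, unloaded, lastAudioResync). Missing fields
// degrade to placeholder values rather than throwing.
function summarizeMediaState(state) {
  const parts = [
    `action=${state.lastAction ?? 'none'}`,
    `audioSeekMode=${state.audioSeekMode ?? 'unset'}`,
    `unloaded=${Boolean(state.unloaded)}`,
  ];
  if (state.lastAudioResync) {
    parts.push(`lastAudioResync=${state.lastAudioResync}`);
  }
  return parts.join(' ');
}
```

A stable, deterministic string like this is easy to assert against in a proof scene, which is the observability use case this contract targets.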

Seek And Audio Truth

AuraJS still does not claim engine-managed synchronized muxed A/V.

Instead, the authored contract today is:

  • video seek is runtime-owned
  • separate audio may be restarted or left alone
  • the chosen behavior is surfaced in state and reason-coded results

Recommended authored mode today:

  • use audioSeekMode: "restart" when a presentation owns a separate ambient or local audio track and you want honest, visible resync behavior

Example

import { createMediaPresentationController } from '@auraindustry/aurajs/cutscene';

const presentation = createMediaPresentationController(aura, 'cutscenes/intro.mp4', {
  audioPath: 'audio/intro.ogg',
  audioSeekMode: 'restart',
  loadOptions: {
    type: 'mp4',
    looping: true,
  },
  audio: {
    volume: 0.12,
    loop: true,
    refDistance: 2.4,
    maxDistance: 16,
  },
  cues: [
    { time: 0.2, type: 'subtitle', text: 'Native media presentation is active' },
  ],
  onSubtitle(cue) {
    hud.subtitle = cue.text;
  },
});

presentation.play();
presentation.seek(0.05);

Non-Goals

This contract still leaves these frontiers open:

  • browser aura.video
  • muxed audio inside .mp4
  • subtitle/localization pipelines
  • cinematic timeline/editor tooling
  • deeper HRTF/occlusion authoring controls than the current runtime exposes