
AuraJS now ships a native-only authored media-presentation path built from a separate video surface and a separate spatial audio emitter. The contract is intentionally narrow:

- `@auraindustry/aurajs/cutscene`
- `aura.video.createPresentation(...)`
- `aura.audio.createSpatialEmitter(...)`

This contract does not claim engine-managed synchronized muxed A/V.
For authored game code, the canonical path is:

- `createMediaPresentationController(runtime, source, options?)`
- `createVideoBillboardSurface(videoRuntime, source, options?)`
- `createSpatialAudioEmitter(audioRuntime, path, options?)`

The runtime still exposes:

- `aura.video.createPresentation(source, options?)`
- `aura.video.createBillboardSurface(source, options?)`
- `aura.audio.createSpatialEmitter(path, options?)`

The JS helper layer is the preferred product surface because it owns:
- Video: `.mp4` loading through `aura.video`
- Audio: `aura.audio.play(...)` asset formats: a `.wav`, `.ogg`, `.flac`, or `.mp3` file

The authored presentation path supports:

- `play()`
- `pause()`
- `resume()`
- `stop()`
- `seek(timeSecs, options?)`
- `attach(target, options?)`
- `update(dt?)`
- `draw(options?)`
- `unload()`
- `getState()`

The current honest behavior is:
- `aura.video.*`
- `aura.audio.*`

The improved native surface must keep these state families inspectable:
The shipped authored helper now exposes a combined view through
`createMediaPresentationController(...).getState()` that includes:

- `presentation`
- `surface`
- `emitter`
- `media`
- `cutscene`
- `audioSeekMode`
- `lastAction`
- `lastResult`
- `lastAudioResync`
- `unloaded`

The native audio surface now also truthfully exposes:

- `aura.audio.getListenerState()`
- `aura.audio.getSpatialState(handle?)`

These are for deterministic debugging, proof scenes, and runtime observability. They are not a promise of editor tooling.
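As a minimal sketch of how the combined `getState()` view and the audio observability surfaces might be polled together for a per-frame debug record — `fakePresentation` and `fakeAudio` below are hypothetical stand-ins, not engine API:

```js
// snapshotMedia: collect one deterministic debug record from the combined
// getState() view plus the listener/spatial audio state surfaces.
function snapshotMedia(presentation, audio) {
  return {
    media: presentation.getState(),
    listener: audio.getListenerState(),
    spatial: audio.getSpatialState(),
  };
}

// Hypothetical stubs so the sketch runs outside the engine.
const fakePresentation = {
  getState: () => ({ lastAction: 'play', audioSeekMode: 'restart', unloaded: false }),
};
const fakeAudio = {
  getListenerState: () => ({ position: [0, 0, 0] }),
  getSpatialState: () => [{ handle: 1, volume: 0.12 }],
};

const snap = snapshotMedia(fakePresentation, fakeAudio);
console.log(snap.media.lastAction); // 'play'
```

A snapshot like this can be logged each frame in a proof scene to make resync and lifecycle behavior reproducible.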
AuraJS still does not claim engine-managed synchronized muxed A/V.
Instead, the authored contract today is explicit, helper-owned resync between a video presentation and a separately loaded audio track.

Recommended authored mode today: `audioSeekMode: "restart"` when a presentation owns a separate ambient or local audio track and you want honest, visible resync behavior.

```js
import { createMediaPresentationController } from '@auraindustry/aurajs/cutscene';

const presentation = createMediaPresentationController(aura, 'cutscenes/intro.mp4', {
  audioPath: 'audio/intro.ogg',
  audioSeekMode: 'restart',
  loadOptions: {
    type: 'mp4',
    looping: true,
  },
  audio: {
    volume: 0.12,
    loop: true,
    refDistance: 2.4,
    maxDistance: 16,
  },
  cues: [
    { time: 0.2, type: 'subtitle', text: 'Native media presentation is active' },
  ],
  onSubtitle(cue) {
    hud.subtitle = cue.text;
  },
});

presentation.play();
presentation.seek(0.05);
```
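The `cues` / `onSubtitle` flow above can be pictured with a small scheduler that the controller's `update(dt)` presumably drives. This is an illustrative sketch, not the shipped implementation:

```js
// createCueScheduler: fire authored cues in time order as a clock advances.
function createCueScheduler(cues, onSubtitle) {
  let clock = 0;
  const pending = [...cues].sort((a, b) => a.time - b.time);
  return {
    update(dt) {
      clock += dt;
      // Dispatch every cue whose time has been reached.
      while (pending.length && pending[0].time <= clock) {
        const cue = pending.shift();
        if (cue.type === 'subtitle') onSubtitle(cue);
      }
    },
  };
}

const shown = [];
const scheduler = createCueScheduler(
  [{ time: 0.2, type: 'subtitle', text: 'Native media presentation is active' }],
  (cue) => shown.push(cue.text),
);

scheduler.update(0.1); // clock = 0.1, cue not due yet
scheduler.update(0.1); // clock = 0.2, cue fires
console.log(shown); // [ 'Native media presentation is active' ]
```

Draining due cues in a loop keeps dispatch deterministic even when a large `dt` skips past several cue times at once.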
This contract still leaves these frontiers open:

- engine-side decode of the audio track muxed inside an `.mp4` by `aura.video`
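Until muxed A/V is engine-managed, `audioSeekMode: "restart"` implies the helper resyncs the separate audio track itself on a video seek. A minimal sketch of one plausible reading of that policy — every name here is illustrative, not engine API:

```js
// resyncAudio: under 'restart', a video seek stops the separate audio track
// and replays it, rather than attempting a sample-accurate in-place seek.
function resyncAudio(mode, audioTrack, timeSecs) {
  if (mode === 'restart') {
    audioTrack.stop();
    audioTrack.play(timeSecs); // audible, honest restart near the seek target
    return { mode, resumedAt: timeSecs };
  }
  audioTrack.seek(timeSecs); // hypothetical in-place seek fallback
  return { mode, resumedAt: timeSecs };
}

// Record calls to show the restart sequencing.
const calls = [];
const track = {
  stop: () => calls.push('stop'),
  play: (t) => calls.push('play@' + t),
  seek: (t) => calls.push('seek@' + t),
};

resyncAudio('restart', track, 0.05);
console.log(calls); // [ 'stop', 'play@0.05' ]
```

The visible stop/replay is the "honest" part of the contract: the listener can hear the resync instead of the engine pretending the tracks were never apart.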