OfflineAudioContext: startRendering() method

The startRendering() method of the OfflineAudioContext interface starts rendering the audio graph, taking into account the current connections and the current scheduled changes.

The complete event (of type OfflineAudioCompletionEvent) is raised when the rendering is finished, containing the resulting AudioBuffer in its renderedBuffer property.

Browsers currently support two versions of the startRendering() method — an older event-based version and a newer promise-based version. The former will eventually be removed, but currently both mechanisms are provided for legacy reasons.
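
For orientation, here is a minimal sketch contrasting the two styles. It assumes offlineCtx is an existing OfflineAudioContext whose graph has already been set up; the event-based form is shown commented out because startRendering() must only be called once per context.

js

// Promise-based version (preferred)
offlineCtx.startRendering().then((renderedBuffer) => {
  console.log(`Rendered ${renderedBuffer.length} sample frames`);
});

// Event-based version (legacy); use this *instead of* the promise handler:
// offlineCtx.oncomplete = (event) => {
//   console.log(`Rendered ${event.renderedBuffer.length} sample frames`);
// };
// offlineCtx.startRendering();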

Syntax

js

startRendering()

Parameters

None.

Return value

A Promise that fulfills with an AudioBuffer.
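
As a minimal illustration (assuming offlineCtx is an OfflineAudioContext whose graph has already been configured), the returned promise can also be awaited inside an async function:

js

const renderedBuffer = await offlineCtx.startRendering();
console.log(renderedBuffer.duration); // duration of the rendered audio, in seconds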

Examples

In this simple example, we declare both an AudioContext and an OfflineAudioContext object. We use the AudioContext to load an audio track via XHR (BaseAudioContext.decodeAudioData), then feed the decoded audio to an AudioBufferSourceNode in the OfflineAudioContext and play the track through. Once the offline audio graph is set up, we render it to an AudioBuffer using OfflineAudioContext.startRendering().

When the startRendering() promise resolves, rendering has completed and the promise fulfills with the resulting AudioBuffer.

At this point we create another audio context, create an AudioBufferSourceNode inside it, and set its buffer to the rendered AudioBuffer. This is then played as part of a simple standard audio graph.

Note: For a working example, see our offline-audio-context-promise GitHub repo (see the source code too).

js

// Define online and offline audio contexts
const audioCtx = new AudioContext();
// 2 channels, 40 seconds of audio at a 44100 Hz sample rate
const offlineCtx = new OfflineAudioContext(2, 44100 * 40, 44100);

const source = offlineCtx.createBufferSource();

// Use XHR to load an audio track, decodeAudioData to decode it,
// and the OfflineAudioContext to render it

function getData() {
  const request = new XMLHttpRequest();

  request.open("GET", "viper.ogg", true);

  request.responseType = "arraybuffer";

  request.onload = () => {
    const audioData = request.response;

    audioCtx.decodeAudioData(audioData, (buffer) => {
      source.buffer = buffer;
      source.connect(offlineCtx.destination);
      source.start();
      //source.loop = true;
      offlineCtx
        .startRendering()
        .then((renderedBuffer) => {
          console.log("Rendering completed successfully");
          // Create a regular AudioContext to play the rendered buffer through
          const playbackCtx = new AudioContext();
          const song = playbackCtx.createBufferSource();
          song.buffer = renderedBuffer;

          song.connect(playbackCtx.destination);

          // "play" is assumed to be a <button id="play"> element in the page
          play.onclick = () => {
            song.start();
          };
        })
        .catch((err) => {
          console.error(`Rendering failed: ${err}`);
          // Note: The promise should reject when startRendering is called a second time on an OfflineAudioContext
        });
    });
  };

  request.send();
}

// Run getData to start the process off

getData();
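
The same flow can be written more concisely with fetch() and async/await. The following is a sketch under the same assumptions as the example above (a "viper.ogg" file served alongside the page and a button with the id "play"):

js

async function renderAndPlay() {
  const audioCtx = new AudioContext();
  const offlineCtx = new OfflineAudioContext(2, 44100 * 40, 44100);

  // Fetch and decode the track
  const response = await fetch("viper.ogg");
  const arrayBuffer = await response.arrayBuffer();
  const decoded = await audioCtx.decodeAudioData(arrayBuffer);

  // Wire up the offline graph and render it to an AudioBuffer
  const source = offlineCtx.createBufferSource();
  source.buffer = decoded;
  source.connect(offlineCtx.destination);
  source.start();
  const renderedBuffer = await offlineCtx.startRendering();

  // Play the rendered buffer through the regular AudioContext on click
  const song = audioCtx.createBufferSource();
  song.buffer = renderedBuffer;
  song.connect(audioCtx.destination);
  document.getElementById("play").onclick = () => {
    audioCtx.resume(); // in case the context started in a suspended state
    song.start();
  };
}

renderAndPlay().catch((err) => console.error(`Rendering failed: ${err}`));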

Specifications

Browser compatibility

Desktop

                   Chrome   Edge   Firefox   Internet Explorer   Opera   Safari
startRendering     25       12     25        No                  15      7
returns_promise    42       ≤18    37        No                  29      14.1

Mobile

                   WebView Android   Chrome Android   Firefox for Android   Opera Android   Safari on iOS   Samsung Internet
startRendering     4.4               25               25                    14              7               1.5
returns_promise    42                42               37                    29              14.5            4.0

See also

© 2005–2023 MDN contributors.
Licensed under the Creative Commons Attribution-ShareAlike License v2.5 or later.
https://developer.mozilla.org/en-US/docs/Web/API/OfflineAudioContext/startRendering