AudioContext: createMediaStreamDestination() method

The createMediaStreamDestination() method of the AudioContext interface creates a new MediaStreamAudioDestinationNode object associated with a WebRTC MediaStream representing an audio stream, which may be stored in a local file or sent to another computer.

The MediaStream is created when the node is created and is accessible via the MediaStreamAudioDestinationNode's stream attribute. This stream can be used in much the same way as a MediaStream obtained from navigator.mediaDevices.getUserMedia(). It can, for example, be sent to a remote peer using the addTrack() method of RTCPeerConnection.
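
For instance, here is a minimal sketch of that second case. It assumes an RTCPeerConnection named peerConnection has already been created and negotiated elsewhere; that setup is not covered on this page.

js

const audioCtx = new AudioContext();
const source = audioCtx.createOscillator();
const streamDestination = audioCtx.createMediaStreamDestination();

// Route the oscillator into the destination node; its output is now
// captured in streamDestination.stream rather than played through the speakers.
source.connect(streamDestination);
source.start();

// streamDestination.stream is an ordinary MediaStream, so its audio
// track(s) can be handed to a peer connection like any other track.
for (const track of streamDestination.stream.getAudioTracks()) {
  peerConnection.addTrack(track, streamDestination.stream);
}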

For more details about media stream destination nodes, check out the MediaStreamAudioDestinationNode reference page.

Syntax

js

createMediaStreamDestination()

Parameters

None.

Return value

A MediaStreamAudioDestinationNode.

Examples

In the following simple example, we create a MediaStreamAudioDestinationNode, an OscillatorNode, and a MediaRecorder. The MediaRecorder is set up to record audio from the MediaStreamAudioDestinationNode.

When the button is clicked, the oscillator and the MediaRecorder are started. When the button is clicked again, both are stopped. Stopping the MediaRecorder causes the dataavailable event to fire, and the event data is pushed into the chunks array. After that, the stop event fires, a new Blob of type audio/ogg; codecs=opus is created from the data in the chunks array, and the audio element's src is set to an object URL created from that blob.

From here, you can play and save the Opus file.

html

<!doctype html>
<html lang="en-US">
  <head>
    <meta charset="UTF-8" />
    <title>createMediaStreamDestination() demo</title>
  </head>
  <body>
    <h1>createMediaStreamDestination() demo</h1>

    <p>Encoding a pure sine wave to an Opus file</p>
    <button>Make sine wave</button>
    <audio controls></audio>
    <script>
      const b = document.querySelector("button");
      let clicked = false;
      const chunks = [];
      const ac = new AudioContext();
      const osc = ac.createOscillator();
      const dest = ac.createMediaStreamDestination();
      const mediaRecorder = new MediaRecorder(dest.stream);
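      // Send the oscillator's output into the destination node so it is captured in dest.stream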
      osc.connect(dest);

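      // Toggle recording: the first click starts the oscillator and recorder, the second stops both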
      b.addEventListener("click", (e) => {
        if (!clicked) {
          mediaRecorder.start();
          osc.start(0);
          e.target.textContent = "Stop recording";
          clicked = true;
        } else {
          mediaRecorder.stop();
          osc.stop(0);
          e.target.disabled = true;
        }
      });

      mediaRecorder.ondataavailable = (evt) => {
        // Push each recorded chunk (a Blob) into the chunks array
        chunks.push(evt.data);
      };

      mediaRecorder.onstop = (evt) => {
        // Combine the recorded chunks into a single Blob and play it in the audio element
        const blob = new Blob(chunks, { type: "audio/ogg; codecs=opus" });
        document.querySelector("audio").src = URL.createObjectURL(blob);
      };
    </script>
  </body>
</html>

Note: You can view this example live, or study the source code, on GitHub.
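
If you want to let the user save the recording directly instead of just playing it, one possible variation on the example's stop handler is sketched below. It is not part of the example above, and the file name is only a placeholder.

js

mediaRecorder.onstop = () => {
  // Combine the recorded chunks into a single Blob, as in the example.
  const blob = new Blob(chunks, { type: "audio/ogg; codecs=opus" });

  // Create a temporary link element that triggers a download of the recording.
  const link = document.createElement("a");
  link.href = URL.createObjectURL(blob);
  link.download = "sine-wave.ogg"; // placeholder file name
  link.click();
};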

Specifications

Browser compatibility

createMediaStreamDestination

Desktop:
- Chrome: 25
- Edge: 79
- Firefox: 25
- Internet Explorer: No support
- Opera: 15
- Safari: 11

Mobile:
- WebView Android: ≤37
- Chrome Android: 25
- Firefox for Android: 25
- Opera Android: 14
- Safari on iOS: 11
- Samsung Internet: 1.5

See also

© 2005–2023 MDN contributors.
Licensed under the Creative Commons Attribution-ShareAlike License v2.5 or later.
https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/createMediaStreamDestination