The createMediaStreamSource()
method of the AudioContext
interface creates a new MediaStreamAudioSourceNode
object from a given media stream (for example, one obtained from MediaDevices.getUserMedia()); the audio from that stream can then be played and manipulated.
For more details about media stream audio source nodes, see the MediaStreamAudioSourceNode
reference page.
createMediaStreamSource(stream)
The stream argument is the MediaStream to use as the audio source. The call returns a new MediaStreamAudioSourceNode
object representing the audio node whose media is obtained from the specified source stream.
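For instance, a minimal sketch of the call might look like the following; it assumes a secure context in which the user grants microphone access, and simply routes the captured audio straight to the default output:

// Minimal sketch: pipe microphone audio straight to the speakers
const audioCtx = new AudioContext();

navigator.mediaDevices
  .getUserMedia({ audio: true })
  .then((stream) => {
    // Create the MediaStreamAudioSourceNode from the captured stream
    const source = audioCtx.createMediaStreamSource(stream);
    // Connect it to the context's output so the audio is audible
    source.connect(audioCtx.destination);
  })
  .catch((err) => {
    console.log(`The following gUM error occurred: ${err}`);
  });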
In the example below, we grab a media (audio + video) stream from navigator.mediaDevices.getUserMedia
, feed the media into a <video>
element to play it (muting the element so its own output doesn't overlap the processed audio), and also feed the stream's audio into a MediaStreamAudioSourceNode
. Next, we feed this source audio into a lowshelf BiquadFilterNode
(which effectively serves as a bass booster), then into an AudioDestinationNode
.
The range slider below the <video>
element controls the amount of gain given to the lowshelf filter; increase the value of the slider to make the audio sound more bass heavy!
const pre = document.querySelector("pre");
const video = document.querySelector("video");
const myScript = document.querySelector("script");
const range = document.querySelector("input");

if (navigator.mediaDevices) {
  console.log("getUserMedia supported.");
  navigator.mediaDevices
    .getUserMedia({ audio: true, video: true })
    .then((stream) => {
      // Play the stream in the <video> element, muting it so the
      // element's own output doesn't overlap the processed audio
      video.srcObject = stream;
      video.onloadedmetadata = (e) => {
        video.play();
        video.muted = true;
      };

      // Create a MediaStreamAudioSourceNode from the stream
      const audioCtx = new AudioContext();
      const source = audioCtx.createMediaStreamSource(stream);

      // Create a lowshelf biquad filter to act as a bass booster
      const biquadFilter = audioCtx.createBiquadFilter();
      biquadFilter.type = "lowshelf";
      biquadFilter.frequency.value = 1000;
      biquadFilter.gain.value = range.value;

      // Connect source -> filter -> destination
      source.connect(biquadFilter);
      biquadFilter.connect(audioCtx.destination);

      // Update the filter gain whenever the slider moves
      range.oninput = () => {
        biquadFilter.gain.value = range.value;
      };
    })
    .catch((err) => {
      console.log(`The following gUM error occurred: ${err}`);
    });
} else {
  console.log("getUserMedia not supported on your browser!");
}

// Dump the script source into the <pre> element for display
pre.innerHTML = myScript.innerHTML;
Note: As a consequence of calling createMediaStreamSource()
, audio playback from the media stream will be re-routed into the processing graph of the AudioContext
. Playing and pausing the stream can still be done through the media element API and the player controls.
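For example, playback could be toggled through the media element like this (a sketch; the toggle button id is a hypothetical addition to the example's markup):

// Sketch: toggle playback via the media element API.
// The "#toggle" button is a hypothetical element, not part of the example above.
const videoEl = document.querySelector("video");
document.querySelector("#toggle").addEventListener("click", () => {
  if (videoEl.paused) {
    videoEl.play();
  } else {
    videoEl.pause();
  }
});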