This feature is well established and works across many devices and browser versions. It’s been available across browsers since April 2021.
* Some parts of this feature may have varying levels of support.
The MediaStream Recording API, sometimes referred to as the Media Recording API or the MediaRecorder API, is closely affiliated with the Media Capture and Streams API and the WebRTC API. The MediaStream Recording API makes it possible to capture the data generated by a MediaStream or HTMLMediaElement object for analysis, processing, or saving to disk. It's also surprisingly easy to work with.
The MediaStream Recording API consists of a single major interface, MediaRecorder, which does all the work of taking the data from a MediaStream and delivering it to you for processing. The data is delivered by a series of dataavailable events, already in the format you specify when creating the MediaRecorder. You can then process the data further or write it to file as desired.
The process of recording a stream is simple:
1. Set up a MediaStream or HTMLMediaElement (in the form of an <audio> or <video> element) to serve as the source of the media data.
2. Create a MediaRecorder object, specifying the source stream and any desired options (such as the container's MIME type or the desired bit rates of its tracks).
3. Set ondataavailable to an event handler for the dataavailable event; this will be called whenever data is available for you.
4. Call MediaRecorder.start() to begin recording.
5. The dataavailable event handler gets called every time there's data ready for you to do with as you will; the event has a data attribute whose value is a Blob that contains the media data. You can force a dataavailable event to occur, thereby delivering the latest sound to you so you can filter it, save it, or whatever.
6. When you're done recording, call MediaRecorder.stop().

Note: Individual Blobs containing slices of the recorded media will not necessarily be individually playable. The media needs to be reassembled before playback.
If anything goes wrong during recording, an error event is sent to the MediaRecorder. You can listen for error events by setting up an onerror event handler.
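The steps above, plus the error handler, can be sketched as follows. This assumes a microphone source obtained with getUserMedia(); the recordFor name and the fixed duration are illustrative choices, not part of the API:

```javascript
// Sketch: record the microphone for `ms` milliseconds and resolve with
// a single Blob. Assumes getUserMedia() succeeds; `recordFor` is a
// hypothetical helper name.
async function recordFor(ms) {
  // Step 1: obtain a source MediaStream.
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  // Step 2: create the recorder (default options here).
  const recorder = new MediaRecorder(stream);
  const chunks = [];
  // Step 3: collect each Blob delivered by the dataavailable event.
  recorder.ondataavailable = (event) => chunks.push(event.data);
  // Listen for recording errors as described above.
  recorder.onerror = (event) => console.error("Recording error:", event);
  // Step 4: begin recording; step 6: stop after `ms` milliseconds.
  recorder.start();
  setTimeout(() => recorder.stop(), ms);
  // Step 5 happens in ondataavailable; reassemble the chunks on stop.
  return new Promise((resolve) => {
    recorder.onstop = () =>
      resolve(new Blob(chunks, { type: recorder.mimeType }));
  });
}
```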
In this example, we use an HTML canvas as the source of the MediaStream and stop recording after 9 seconds.
const canvas = document.querySelector("canvas");

// Optional frames per second argument.
const stream = canvas.captureStream(25);
const recordedChunks = [];

console.log(stream);
const options = { mimeType: "video/webm; codecs=vp9" };
const mediaRecorder = new MediaRecorder(stream, options);

mediaRecorder.ondataavailable = handleDataAvailable;
mediaRecorder.start();

function handleDataAvailable(event) {
  console.log("data-available");
  if (event.data.size > 0) {
    recordedChunks.push(event.data);
    console.log(recordedChunks);
    download();
  } else {
    // …
  }
}

function download() {
  const blob = new Blob(recordedChunks, {
    type: "video/webm",
  });
  const url = URL.createObjectURL(blob);
  const a = document.createElement("a");
  document.body.appendChild(a);
  a.style.display = "none";
  a.href = url;
  a.download = "test.webm";
  a.click();
  URL.revokeObjectURL(url);
}

// Demo: stop recording (and trigger the download) after 9 seconds.
setTimeout(() => {
  console.log("stopping");
  mediaRecorder.stop();
}, 9000);
You can also use the properties of the MediaRecorder object to determine the state of the recording process, and its pause() and resume() methods to pause and resume recording of the source media.
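For example, the state property can drive a pause/resume toggle. The helper below is a hypothetical sketch (togglePause is not part of the API):

```javascript
// Hypothetical helper: pause or resume `recorder` depending on its
// current state, and return the state after the call.
function togglePause(recorder) {
  if (recorder.state === "recording") {
    recorder.pause();
  } else if (recorder.state === "paused") {
    recorder.resume();
  }
  return recorder.state;
}
```

Wired to a button, this gives a one-click pause/resume control for an active MediaRecorder.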
If you need or want to check to see if a specific MIME type is supported, that's possible as well. Just call MediaRecorder.isTypeSupported().
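For example, you might probe a list of candidate types and use the first one the browser accepts. The firstSupported() helper and the candidate list below are a hypothetical sketch, not part of the API:

```javascript
// Candidate container/codec strings, in order of preference.
const candidates = [
  "video/webm; codecs=vp9",
  "video/webm; codecs=vp8",
  "video/mp4",
];

// Return the first type for which the given predicate returns true.
function firstSupported(types, isSupported) {
  return types.find((t) => isSupported(t));
}

// In a browser, pass the real static method as the predicate:
// const mimeType = firstSupported(candidates, MediaRecorder.isTypeSupported);
// const recorder = new MediaRecorder(stream, mimeType ? { mimeType } : {});
```

If none of the candidates is supported, omitting the mimeType option lets the browser choose its own default format.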
If your goal is to record camera and/or microphone input, you may wish to examine the available input devices before beginning the process of constructing the MediaRecorder. To do so, you'll need to call navigator.mediaDevices.enumerateDevices() to get a list of the available media devices. You can then examine that list and identify the potential input sources, and even filter the list based on desired criteria.
In this code snippet, enumerateDevices() is used to examine the available input devices, locate those which are audio input devices, and create <option> elements that are then added to a <select> element representing an input source picker.
navigator.mediaDevices.enumerateDevices().then((devices) => {
  const menu = document.getElementById("input-devices");
  devices.forEach((device) => {
    if (device.kind === "audioinput") {
      const item = document.createElement("option");
      item.textContent = device.label;
      item.value = device.deviceId;
      menu.appendChild(item);
    }
  });
});
Code similar to this can be used to let the user restrict the set of devices they wish to use.
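For instance, the kind filter from the snippet above can be factored into a reusable function (audioInputs is a hypothetical name), and the deviceId the user picks can then be passed back to getUserMedia() to capture that specific microphone:

```javascript
// Keep only the audio input devices from an enumerateDevices() result.
function audioInputs(devices) {
  return devices.filter((device) => device.kind === "audioinput");
}

// In a browser, once the user has picked `selectedId` from the menu:
// const stream = await navigator.mediaDevices.getUserMedia({
//   audio: { deviceId: { exact: selectedId } },
// });
```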
To learn more about using the MediaStream Recording API, see Using the MediaStream Recording API, which shows how to use the API to record audio clips. A second article, Recording a media element, describes how to receive a stream from an <audio> or <video> element and use the captured stream (in this case, recording it and saving it to a local disk).
BlobEvent
Each time a chunk of media data is finished being recorded, it's delivered to consumers in Blob form using a BlobEvent of type dataavailable.

MediaRecorder
The primary interface that implements the MediaStream Recording API.

MediaRecorderErrorEvent (Deprecated, Non-standard)
The interface that represents errors thrown by the MediaStream Recording API. Its error property is a DOMException describing the error that occurred.
<button id="record-btn">Start</button> <video id="player" src="" autoplay controls></video>
const recordBtn = document.getElementById("record-btn");
const video = document.getElementById("player");

let chunks = [];
let isRecording = false;
let mediaRecorder = null;
const constraints = { video: true };

recordBtn.addEventListener("click", async () => {
  if (!isRecording) {
    // Lazily create the recorder on the first click.
    if (!mediaRecorder) {
      const stream = await navigator.mediaDevices.getUserMedia(constraints);
      mediaRecorder = new MediaRecorder(stream);
      mediaRecorder.addEventListener("dataavailable", (e) => {
        console.log("data available");
        chunks.push(e.data);
      });
      mediaRecorder.addEventListener("stop", (e) => {
        console.log("onstop fired");
        const blob = new Blob(chunks, { type: "video/ogg; codecs=opus" });
        video.src = window.URL.createObjectURL(blob);
      });
      mediaRecorder.addEventListener("error", (e) => {
        console.error("An error occurred:", e);
      });
    }
    isRecording = true;
    recordBtn.textContent = "Stop";
    chunks = [];
    mediaRecorder.start();
    console.log("recorder started");
  } else {
    isRecording = false;
    recordBtn.textContent = "Start";
    mediaRecorder.stop();
    console.log("recorder stopped");
  }
});
| Specification |
|---|
| MediaStream Recording |
| | Chrome | Edge | Firefox | Opera | Safari | Chrome Android | Firefox for Android | Opera Android | Safari on iOS | Samsung Internet | WebView Android | WebView on iOS |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| MediaRecorder | 47 | 79 | 25 | 36 | 14.1 | 47 | 25 | 36 | 14 | 5.0 | 47 | 14 |
| MediaStream_Recording_API | 47 | 79 | 25 [1] | 36 | 14.1 | 47 | 25 [1] | 36 | 14 | 5.0 | 47 | 14 |
| audioBitrateMode | 89 | 89 | No | 75 | No | 89 | No | 63 | No | 15.0 | 89 | No |
| audioBitsPerSecond | 49 | 79 | 71 | 36 | 14.1 | 49 | 79 | 36 | 14.5 | 5.0 | 49 | 14.5 |
| dataavailable_event | 49 | 79 | 25 | 36 | 14.1 | 49 | 25 | 36 | 14 | 5.0 | 49 | 14 |
| error_event | | | 25 | | 14.1 | | 25 | | 14 | | | 14 |
| isTypeSupported_static | 47 | 79 | 25 | 36 | 14.1 | 47 | 25 | 36 | 14 | 5.0 | 47 | 14 |
| mimeType | 49 [2] | 79 | 25 [3] | 36 | 14.1 | 49 [2] | 25 | 36 | 14 | 5.0 | 49 [2] | 14 |
| pause | 49 | 79 | 25 | 36 | 14.1 | 49 | 25 | 36 | 14.5 | 5.0 | 49 | 14.5 |
| pause_event | 49 | 79 | 65 | 36 | 14.1 | 49 | 65 | 36 | 14.5 | 5.0 | 49 | 14.5 |
| requestData | 49 | 79 | 25 | 36 | 14.1 | 49 | 25 | 36 | 14 | 5.0 | 49 | 14 |
| resume | 49 | 79 | 25 | 36 | 14.1 | 49 | 25 | 36 | 14.5 | 5.0 | 49 | 14.5 |
| resume_event | 49 | 79 | 65 | 36 | 14.1 | 49 | 65 | 36 | 14.5 | 5.0 | 49 | 14.5 |
| start | 47 | 79 | 25 | 36 | 14.1 | 47 | 25 | 36 | 14 | 5.0 | 47 | 14 |
| start_event | 49 | 79 | 25 | 36 | 14.1 | 49 | 25 | 36 | 14 | 5.0 | 49 | 14 |
| state | 49 [2] | 79 | 25 | 36 | 14.1 | 49 [2] | 25 | 36 | 14 | 5.0 | 49 [2] | 14 |
| stop | 49 | 79 | 25 | 36 | 14.1 | 49 | 25 | 36 | 14 | 5.0 | 49 | 14 |
| stop_event | 49 | 79 | 25 | 36 | 14.1 | 49 | 25 | 36 | 14 | 5.0 | 49 | 14 |
| stream | 49 [2] | 79 | 25 | 36 | 14.1 | 49 [2] | 25 | 36 | 14 | 5.0 | 49 | 14 |
| videoBitsPerSecond | 49 | 79 | 71 | 36 | 14.1 | 49 | 79 | 36 | 14.5 | 5.0 | 49 | 14.5 |

[1] Before Firefox 58, using MediaStream.addTrack() on a stream obtained using getUserMedia(), then attempting to record the resulting stream would result in only recording the original stream without the added tracks (severe bug).
[2] 47–49: Before version 49, only video is supported, not audio.
[3] Starting with Firefox 71, the behavior of mimeType is more consistent. For example, it now returns the media type even after recording has stopped.
MediaDevices.getUserMedia()
© 2005–2025 MDN contributors.
Licensed under the Creative Commons Attribution-ShareAlike License v2.5 or later.
https://developer.mozilla.org/en-US/docs/Web/API/MediaStream_Recording_API