
BaseAudioContext

The BaseAudioContext interface of the Web Audio API acts as a base definition for online and offline audio-processing graphs, as represented by AudioContext and OfflineAudioContext respectively. You wouldn't use BaseAudioContext directly — you'd use its features via one of these two inheriting interfaces.

A BaseAudioContext can be the target of events; therefore, it implements the EventTarget interface.

Inheritance: EventTarget → BaseAudioContext

Instance properties

BaseAudioContext.audioWorklet Read only Secure context

Returns the AudioWorklet object, which can be used to create and manage AudioNodes in which JavaScript code implementing the AudioWorkletProcessor interface is run in the background to process audio data.

BaseAudioContext.currentTime Read only

Returns a double representing an ever-increasing hardware time in seconds used for scheduling. It starts at 0.
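For example, currentTime serves as the reference point when scheduling playback. A minimal sketch, assuming a running browser context (noteTimes is an illustrative helper, not part of the API):

```js
// Pure helper: evenly spaced start times measured from a base time.
function noteTimes(base, count, interval) {
  return Array.from({ length: count }, (_, i) => base + i * interval);
}

// In a browser, schedule four short beeps against the context clock:
if (typeof AudioContext !== "undefined") {
  const ctx = new AudioContext();
  for (const t of noteTimes(ctx.currentTime, 4, 0.5)) {
    const osc = ctx.createOscillator();
    osc.connect(ctx.destination);
    osc.start(t); // start() and stop() take absolute times on this clock
    osc.stop(t + 0.1);
  }
}
```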

BaseAudioContext.destination Read only

Returns an AudioDestinationNode representing the final destination of all audio in the context. It can be thought of as the audio-rendering device.

BaseAudioContext.listener Read only

Returns the AudioListener object, used for 3D spatialization.

BaseAudioContext.sampleRate Read only

Returns a float representing the sample rate (in samples per second) used by all nodes in this context. The sample rate of an AudioContext cannot be changed.

BaseAudioContext.state Read only

Returns the current state of the AudioContext.

Instance methods

Also implements methods from the interface EventTarget.

BaseAudioContext.createAnalyser()

Creates an AnalyserNode, which can be used to expose audio time-domain and frequency-domain data, for example to create data visualizations.
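A sketch of polling frequency data from an AnalyserNode (averageLevel is an illustrative helper, not part of the API):

```js
// Pure helper: average of a Uint8Array of frequency-bin magnitudes (0–255).
function averageLevel(bins) {
  let sum = 0;
  for (const v of bins) sum += v;
  return bins.length ? sum / bins.length : 0;
}

// Browser usage: sample the analyser once per animation frame.
if (typeof AudioContext !== "undefined") {
  const ctx = new AudioContext();
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 256; // yields 128 frequency bins
  const data = new Uint8Array(analyser.frequencyBinCount);
  function draw() {
    analyser.getByteFrequencyData(data);
    console.log("level:", averageLevel(data));
    requestAnimationFrame(draw);
  }
  draw();
}
```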

BaseAudioContext.createBiquadFilter()

Creates a BiquadFilterNode, which represents a second-order filter configurable as one of several common filter types: high-pass, low-pass, band-pass, etc.

BaseAudioContext.createBuffer()

Creates a new, empty AudioBuffer object, which can then be populated by data and played via an AudioBufferSourceNode.
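For instance, a one-second mono buffer can be created, filled with white noise, and played through an AudioBufferSourceNode (fillWithNoise is an illustrative helper):

```js
// Pure helper: fill a Float32Array with white noise in the range [-1, 1).
function fillWithNoise(samples) {
  for (let i = 0; i < samples.length; i++) {
    samples[i] = Math.random() * 2 - 1;
  }
  return samples;
}

// Browser usage: one second of noise at the context's own sample rate.
if (typeof AudioContext !== "undefined") {
  const ctx = new AudioContext();
  const buffer = ctx.createBuffer(1, ctx.sampleRate, ctx.sampleRate);
  fillWithNoise(buffer.getChannelData(0));
  const source = ctx.createBufferSource();
  source.buffer = buffer;
  source.connect(ctx.destination);
  source.start();
}
```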

BaseAudioContext.createBufferSource()

Creates an AudioBufferSourceNode, which can be used to play and manipulate audio data contained within an AudioBuffer object. AudioBuffers are created using AudioContext.createBuffer() or returned by AudioContext.decodeAudioData() when it successfully decodes an audio track.

BaseAudioContext.createConstantSource()

Creates a ConstantSourceNode object, which is an audio source that continuously outputs a monaural (one-channel) sound signal whose samples all have the same value.

BaseAudioContext.createChannelMerger()

Creates a ChannelMergerNode, which is used to combine channels from multiple audio streams into a single audio stream.

BaseAudioContext.createChannelSplitter()

Creates a ChannelSplitterNode, which is used to access the individual channels of an audio stream and process them separately.

BaseAudioContext.createConvolver()

Creates a ConvolverNode, which can be used to apply convolution effects to your audio graph, for example a reverberation effect.

BaseAudioContext.createDelay()

Creates a DelayNode, which is used to delay the incoming audio signal by a certain amount. This node is also useful to create feedback loops in a Web Audio API graph.
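As a sketch, such a feedback loop can implement a tempo-synced echo by routing the DelayNode's output back into itself through a GainNode (echoDelaySeconds is an illustrative helper; keeping the feedback gain below 1 makes the echoes decay rather than grow):

```js
// Pure helper: delay time in seconds for a quarter-note echo at a given BPM.
function echoDelaySeconds(bpm) {
  return 60 / bpm;
}

// Browser usage: input -> delay -> feedback gain -> back into delay.
if (typeof AudioContext !== "undefined") {
  const ctx = new AudioContext();
  const delay = ctx.createDelay(2.0); // maximum delay of 2 s
  delay.delayTime.value = echoDelaySeconds(120); // 0.5 s at 120 BPM
  const feedback = ctx.createGain();
  feedback.gain.value = 0.4; // < 1, so each echo is quieter
  delay.connect(feedback);
  feedback.connect(delay); // the feedback loop
  delay.connect(ctx.destination);
  // Connect any source node into `delay` to hear the echoes.
}
```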

BaseAudioContext.createDynamicsCompressor()

Creates a DynamicsCompressorNode, which can be used to apply acoustic compression to an audio signal.

BaseAudioContext.createGain()

Creates a GainNode, which can be used to control the overall volume of the audio graph.
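A common pattern is to express volume in decibels and convert it to the linear gain value the node expects (dbToGain is an illustrative helper, not part of the API):

```js
// Pure helper: convert decibels to a linear gain factor.
function dbToGain(db) {
  return Math.pow(10, db / 20);
}

// Browser usage: attenuate everything routed through gainNode by 6 dB.
if (typeof AudioContext !== "undefined") {
  const ctx = new AudioContext();
  const gainNode = ctx.createGain();
  gainNode.gain.value = dbToGain(-6); // ≈ 0.5
  gainNode.connect(ctx.destination);
}
```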

BaseAudioContext.createIIRFilter()

Creates an IIRFilterNode, which represents a general infinite impulse response (IIR) filter whose feedforward and feedback coefficients are specified directly, allowing filter responses that the standard BiquadFilterNode types cannot produce.

BaseAudioContext.createOscillator()

Creates an OscillatorNode, a source representing a periodic waveform. In essence, it generates a constant tone.

BaseAudioContext.createPanner()

Creates a PannerNode, which is used to spatialize an incoming audio stream in 3D space.

BaseAudioContext.createPeriodicWave()

Creates a PeriodicWave, used to define a periodic waveform that can be used to determine the output of an OscillatorNode.
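As a sketch, a custom waveform can be described by arrays of Fourier coefficients and applied to an oscillator (the coefficient values below are arbitrary; twoPartialCoefficients is an illustrative helper):

```js
// Pure helper: coefficient arrays for a wave built from two sine partials.
// Index 0 is the DC offset; index n is the nth harmonic.
function twoPartialCoefficients() {
  const real = new Float32Array([0, 0, 0]);   // cosine terms
  const imag = new Float32Array([0, 1, 0.5]); // fundamental + 2nd harmonic
  return { real, imag };
}

// Browser usage: play the custom waveform on an OscillatorNode.
if (typeof AudioContext !== "undefined") {
  const ctx = new AudioContext();
  const { real, imag } = twoPartialCoefficients();
  const wave = ctx.createPeriodicWave(real, imag);
  const osc = ctx.createOscillator();
  osc.setPeriodicWave(wave);
  osc.connect(ctx.destination);
  osc.start();
}
```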

BaseAudioContext.createScriptProcessor() Deprecated

Creates a ScriptProcessorNode, which can be used for direct audio processing via JavaScript.

BaseAudioContext.createStereoPanner()

Creates a StereoPannerNode, which can be used to apply stereo panning to an audio source.

BaseAudioContext.createWaveShaper()

Creates a WaveShaperNode, which is used to implement non-linear distortion effects.
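A WaveShaperNode applies a transfer curve to every sample. The soft-clipping curve below is one popular choice, not something the API mandates (makeDistortionCurve is an illustrative helper):

```js
// Pure helper: a soft-clipping transfer curve mapping [-1, 1] to [-1, 1].
// Larger `amount` values bend the curve harder toward hard clipping.
function makeDistortionCurve(amount, samples = 256) {
  const k = amount;
  const curve = new Float32Array(samples);
  for (let i = 0; i < samples; i++) {
    const x = (i * 2) / (samples - 1) - 1; // x spans [-1, 1]
    curve[i] = ((1 + k) * x) / (1 + k * Math.abs(x));
  }
  return curve;
}

// Browser usage:
if (typeof AudioContext !== "undefined") {
  const ctx = new AudioContext();
  const shaper = ctx.createWaveShaper();
  shaper.curve = makeDistortionCurve(50);
  shaper.oversample = "4x"; // reduces aliasing from the non-linearity
  shaper.connect(ctx.destination);
}
```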

BaseAudioContext.decodeAudioData()

Asynchronously decodes audio file data contained in an ArrayBuffer. The ArrayBuffer is usually obtained by fetching an audio file, for example with fetch() or an XMLHttpRequest whose responseType is set to "arraybuffer". This method only works on complete files, not fragments of audio files.
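A sketch of loading and decoding a file with fetch(); the URL is a placeholder and loadSample is an illustrative helper:

```js
// Fetch an audio file, decode it, and resolve with an AudioBuffer.
// decodeAudioData also has an older callback form; the promise form
// shown here is the modern usage.
async function loadSample(ctx, url) {
  const response = await fetch(url);
  const arrayBuffer = await response.arrayBuffer();
  return ctx.decodeAudioData(arrayBuffer);
}

// Browser usage:
if (typeof AudioContext !== "undefined") {
  const ctx = new AudioContext();
  loadSample(ctx, "sample.mp3").then((buffer) => {
    const source = ctx.createBufferSource();
    source.buffer = buffer;
    source.connect(ctx.destination);
    source.start();
  });
}
```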

Events

statechange

Fired when the context's state changes, typically because one of the state-change methods (AudioContext.suspend(), AudioContext.resume(), or AudioContext.close()) was called.
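Listening for the event might look like this (logState is an illustrative helper):

```js
// Log every state transition of a context ("suspended", "running", "closed").
function logState(ctx) {
  ctx.addEventListener("statechange", () => {
    console.log(`Audio context state is now ${ctx.state}`);
  });
}

// Browser usage: suspending and resuming each fire a statechange event.
if (typeof AudioContext !== "undefined") {
  const ctx = new AudioContext();
  logState(ctx);
  ctx.suspend().then(() => ctx.resume());
}
```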

Examples

Basic audio context declaration:

js

const audioContext = new AudioContext();

Cross-browser variant:

js

const AudioContext = window.AudioContext || window.webkitAudioContext;
const audioContext = new AudioContext();

const oscillatorNode = audioContext.createOscillator();
const gainNode = audioContext.createGain();
const finish = audioContext.destination;

Specifications

Browser compatibility

|  | Chrome | Edge | Firefox | Internet Explorer | Opera | Safari | WebView Android | Chrome Android | Firefox for Android | Opera Android | Safari on iOS | Samsung Internet |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| BaseAudioContext | 56 (14–56¹) | 79 (12–79¹) | 53 (25–53¹) | No | 43 (15–43¹) | 14.1 (6–14.1¹) | 56 (≤37–56¹) | 56 (18–56¹) | 53 (25–53¹) | 43 (14–43¹) | 14.5 (6–14.5¹) | 6.0 (1.0–6.0¹) |
| audioWorklet | 66 | 79 | 76 | No | 53 | 14.1 | 66 | 66 | 79 | 47 | 14.5 | 9.0 |
| createAnalyser | 14 | 12 | 25 | No | 15 | 6 | 4.4.3 | 18 | 25 | 14 | 6 | 1.0 |
| createBiquadFilter | 14 | 12 | 25 | No | 15 | 6 | 4.4.3 | 18 | 25 | 14 | 6 | 1.0 |
| createBuffer | 14 | 12 | 25 | No | 15 | 6 | 4.4.3 | 18 | 25 | 14 | 6 | 1.0 |
| createBufferSource | 14 | 12 | 25 | No | 15 | 6 | 4.4.3 | 18 | 25 | 14 | 6 | 1.0 |
| createChannelMerger | 14 | 12 | 25 | No | 15 | 6 | 4.4.3 | 18 | 25 | 14 | 6 | 1.0 |
| createChannelSplitter | 14 | 12 | 25 | No | 15 | 6 | 4.4.3 | 18 | 25 | 14 | 6 | 1.0 |
| createConstantSource | 56 | 79 | 52 | No | 43 | 14.1 | 56 | 56 | 52 | 43 | 14.5 | 6.0 |
| createConvolver | 14 | 12 | 25 | No | 15 | 6 | 4.4.3 | 18 | 25 | 14 | 6 | 1.0 |
| createDelay | 24 | 12 | 25 | No | 15 | 7 | ≤37 | 25 | 25 | 14 | 7 | 1.5 |
| createDynamicsCompressor | 14 | 12 | 25 | No | 15 | 6 | 4.4.3 | 18 | 25 | 14 | 6 | 1.0 |
| createGain | 24 | 12 | 25 | No | 15 | 7 | ≤37 | 25 | 25 | 14 | 7 | 1.5 |
| createIIRFilter | 49 | 14 | 50 | No | 36 | 14.1 | 49 | 49 | 50 | 36 | 14.5 | 5.0 |
| createOscillator | 20 | 12 | 25 | No | 15 | 6 | 4.4 | 25 | 25 | 14 | 6 | 1.5 |
| createPanner | 14 | 12 | 25 | No | 15 | 6 | 4.4.3 | 18 | 25 | 14 | 6 | 1.0 |
| createPeriodicWave | 30 | 12 | 25 | No | 17 | 8 | 4.4 | 30 | 25 | 18 | 8 | 2.0 |
| createScriptProcessor | 24 | 12 | 25 | No | 15 | 7 | ≤37 | 25 | 25 | 14 | 7 | 1.5 |
| createStereoPanner | 41 | 12 | 37 | No | 28 | 14.1 | 41 | 41 | 37 | 28 | 14.5 | 4.0 |
| createWaveShaper | 15 | 12 | 25 | No | 15 | 6 | 4.4.3 | 18 | 25 | 14 | 6 | 1.0 |
| currentTime | 14 | 12 | 25 | No | 15 | 6 | 4.4.3 | 18 | 25 | 14 | 6 | 1.0 |
| decodeAudioData | 14 | 12 | 25 | No | 15 | 6 | 4.4.3 | 18 | 25 | 14 | 6 | 1.0 |
| destination | 14 | 12 | 25 | No | 15 | 6 | 4.4.3 | 18 | 25 | 14 | 6 | 1.0 |
| listener | 14 | 12 | 25 | No | 15 | 6 | 4.4.3 | 18 | 25 | 14 | 6 | 1.0 |
| sampleRate | 14 | 12 | 25 | No | 15 | 6 | 4.4.3 | 18 | 25 | 14 | 6 | 1.0 |
| state | 41 | 14 | 40 | No | 28 | 9 | 41 | 41 | 40 | 28 | 9 | 4.0 |
| statechange event | 41 | 14 | 40 | No | 28 | 9 | 41 | 41 | 40 | 28 | 9 | 4.0 |

¹ In these earlier versions, the BaseAudioContext interface itself is not present, but many of the methods are available on the AudioContext and OfflineAudioContext interfaces.

See also

© 2005–2023 MDN contributors.
Licensed under the Creative Commons Attribution-ShareAlike License v2.5 or later.
https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext