The connect() method of the
AudioNode interface lets you connect one of the node's outputs to a target, which may be either another
AudioNode (thereby directing the sound data to the specified node) or an
AudioParam, so that the node's output data is automatically used to change the value of that parameter over time.
Syntax:

```js
var destinationNode = AudioNode.connect(destination, outputIndex, inputIndex);

AudioNode.connect(destination, outputIndex);
```
Parameters:

destination
The AudioNode or AudioParam to which to connect.

outputIndex (Optional)
An index specifying which output of the current AudioNode to connect to the destination. The index numbers are defined according to the number of output channels (see Audio channels). While you can only connect a given output to a given input once (repeated attempts are ignored), you can connect an output to multiple inputs by calling connect() repeatedly. This makes fan-out possible. The default value is 0.

inputIndex (Optional)
An index describing which input of the destination you want to connect the current AudioNode to; the default is 0. The index numbers are defined according to the number of input channels (see Audio channels). It is possible to connect an AudioNode to another AudioNode, which in turn connects back to the first AudioNode, creating a cycle. This is allowed only if there is at least one DelayNode in the cycle. Otherwise, a NotSupportedError exception is thrown. This parameter is not allowed if the destination is an AudioParam.
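The fan-out behavior mentioned above can be sketched as follows (a minimal illustration; the node names and routing are made up for the example):

```js
var audioCtx = new AudioContext();
var source = audioCtx.createOscillator();
var dryGain = audioCtx.createGain();
var wetGain = audioCtx.createGain();

// Repeated connect() calls from the same output create fan-out.
// Connecting the same output to the same input a second time is ignored.
source.connect(dryGain);
source.connect(wetGain);

// Both branches can then be routed onward independently.
dryGain.connect(audioCtx.destination);
wetGain.connect(audioCtx.destination);
```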
Return value:

If the destination is a node, connect() returns a reference to the destination AudioNode object, allowing you to chain multiple connect() calls. In some browsers, older implementations of this interface return undefined.

If the destination is an AudioParam, connect() returns undefined.

Exceptions:

IndexSizeError
The value specified for outputIndex or inputIndex doesn't correspond to an existing input or output.

InvalidAccessError
The destination node is not part of the same audio context as the source node.

NotSupportedError
The specified connection would create a cycle (in which the audio loops back through the same nodes repeatedly) and there aren't enough DelayNodes in the cycle to prevent the resulting waveform from getting stuck constructing the same audio frame indefinitely.
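Because connect() returns the destination node when the destination is an AudioNode, calls can be chained. A minimal sketch (note that in older implementations that return undefined, the second call in the chain would throw):

```js
var audioCtx = new AudioContext();
var oscillator = audioCtx.createOscillator();
var gainNode = audioCtx.createGain();

// connect() returns gainNode here, so the second connect() in the
// chain wires gainNode to the destination in the same expression.
oscillator.connect(gainNode).connect(audioCtx.destination);
```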
The most obvious use of the
connect() method is to direct the audio output from one node into the audio input of another node for further processing. For example, you might send the audio from a
MediaElementAudioSourceNode—that is, the audio from an HTML5 media element such as
<audio>—through a band-pass filter implemented using a
BiquadFilterNode to reduce noise before then sending the audio along to the speakers.
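That routing might be sketched like this (the `<audio>` element lookup, filter frequency, and Q value below are illustrative, not part of the original example):

```js
var audioCtx = new AudioContext();
var mediaElement = document.querySelector('audio');
var source = audioCtx.createMediaElementSource(mediaElement);

// Band-pass filter to attenuate noise outside the band of interest.
var filter = audioCtx.createBiquadFilter();
filter.type = 'bandpass';
filter.frequency.value = 1000; // illustrative center frequency, in Hz
filter.Q.value = 1;            // illustrative bandwidth

// media element -> filter -> speakers
source.connect(filter);
filter.connect(audioCtx.destination);
```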
This example creates an oscillator, then links it to a gain node, so that the gain node controls the volume of the oscillator node.
```js
var AudioContext = window.AudioContext || window.webkitAudioContext;
var audioCtx = new AudioContext();

var oscillator = audioCtx.createOscillator();
var gainNode = audioCtx.createGain();

oscillator.connect(gainNode);
gainNode.connect(audioCtx.destination);
```
```js
var AudioContext = window.AudioContext || window.webkitAudioContext;
var audioCtx = new AudioContext();

// create a normal oscillator to make sound
var oscillator = audioCtx.createOscillator();

// create a second oscillator that will be used as an LFO (low-frequency
// oscillator), and will control a parameter
var lfo = audioCtx.createOscillator();

// set the frequency of the second oscillator to a low number
lfo.frequency.value = 2.0; // 2Hz: two oscillations per second

// create a gain whose gain AudioParam will be controlled by the LFO
var gain = audioCtx.createGain();

// connect the LFO to the gain AudioParam. This means the output of the LFO
// will not produce any audio, but will change the value of the gain instead
lfo.connect(gain.gain);

// connect the oscillator that will produce audio to the gain
oscillator.connect(gain);

// connect the gain to the destination so we hear sound
gain.connect(audioCtx.destination);

// start the oscillator that will produce audio
oscillator.start();

// start the oscillator that will modify the gain value
lfo.start();
```
It is possible to connect an
AudioNode output to more than one
AudioParam, and more than one AudioNode output to a single
AudioParam, with multiple calls to
connect(). Fan-in and fan-out are therefore supported.
An AudioParam will take the rendered audio data from any
AudioNode output connected to it and convert it to mono by down-mixing (if it is not already mono). Next, it will mix it together with any other such outputs, and the intrinsic parameter value (the value the
AudioParam would normally have without any audio connections), including any timeline changes scheduled for the parameter.
Therefore, it is possible to choose the range in which an AudioParam will change by setting the value of the AudioParam to the central frequency, and to use a GainNode between the audio source and the AudioParam to adjust the range of the parameter's variation.
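For example (a sketch with illustrative values), to make an oscillator's frequency vary by ±50 Hz around 440 Hz, you could set the frequency AudioParam to 440 and scale an LFO's output with a GainNode before connecting it to the parameter:

```js
var audioCtx = new AudioContext();

var carrier = audioCtx.createOscillator();
carrier.frequency.value = 440; // central frequency, in Hz

var lfo = audioCtx.createOscillator();
lfo.frequency.value = 5; // vibrato rate, in Hz

// The LFO outputs values in [-1, 1]; this gain scales them to [-50, 50],
// so carrier.frequency sweeps between 390 and 490 Hz.
var depth = audioCtx.createGain();
depth.gain.value = 50;

lfo.connect(depth);
depth.connect(carrier.frequency);

carrier.connect(audioCtx.destination);
carrier.start();
lfo.start();
```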
Specifications:

Web Audio API: The definition of 'connect() to an AudioNode' in that specification.

Web Audio API: The definition of 'connect() to an AudioParam' in that specification.
© 2005–2018 Mozilla Developer Network and individual contributors.
Licensed under the Creative Commons Attribution-ShareAlike License v2.5 or later.