squelch.audio-context

create-analyser

(create-analyser ctx)
Creates an AnalyserNode, which exposes audio time-domain and frequency-domain
data and can be used, for example, to build data visualisations.
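As a sketch (assuming the namespace is required under an audio alias and that
these functions return plain Web Audio nodes, so standard interop such as
.connect applies), an analyser can be dropped into a graph and polled for
time-domain data:

  (ns example.visualiser
    (:require [squelch.audio-context :as audio]))   ; the alias is an assumption

  (def ctx (audio/new-audio-context))
  (def analyser (audio/create-analyser ctx))

  ;; connect any source node into `analyser` (and `analyser` onward to the
  ;; destination if the audio should also be heard), then poll it:
  (def data (js/Uint8Array. (.-frequencyBinCount analyser)))
  (.getByteTimeDomainData analyser data)   ; fills `data` with the current waveform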

create-biquad-filter

(create-biquad-filter ctx)
Creates a BiquadFilterNode, which represents a second-order filter
configurable as one of several common filter types: high-pass, low-pass,
band-pass, etc.
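For example, a low-pass filter can be configured through the standard
BiquadFilterNode properties (a sketch; the audio alias and ctx are as in the
analyser example above):

  ;; assumes (:require [squelch.audio-context :as audio]) and an existing `ctx`
  (def lowpass (audio/create-biquad-filter ctx))
  (set! (.-type lowpass) "lowpass")                ; pass frequencies below the cutoff
  (set! (.-value (.-frequency lowpass)) 1000)      ; cutoff at 1 kHz
  (.connect lowpass (audio/get-destination ctx))   ; sources connect into `lowpass`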

create-buffer

(create-buffer ctx number-of-channels length sample-rate)
Creates a new, empty AudioBuffer object, which can then be populated with
data and played via an AudioBufferSourceNode.

create-buffer-source

(create-buffer-source ctx)
Creates an AudioBufferSourceNode, which can be used to play and manipulate
audio data contained within an AudioBuffer object. AudioBuffers are created
using AudioContext.createBuffer or returned by AudioContext.decodeAudioData
when it successfully decodes an audio track.
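For example, a one-second buffer of white noise can be created, filled and
played back (a sketch; getChannelData, buffer and start are standard
AudioBuffer/AudioBufferSourceNode interop):

  ;; assumes (:require [squelch.audio-context :as audio]) and an existing `ctx`
  (def sample-rate (audio/get-sample-rate ctx))
  (def noise-buffer (audio/create-buffer ctx 1 sample-rate sample-rate))  ; 1 channel, 1 second

  ;; fill the single channel with white noise
  (let [data (.getChannelData noise-buffer 0)]
    (dotimes [i (.-length data)]
      (aset data i (dec (* 2 (js/Math.random))))))

  (def source (audio/create-buffer-source ctx))
  (set! (.-buffer source) noise-buffer)
  (.connect source (audio/get-destination ctx))
  (.start source)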

create-channel-merger

(create-channel-merger ctx)
(create-channel-merger ctx number-of-inputs)
Creates a ChannelMergerNode, which is used to combine channels from multiple
audio streams into a single audio stream.

create-channel-splitter

(create-channel-splitter ctx)
(create-channel-splitter ctx number-of-outputs)
Creates a ChannelSplitterNode, which is used to access the individual
channels of an audio stream and process them separately.
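For example, a splitter and a merger can bracket per-channel processing of a
stereo signal (a sketch; stereo-source stands for any two-channel AudioNode):

  ;; assumes (:require [squelch.audio-context :as audio]), an existing `ctx`
  ;; and a hypothetical two-channel node `stereo-source`
  (def splitter  (audio/create-channel-splitter ctx 2))
  (def merger    (audio/create-channel-merger ctx 2))
  (def left-gain (audio/create-gain ctx))

  (.connect stereo-source splitter)
  (.connect splitter left-gain 0)    ; channel 0 (left) goes through its own gain
  (.connect left-gain merger 0 0)
  (.connect splitter merger 1 1)     ; channel 1 (right) passes straight through
  (.connect merger (audio/get-destination ctx))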

create-convolver

(create-convolver ctx)
Creates a ConvolverNode, which can be used to apply convolution effects to
your audio graph, for example a reverberation effect.
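For example, loading an impulse response into a convolver gives a reverb (a
sketch; impulse-response stands for an AudioBuffer obtained elsewhere, e.g.
via decode-audio-data below):

  ;; assumes (:require [squelch.audio-context :as audio]), an existing `ctx`
  ;; and a hypothetical AudioBuffer `impulse-response`
  (def reverb (audio/create-convolver ctx))
  (set! (.-buffer reverb) impulse-response)
  (.connect reverb (audio/get-destination ctx))   ; sources connect into `reverb`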

create-delay

(create-delay ctx)
(create-delay ctx max-delay-time)
Creates a DelayNode, which is used to delay the incoming audio signal by a
certain amount. This node is also useful for creating feedback loops in a Web
Audio API graph.
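For example, a delay and a gain node wired back into each other produce a
decaying echo (a sketch; source stands for any AudioNode):

  ;; assumes (:require [squelch.audio-context :as audio]), an existing `ctx`
  ;; and a hypothetical source node `source`
  (def echo     (audio/create-delay ctx 5.0))   ; allow delays of up to 5 seconds
  (def feedback (audio/create-gain ctx))

  (set! (.-value (.-delayTime echo)) 0.3)       ; 300 ms between repeats
  (set! (.-value (.-gain feedback)) 0.5)        ; each repeat at half volume

  (.connect source echo)
  (.connect echo feedback)
  (.connect feedback echo)                      ; the feedback loop
  (.connect echo (audio/get-destination ctx))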

create-dynamics-compressor

(create-dynamics-compressor ctx)
Creates a DynamicsCompressorNode, which can be used to apply acoustic
compression to an audio signal, for example to help prevent clipping and
distortion when several sources are played at once.

create-gain

(create-gain ctx)
Creates a GainNode, which can be used to control the overall volume of the
audio graph.
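For example, a single gain node in front of the destination acts as a master
volume control (a sketch):

  ;; assumes (:require [squelch.audio-context :as audio]) and an existing `ctx`
  (def master (audio/create-gain ctx))
  (set! (.-value (.-gain master)) 0.5)            ; halve the overall volume
  (.connect master (audio/get-destination ctx))
  ;; connect source nodes into `master` rather than directly into the destination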

create-media-element-source

(create-media-element-source ctx media-element)
Creates a MediaElementAudioSourceNode associated with an HTMLMediaElement.
This can be used to play and manipulate audio from <video> or <audio>
elements.
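For example, the output of an existing <audio> element can be routed through
the graph (a sketch; the element id is hypothetical):

  ;; assumes (:require [squelch.audio-context :as audio]) and an existing `ctx`
  (def player (.getElementById js/document "player"))   ; hypothetical <audio id="player"> element
  (def player-source (audio/create-media-element-source ctx player))
  (.connect player-source (audio/get-destination ctx))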

create-media-stream-destination

(create-media-stream-destination ctx)
Creates a MediaStreamAudioDestinationNode associated with a MediaStream
representing an audio stream which may be stored in a local file or sent to
another computer.

create-media-stream-source

(create-media-stream-source ctx media-stream)
Creates a MediaStreamAudioSourceNode associated with a MediaStream
representing an audio stream which may come from the local computer microphone
or other sources.
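For example, microphone input can be fed into the graph via the standard
navigator.mediaDevices.getUserMedia promise (a sketch):

  ;; assumes (:require [squelch.audio-context :as audio]) and an existing `ctx`
  (-> (.getUserMedia (.-mediaDevices js/navigator) #js {:audio true})
      (.then (fn [stream]
               (let [mic (audio/create-media-stream-source ctx stream)]
                 (.connect mic (audio/get-destination ctx)))))
      (.catch (fn [err] (js/console.error "microphone unavailable" err))))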

create-oscillator

(create-oscillator ctx)
Creates an OscillatorNode, a source representing a periodic waveform; in
effect, it generates a constant tone.
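For example, playing a one-second 440 Hz sine tone (a sketch; frequency, start
and stop are standard OscillatorNode interop):

  ;; assumes (:require [squelch.audio-context :as audio])
  (def ctx (audio/new-audio-context))
  (def osc (audio/create-oscillator ctx))
  (set! (.-type osc) "sine")
  (set! (.-value (.-frequency osc)) 440)            ; concert A
  (.connect osc (audio/get-destination ctx))
  (.start osc)
  (.stop osc (+ (audio/get-current-time ctx) 1))    ; stop one second from now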

create-panner

(create-panner ctx)
Creates a PannerNode, which is used to spatialise an incoming audio stream
in 3D space.

create-periodic-wave

(create-periodic-wave ctx real imag)
Creates a PeriodicWave, used to define a periodic waveform that can be used
to determine the output of an OscillatorNode.
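For example, a custom waveform can be built from Fourier coefficients and
assigned to an oscillator (a sketch; setPeriodicWave is standard
OscillatorNode interop):

  ;; assumes (:require [squelch.audio-context :as audio]) and an existing `ctx`
  ;; Fourier coefficients: `real` holds the cosine terms, `imag` the sine terms
  (def real (js/Float32Array. #js [0 1 0.5]))
  (def imag (js/Float32Array. #js [0 0 0]))
  (def wave (audio/create-periodic-wave ctx real imag))

  (def osc (audio/create-oscillator ctx))
  (.setPeriodicWave osc wave)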

create-script-processor

(create-script-processor ctx)
(create-script-processor ctx buffer-size)
(create-script-processor ctx buffer-size number-of-input-channels)
(create-script-processor ctx buffer-size number-of-input-channels number-of-output-channels)
Creates a ScriptProcessorNode, which can be used for direct audio processing
via JavaScript.
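For example, white noise can be generated sample by sample in an
onaudioprocess handler (a sketch; note that ScriptProcessorNode has since been
deprecated in favour of AudioWorklet):

  ;; assumes (:require [squelch.audio-context :as audio]) and an existing `ctx`
  (def processor (audio/create-script-processor ctx 4096 1 1))

  (set! (.-onaudioprocess processor)
        (fn [event]
          (let [output (.getChannelData (.-outputBuffer event) 0)]
            (dotimes [i (.-length output)]
              (aset output i (dec (* 2 (js/Math.random))))))))

  (.connect processor (audio/get-destination ctx))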

create-wave-shaper

(create-wave-shaper ctx)
Creates a WaveShaperNode, which is used to implement non-linear distortion
effects.
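For example, a short curve that boosts and hard-clips the signal (a sketch;
the curve values are illustrative):

  ;; assumes (:require [squelch.audio-context :as audio]) and an existing `ctx`
  (def shaper (audio/create-wave-shaper ctx))
  ;; five-point curve: inputs in [-0.5, 0.5] are doubled, everything beyond clips to ±1
  (set! (.-curve shaper) (js/Float32Array. #js [-1 -1 0 1 1]))
  (.connect shaper (audio/get-destination ctx))   ; sources connect into `shaper`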

decode-audio-data

(decode-audio-data ctx audio-data)
(decode-audio-data ctx audio-data success-fn)
(decode-audio-data ctx audio-data success-fn error-fn)
Asynchronously decodes audio file data contained in an ArrayBuffer. In this
case, the ArrayBuffer is usually loaded from an XMLHttpRequest's response
attribute after setting the responseType to arraybuffer. This method only
works on complete files, not fragments of audio files.
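For example, loading and decoding a file with XMLHttpRequest as described
above (a sketch; the URL is hypothetical, and the success callback is assumed
to receive the decoded AudioBuffer as in the underlying Web Audio API):

  ;; assumes (:require [squelch.audio-context :as audio]) and an existing `ctx`
  (def request (js/XMLHttpRequest.))
  (.open request "GET" "sounds/track.ogg" true)          ; hypothetical URL
  (set! (.-responseType request) "arraybuffer")
  (set! (.-onload request)
        (fn [_]
          (audio/decode-audio-data
            ctx
            (.-response request)
            (fn [decoded]                                 ; success: play the decoded buffer
              (let [source (audio/create-buffer-source ctx)]
                (set! (.-buffer source) decoded)
                (.connect source (audio/get-destination ctx))
                (.start source)))
            (fn [err] (js/console.error "decoding failed" err)))))
  (.send request)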

get-current-time

(get-current-time ctx)
Read only.
Returns a double representing an ever-increasing hardware time in seconds
used for scheduling. It starts at 0 and cannot be stopped, paused or reset.

get-destination

(get-destination ctx)
Read only.
Returns an AudioDestinationNode representing the final destination of all
audio in the context. It can be thought of as the audio-rendering device.

get-listener

(get-listener ctx)
Read only.
Returns the AudioListener object, used for 3D spatialization.

get-moz-audio-channel-type

(get-moz-audio-channel-type ctx)
Read only.
Returns the audio channel that sound playing in this AudioContext will play
in, on a Firefox OS device.

get-sample-rate

(get-sample-rate ctx)
Read only.
Returns a float representing the sample rate (in samples per second) used by
all nodes in this context. The sample-rate of an AudioContext cannot be
changed.

new-audio-context

(new-audio-context)
Creates a new AudioContext.
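A context is the starting point for everything above (a sketch, assuming the
namespace is required under an audio alias):

  (ns example.core
    (:require [squelch.audio-context :as audio]))   ; the alias is an assumption

  (def ctx (audio/new-audio-context))
  ;; `ctx` is then passed to the create-* and get-* functions documented above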