The Web Audio API is a high-level JavaScript API for processing and synthesizing audio in web applications, so applications such as drum machines and sequencers are well within reach. This library implements the Web Audio API specification (also known as WAA) on Node.js, but we will begin without using the library.

We have a boombox that plays our 'tape', and we can adjust the volume and stereo panning, giving us a fairly basic working audio graph. So what's going on when we do this? Each audio node performs a basic audio operation and is linked with one or more other audio nodes to form an audio routing graph. The AudioDestinationNode interface represents the end destination of an audio source in a given context, usually the speakers of your device.

Let's add another modification node to practice what we've just learnt. The StereoPannerNode interface represents a simple stereo panner node that can be used to pan an audio stream left or right. Let's use the constructor method of creating a node this time. To visualize it, we will be making our audio graph look like this: the source feeds a gain node, then the stereo panner, and finally the destination.

The MediaElementAudioSourceNode interface represents an audio source consisting of an HTML <audio> or <video> element. You might also have two streams of audio stored together, such as in a stereo audio clip. For more information about ArrayBuffers, see this article about XHR2.

The offline-audio-context directory contains a simple example showing how the Web Audio API OfflineAudioContext interface can be used to rapidly process/render audio in the background to create a buffer, which can then be used in any way you please. The create-media-stream-destination directory contains a simple example showing how the Web Audio API AudioContext.createMediaStreamDestination() method can be used to output a stream, in this case to a MediaRecorder instance, in order to record a sine wave to an opus file. See also the guide on background audio processing using AudioWorklet.

The short sketches below illustrate each of these pieces in code.
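First, a minimal sketch of the boombox graph, assuming an HTML <audio> element with id "track" on the page; the id and the parameter values are hypothetical, not taken from the original example. It also shows the constructor method of creating a node, used here for the StereoPannerNode.

```js
const audioCtx = new AudioContext();

// The 'tape': a MediaElementAudioSourceNode wrapping an <audio> element.
const audioElement = document.querySelector('#track');
const track = audioCtx.createMediaElementSource(audioElement);

// Volume control via a factory method...
const gainNode = audioCtx.createGain();

// ...and panning via the constructor method of creating a node.
const panner = new StereoPannerNode(audioCtx, { pan: 0 });

// source -> gain -> panner -> destination (usually the device speakers)
track.connect(gainNode).connect(panner).connect(audioCtx.destination);

// Adjust volume and panning; pan runs from -1 (full left) to 1 (full right).
gainNode.gain.value = 0.5;
panner.pan.value = -0.25;
```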
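When the audio data arrives over the network it lands in an ArrayBuffer and must be decoded before the graph can play it. A sketch of that step, using fetch() rather than the XHR2 approach the article above covers; the 'viper.mp3' path is hypothetical:

```js
const audioCtx = new AudioContext();

// Fetch the raw bytes into an ArrayBuffer, then decode them into an
// AudioBuffer. A stereo clip decodes to two channels stored together.
async function loadClip(url) {
  const response = await fetch(url);
  const arrayBuffer = await response.arrayBuffer();
  return audioCtx.decodeAudioData(arrayBuffer);
}

loadClip('viper.mp3').then((buffer) => {
  const source = audioCtx.createBufferSource();
  source.buffer = buffer;
  source.connect(audioCtx.destination);
  source.start();
});
```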
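Next, a sketch of the offline-rendering idea from the offline-audio-context example: render one second of a sine wave into a buffer in the background, then play the result through a regular AudioContext. The channel count, length, and 440 Hz frequency are illustrative choices.

```js
// 2 channels, one second of frames, at a 44100 Hz sample rate.
const offlineCtx = new OfflineAudioContext(2, 44100, 44100);

const osc = offlineCtx.createOscillator(); // sine wave by default
osc.frequency.value = 440;
osc.connect(offlineCtx.destination);
osc.start();

// startRendering() resolves with the rendered AudioBuffer, which can
// then be used in any way you please - here, simple playback.
offlineCtx.startRendering().then((renderedBuffer) => {
  const audioCtx = new AudioContext();
  const playback = audioCtx.createBufferSource();
  playback.buffer = renderedBuffer;
  playback.connect(audioCtx.destination);
  playback.start();
});
```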
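Finally, a sketch along the lines of the create-media-stream-destination example: route a sine wave into a MediaStream and hand it to a MediaRecorder. Whether the recording is actually Opus depends on the browser's supported MIME types; the type string and the three-second duration here are assumptions.

```js
const audioCtx = new AudioContext();
const osc = audioCtx.createOscillator(); // sine wave by default
const dest = audioCtx.createMediaStreamDestination();
osc.connect(dest);

const recorder = new MediaRecorder(dest.stream);
const chunks = [];
recorder.ondataavailable = (e) => chunks.push(e.data);
recorder.onstop = () => {
  // The recorded audio, e.g. for download as an opus-in-ogg file.
  const blob = new Blob(chunks, { type: 'audio/ogg; codecs=opus' });
};

osc.start();
recorder.start();
setTimeout(() => {
  recorder.stop();
  osc.stop();
}, 3000); // record roughly three seconds
```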