Web Audio API First and Higher Order Ambisonic Examples

The examples below illustrate the use of the JSAmbisonics library:


FOA examples


FOA Panner

An example of a first-order ambisonic encoding of a sound source. The user can control the direction of the source.
The vector intensity visualizer analyzes the sound field and shows the direction of the strongest sound activity.
In this case, as there is only a single source, it simply follows that source.
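
A minimal sketch of the signal chain behind this demo, assuming the classes and properties documented in the JSAmbisonics README (ambisonics.monoEncoder, ambisonics.intensityAnalyser, ambisonics.binDecoder, each exposing .in and .out audio nodes); the element id "source" is hypothetical, and names/signatures should be checked against the library version in use:

    // assumes ambisonics.min.js has been loaded, exposing the global "ambisonics"
    var context = new AudioContext();
    var order = 1;                                              // first-order ambisonics (FOA)

    var encoder  = new ambisonics.monoEncoder(context, order);  // mono source -> B-format
    var analyser = new ambisonics.intensityAnalyser(context);   // acoustic intensity vector, for the visualizer
    var decoder  = new ambisonics.binDecoder(context, order);   // B-format -> binaural

    // the sound source (a hypothetical <audio id="source"> element) feeds the encoder
    var source = context.createMediaElementSource(document.getElementById('source'));
    source.connect(encoder.in);

    // the encoded B-format goes both to the analyser and to the binaural decoder
    encoder.out.connect(analyser.in);
    encoder.out.connect(decoder.in);
    decoder.out.connect(context.destination);

    // user-controlled source direction, in degrees
    encoder.azim = 45;
    encoder.elev = 0;
    encoder.updateGains();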


FOA Player

An example of loading FOA ambisonic recordings, with binaural decoding.
The user can control the rotation of the sound scene.
The vector intensity visualizer analyzes the sound field and shows the direction of the strongest sound activity.
In this case, the indicated direction changes rapidly, following the activity in the sound scene.
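
A sketch of the playback chain, with a scene rotator inserted before the binaural decoder; the classes and the yaw/updateRotMtx calls follow the JSAmbisonics README, and the recording URL is hypothetical:

    // assumes the global "ambisonics" and an AudioContext "context"
    var order   = 1;
    var rotator = new ambisonics.sceneRotator(context, order);  // rotates the B-format scene
    var decoder = new ambisonics.binDecoder(context, order);    // binaural rendering

    rotator.out.connect(decoder.in);
    decoder.out.connect(context.destination);

    // fetch and decode a 4-channel FOA recording, then play it through the chain
    fetch('BF_recording.wav')
      .then(function (response) { return response.arrayBuffer(); })
      .then(function (data) { return context.decodeAudioData(data); })
      .then(function (buffer) {
        var player = context.createBufferSource();
        player.buffer = buffer;
        player.connect(rotator.in);
        player.start();
      });

    // user-controlled scene rotation, in degrees
    rotator.yaw = 90;
    rotator.updateRotMtx();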


FOA Virtual Microphone

An example of listening to the output of a virtual microphone placed inside a FOA ambisonic recording.
The user can control the look direction of the microphone and its directivity pattern.
The vector intensity visualizer analyzes the sound field and shows the direction of the strongest sound activity.
In this case, the indicated direction changes rapidly, following the activity in the sound scene.
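
A sketch of steering the virtual microphone inside a decoded FOA recording; ambisonics.virtualMic and its azim/elev/vmicPattern properties follow the README, the pattern name used here is an assumption, and "soundBuffer" stands for an already decoded recording:

    // assumes "ambisonics", an AudioContext "context", and a decoded FOA buffer "soundBuffer"
    var order = 1;
    var vmic  = new ambisonics.virtualMic(context, order);  // extracts a steerable directional pattern

    var player = context.createBufferSource();
    player.buffer = soundBuffer;
    player.connect(vmic.in);
    vmic.out.connect(context.destination);                  // virtual-microphone signal to the output
    player.start();

    // user-controlled look direction (degrees) and pattern
    vmic.azim = -30;
    vmic.elev = 10;
    vmic.updateOrientation();
    vmic.vmicPattern = 'cardioid';                           // pattern name is an assumption; see the library docs
    vmic.updatePattern();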



HOA examples


HOA Panner

An example of a higher-order ambisonic encoding of a sound source.
The example demonstrates the effect of the ambisonic order on the perception of the sound source.
The user can switch between different sets of decoding filters on-the-fly.
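
The higher-order case mainly changes the channel count, (order+1)^2 ambisonic channels, and allows truncating the encoded signals to a lower order to hear its effect. A hedged sketch, assuming an orderLimiter class as named in the README:

    // assumes the global "ambisonics" and an AudioContext "context"
    var order    = 3;                   // 3rd-order encoding: (3+1)^2 = 16 ambisonic channels
    var orderOut = 1;                   // reduced order, for comparison by ear

    var encoder = new ambisonics.monoEncoder(context, order);            // mono -> 3rd-order HOA
    var limiter = new ambisonics.orderLimiter(context, order, orderOut); // keep only the lower-order channels
    var decoder = new ambisonics.binDecoder(context, orderOut);          // binaural decoding at the reduced order

    encoder.out.connect(limiter.in);
    limiter.out.connect(decoder.in);
    decoder.out.connect(context.destination);

    // source direction, as in the FOA panner
    encoder.azim = 45;
    encoder.elev = 0;
    encoder.updateGains();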


HOA Player

An example of loading HOA ambisonic recordings, with binaural decoding.
The user can control the rotation of the sound scene.
The example demonstrates the effect of the ambisonic order on the perception of the sound scene.
The user can switch between different sets of decoding filters on-the-fly.
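
Because browsers limit the number of channels in a single compressed audio file, HOA recordings are usually split across several files; the library's HOAloader (class name per the README, signature hedged) fetches them and merges them into one multichannel buffer. A sketch, with a hypothetical recording URL:

    // assumes the global "ambisonics" and an AudioContext "context"
    var order   = 3;
    var rotator = new ambisonics.sceneRotator(context, order);
    var decoder = new ambisonics.binDecoder(context, order);
    rotator.out.connect(decoder.in);
    decoder.out.connect(context.destination);

    // 'HOA3_recording.ogg' is a hypothetical URL; the loader resolves the split files behind it
    var loader = new ambisonics.HOAloader(context, order, 'HOA3_recording.ogg', function (mergedBuffer) {
      var player = context.createBufferSource();
      player.buffer = mergedBuffer;      // all (3+1)^2 = 16 channels merged into one buffer
      player.connect(rotator.in);
      player.start();
    });
    loader.load();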


HOA Virtual Microphone

An example of listening to the output of a virtual microphone placed inside a HOA ambisonic recording.
The user can control the look direction of the microphone and its directivity pattern.
The vector intensity visualizer analyzes the sound field and shows the direction of the strongest sound activity.
In this case, the indicated direction changes rapidly, following the activity in the sound scene.



Mobile Device-tracked Audiovisual Player example


Google-Cardboard-style Spherical Audio/Video player (Android/Chrome only!)

The following example combines spherical video rendering, Google-Cardboard style, with head tracking based on the smartphone/tablet orientation sensors, and the corresponding ambisonic rendering using JSAmbisonics.
The WebGL/THREEJS code for the visuals is based on the tutorial code available here.

The audio/video recording was made in the Helsinki Concert Hall (Musiikkitalo), during a rehearsal of a Brahms piece by the Sibelius Academy Symphony Orchestra.
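
A sketch of how the device orientation sensors might drive the ambisonic rotation; the mapping from the DeviceOrientationEvent angles to yaw/pitch/roll is simplified here, and in a real player it also depends on the screen orientation:

    // assumes a JSAmbisonics scene rotator "rotator" (ambisonics.sceneRotator) already in the audio chain
    window.addEventListener('deviceorientation', function (event) {
      // alpha/beta/gamma are in degrees; the sign conventions below are a simplification
      rotator.yaw   = -event.alpha;   // rotation about the vertical axis
      rotator.pitch =  event.beta;    // front-back tilt
      rotator.roll  =  event.gamma;   // left-right tilt
      rotator.updateRotMtx();
    });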



SOFA HRTFs integration example


SOFA HRTFs HOA panner

This is the same as the HOA panner example above, but it demonstrates integration with SOFA HRTFs. Instead of loading or switching the decoding filters directly, as in the previous examples, different sets of HRTFs in the SOFA format are loaded and switched, and the decoding filters are generated automatically for the specified order.
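
The mechanism underneath is the binaural decoder's ability to accept a new set of decoding filters at run time (binDecoder.updateFilters(audioBuffer) in the README); the SOFA parsing and the generation of order-matched filters happen in the demo code and are not shown here. A hedged sketch, with a hypothetical filter file URL:

    // assumes an AudioContext "context" and a binaural decoder "decoder" (ambisonics.binDecoder) already in the chain
    // 'decoding_filters_subjectA.wav' is a hypothetical multichannel file of decoding filters
    // derived from a SOFA HRTF set for the chosen order
    fetch('decoding_filters_subjectA.wav')
      .then(function (response) { return response.arrayBuffer(); })
      .then(function (data) { return context.decodeAudioData(data); })
      .then(function (filterBuffer) {
        decoder.updateFilters(filterBuffer);   // swap the decoding filters on the fly
      });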



WebGL integration examples


WebGL Virtual Microphone visualization

This is the same as the HOA virtual microphone example above, but with an additional visualization of the rotated microphone pattern using WebGL.
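
One way to build such a visualization is to sample the microphone's directivity over a sphere and use the gain magnitude as the radius of a WebGL/THREE.js mesh. The helper below is hypothetical and uses the standard first-order pattern a + (1 - a)cos(theta); the actual demo visualizes higher-order patterns:

    // Sample a first-order directivity pattern and return points whose distance from the
    // origin equals the pattern magnitude, ready to feed a WebGL/THREE.js wireframe.
    function samplePattern(a, azimSteps, elevSteps) {       // a = 1 omni, 0.5 cardioid, 0 figure-of-eight
      var points = [];
      for (var i = 0; i <= elevSteps; i++) {
        var elev = -Math.PI / 2 + Math.PI * i / elevSteps;
        for (var j = 0; j < azimSteps; j++) {
          var azim = 2 * Math.PI * j / azimSteps;
          var cosTheta = Math.cos(elev) * Math.cos(azim);   // angle from the look direction (x-axis)
          var gain = Math.abs(a + (1 - a) * cosTheta);      // first-order pattern magnitude
          points.push([gain * Math.cos(elev) * Math.cos(azim),
                       gain * Math.cos(elev) * Math.sin(azim),
                       gain * Math.sin(elev)]);
        }
      }
      return points;
    }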



Library module tests

Test

(Not intended for mobiles)
A test script that goes through all the objects in the library, initializes them, and logs information to the console; useful for checking whether the browser supports the library.
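
A hedged sketch of what such a test might look like; the constructors listed here follow the library README and are not exhaustive:

    // assumes "ambisonics" is loaded; instantiate the main objects and log them,
    // so that missing Web Audio features surface as console errors
    var context = new AudioContext();
    var order = 2;

    var objects = {
      encoder: new ambisonics.monoEncoder(context, order),
      rotator: new ambisonics.sceneRotator(context, order),
      decoder: new ambisonics.binDecoder(context, order),
      vmic:    new ambisonics.virtualMic(context, order)
    };

    Object.keys(objects).forEach(function (name) {
      console.log(name, objects[name]);
    });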