Audio Visualizer + GPU Particles
Andres Nirk, Madis Janno, Heiti Ehrpais
This project aims to create sound visualizations using WebGL and GPU particles. Users can provide sound input from a microphone, play notes on their keyboard, choose from our song selection, or upload their own sounds.
The user can rotate the scene and zoom in and out. The particles move to the rhythm of the playing song. 65536 particles are currently created.
The left side of the bars represents high frequencies and the right side lower frequencies.
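As a rough sketch of how a particle count like 65536 might be set up, the snippet below fills a flat position buffer of the kind that could be handed to a Three.js BufferGeometry. The function name, the cube-shaped spawn volume, and the random initial layout are assumptions for illustration, not the project's actual code.

```javascript
// Hypothetical sketch: one position buffer entry per particle.
// 65536 = 256 x 256, a convenient size for GPU particle textures.
const PARTICLE_COUNT = 65536;

function createParticlePositions(count) {
  const positions = new Float32Array(count * 3); // x, y, z per particle
  for (let i = 0; i < count; i++) {
    // Spread particles randomly in a unit cube; a shader would animate them.
    positions[3 * i + 0] = Math.random() * 2 - 1;
    positions[3 * i + 1] = Math.random() * 2 - 1;
    positions[3 * i + 2] = Math.random() * 2 - 1;
  }
  return positions;
}

const positions = createParticlePositions(PARTICLE_COUNT);
console.log(positions.length); // 196608 floats = 65536 particles * 3 coords
```

In Three.js such a buffer would typically become a position attribute on a BufferGeometry rendered as THREE.Points, with per-frame motion done in the vertex shader rather than on the CPU.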
The user can also play notes on their keyboard. The "x" and "z" keys change the octave, and the "a", "s", "d", "e", "f", "g", "i", "j", "k" keys play different notes.
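One way such a keyboard could be wired up is to map each key to a semitone offset and convert that, together with the current octave, to a frequency for a Web Audio oscillator. The key-to-semitone table and the A4 = 440 Hz reference below are assumptions for illustration; the project's actual mapping may differ.

```javascript
// Hypothetical key layout: semitone offsets relative to A in the
// current octave (the real project may assign keys differently).
const KEY_TO_SEMITONE = { a: 0, s: 2, d: 4, e: 5, f: 7, g: 9, i: 11, j: 12, k: 14 };

// Equal temperament: each semitone multiplies frequency by 2^(1/12),
// anchored at A4 = 440 Hz.
function noteFrequency(semitone, octave) {
  return 440 * Math.pow(2, (semitone + 12 * (octave - 4)) / 12);
}

console.log(noteFrequency(0, 4));  // 440 (A4)
console.log(noteFrequency(12, 4)); // 880 (A5, one octave up)
```

In the browser, the resulting frequency would be fed to an OscillatorNode, e.g. `osc.frequency.value = noteFrequency(KEY_TO_SEMITONE[key], octave)` inside a keydown handler.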
Music visualization, a feature found in electronic music visualizers and media player software, generates animated imagery based on a piece of music. The imagery is usually generated and rendered in real time, synchronized with the music as it is played. To create our visualizations we used Three.js. After getting data from the current audio source, we make the particles move or change color accordingly.
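The "getting data from the current audio source" step is typically done with the Web Audio AnalyserNode, whose getByteFrequencyData fills an array of FFT magnitudes each frame. Since the analyser is a browser API, the runnable part below is only the reduction step, collapsing the FFT bins into a smaller number of bars that could drive particle motion or color; the function name and bar count are assumptions.

```javascript
// In the browser one would do roughly (not runnable outside a page):
//   const analyser = audioCtx.createAnalyser();
//   sourceNode.connect(analyser);
//   const data = new Uint8Array(analyser.frequencyBinCount);
//   analyser.getByteFrequencyData(data); // called once per animation frame

// Hypothetical helper: collapse FFT bins into `barCount` bars (0..255 each).
function binAverages(frequencyData, barCount) {
  const binSize = Math.floor(frequencyData.length / barCount);
  const bars = new Array(barCount).fill(0);
  for (let b = 0; b < barCount; b++) {
    let sum = 0;
    for (let i = 0; i < binSize; i++) sum += frequencyData[b * binSize + i];
    bars[b] = sum / binSize; // average magnitude of this bar's bins
  }
  return bars;
}

console.log(binAverages(new Uint8Array([0, 2, 4, 6]), 2)); // [ 1, 5 ]
```

Each bar value could then be passed as a uniform to the particle shader, so loud frequency bands push their particles harder.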
What went well
We managed to build a good UI for users, and we avoided the performance issues that are common when dealing with many particles and shaders in the browser.
In the middle of the project we realised that there were many different directions we could take. We were not sure whether we wanted one scene with many different things happening at the same time, or multiple scenes, as we have now.