You can view a demo of the project from this post here.

I've been working a lot on fairly technical projects lately, and since I've been sick the past few days and thus had some extra free time, I thought I'd switch it up a little and make something fun. I'm a big fan of the Monstercat label, and in particular, their visualizer. Essentially, it's an audio spectrum overlaid on a particle system which adjusts its speed to the audio. It's a lot cooler and less stuffy than I'm making it sound, so check it out for yourself if you get the chance.

Unfortunately, I didn't exactly start from scratch. My project was based on another one meant to do essentially the same thing, but I wasn't quite happy with a few aspects of it and thought I might tweak it a little to get closer to the real deal. The original project included code for loading an audio file and passing it through the Web Audio API's ScriptProcessorNode. This essentially enables the page to process a song in chunks, providing an array containing the amplitude data for each frequency range. Obviously, this makes rendering a spectrum a piece of cake with HTML5's canvas element, as the page can just draw a rectangle for each bar based on the raw amplitude.
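To make that concrete, the amplitude-to-bar mapping boils down to something like the sketch below. The function name and the drawing snippet are my own, not the original project's; the byte range 0–255 matches what `getByteFrequencyData` produces:

```javascript
// Map raw frequency-bin amplitudes (bytes, 0-255) to pixel heights
// for a canvas of the given height. Pure function, no canvas needed.
function barHeights(freqData, canvasHeight) {
  return Array.from(freqData, function (amp) {
    return Math.round((amp / 255) * canvasHeight);
  });
}

// In the browser, drawing would then look roughly like:
// analyser.getByteFrequencyData(freqData);
// var heights = barHeights(freqData, canvas.height);
// heights.forEach(function (h, i) {
//   ctx.fillRect(i * (barWidth + barMargin * 2) + barMargin,
//                canvas.height - h, barWidth, h);
// });
```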

The first modifications I made to the original project were to use a bit of jQuery magic to size and position the main content based on the current window, like so:

```javascript
var width = $(document).width() * 0.9;
width -= width % (barWidth + barMargin * 2); // trim so a whole number of bars fits
$('#canvas').attr('width', width);
$('.content').css('margin-top', ($(document).height() - $('.content').height()) / 2 - 80);
$('.content').css('margin-left', ($(document).width() - $('.content').width()) / 2 - 52);
```

This resizes the spectrum to be 90% of the current window width and positions its parent approximately in the center of the screen. Very minor in the grand scheme of things, but it makes a considerable aesthetic difference overall. I made a few more changes around the same time, but I'll get to them in a bit.
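The width-snapping step is the only non-obvious part, so here it is isolated as a little function (the name and sample numbers are mine, just for illustration):

```javascript
// Take 90% of the window width, then trim off the remainder so that an
// exact number of bar footprints (bar width plus a margin on each side) fits.
function snapWidth(windowWidth, barWidth, barMargin) {
  var width = windowWidth * 0.9;
  return width - width % (barWidth + barMargin * 2);
}
```

For example, with a 1000px-wide window, 8px bars, and 1px margins (a 10px footprint), this yields a 900px canvas holding exactly 90 bars.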

I still wasn't quite happy with the visualizer, and wanted to make it even more similar to Monstercat's. The solution? Add a particle system! After exactly zero deliberation, I decided to use three.js, which is essentially a WebGL library for JavaScript. That said, I had no experience with WebGL up until this point, so there was a bit of a learning curve before I got any results. Eventually, though, I was able to create and render a decent-looking particle system.

Once I had actually created the system, I needed to make the particles move. The Monstercat particles move in the positive x-direction at a constant speed, and at a seemingly random speed on the y-axis. They don't appear to move on the z-axis, so that component of the velocity is ignored. My system behaves in approximately the same way. When a particle is created, it is given a constant x-speed of 0.1 and a random y-speed between -0.05 and 0.05. It's also given a random position on all three axes when first created; however, the x-position is no longer random once the particle respawns. This is a looping system, so when a particle moves offscreen (detected via three.js's frustum object), it is given a random z-position, and the x- and y-positions are derived from it. The x-position is computed as follows:

`var x = -Math.abs(camera.position.z - z) * Math.tan(toRads(VIEW_ANGLE));`

This code essentially solves for the leg of a right triangle whose hypotenuse is the line segment connecting the camera and the particle. The absolute distance between the camera and the particle on the z-axis serves as one leg, and the view angle serves as the angle between that leg and the hypotenuse. Therefore, the other leg (the absolute x-distance between the particle and the camera, or just the particle's x-position, since the camera sits at x = 0) can be found with a simple `tan` operation. The whole thing is then negated so that the particle spawns to the left of the camera, making it appear at the edge of the camera's field of view.

The y-value is calculated in a similar way, using nearly the same equation to determine the absolute maximum displacement from the camera. The actual value is then generated randomly within that bound and its inverse, allowing it to remain random but still within the camera's field of view.
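Pulled out of the visualizer, the spawn and respawn logic described above looks roughly like this. The function names are my own, and the camera is assumed to sit at x = 0, y = 0:

```javascript
function toRads(deg) {
  return deg * Math.PI / 180;
}

// First spawn: constant x-speed, random y-speed in [-0.05, 0.05], no z-speed.
function initialVelocity(random) {
  random = random || Math.random;
  return { x: 0.1, y: random() * 0.1 - 0.05, z: 0 };
}

// Respawn: solve the right triangle described above. The z-distance to the
// camera is one leg and the view angle sits between that leg and the
// hypotenuse, so the x-offset is the other leg; negate it so the particle
// spawns at the left edge of the field of view.
function respawnX(cameraZ, z, viewAngle) {
  return -Math.abs(cameraZ - z) * Math.tan(toRads(viewAngle));
}

// The same triangle gives the maximum |y| still in view; pick y uniformly
// between that bound and its inverse.
function respawnY(cameraZ, z, viewAngle, random) {
  random = random || Math.random;
  var bound = Math.abs(cameraZ - z) * Math.tan(toRads(viewAngle));
  return random() * 2 * bound - bound;
}
```

With a 45° view angle and a particle 100 units from the camera on z, `respawnX` gives roughly -100, i.e. the particle appears right at the edge of the frustum.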

Once all this stuff is out of the way, we're left with managing the speed of the particles in relation to the amplitude of the song. After a good deal of tweaking, this is essentially what I came up with:

```javascript
var minBias;         // initialized elsewhere
var freqRangeStart;  // initialized elsewhere
var freqRangeLength; // initialized elsewhere
var array;           // initialized elsewhere; amplitude data, each point in [0, 1]
var sectionStart = array.length * freqRangeStart;   // index of the first element to analyze
var sectionLength = array.length * freqRangeLength; // number of elements to analyze
var sum = 0;
for (var i = 0; i < array.length; i++) {
    if (i >= sectionStart && i < sectionStart + sectionLength) {
        // linear bias: 1 for the first point in the section, decreasing as i grows
        var bias = (sectionLength / minBias - (i - sectionStart)) / (sectionLength / minBias);
        sum += bias * array[i];
    }
}
var velocityModifier = sum / sectionLength * (minBias * 3 / 2);
```

(Note: I had originally attempted to write out a mathematical equation corresponding to this code, but it was a little confusing given all the variables; also, LaTeX hates me.)

Basically, this code analyzes the amplitude data in a certain subsection of the array, defined by `freqRangeStart` and `freqRangeLength`, both decimals between 0 and 1 representing where the section starts and how long it is relative to the total frequency range. This enables only a certain frequency range to be considered when determining the velocity multiplier. Each value in the section is then multiplied by a bias ranging from `1` down to `minBias`, where the first data point is incorporated as-is and the bias falls off linearly toward `minBias` by the end of the section. For example, if `minBias` is `0.5`, the bias for a data point halfway through the section is `0.75`. This essentially allows lower frequencies to be weighted more heavily than higher ones, making the particles more closely follow the pattern of the audio's bass while not entirely excluding other frequency ranges. Finally, we multiply the mean of the weighted amplitude points by a constant equal to the midpoint of `minBias` and `1` to account for the diminishing weight of the values.
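If you want to experiment with the weighting outside the visualizer, the loop above packs into a self-contained function (the sample inputs below are arbitrary, just to show the shape of the output):

```javascript
// Self-contained version of the biased-amplitude loop: average the amplitudes
// in the [freqRangeStart, freqRangeStart + freqRangeLength) slice of the
// spectrum, weighting earlier (lower-frequency) points more heavily.
function velocityModifier(array, freqRangeStart, freqRangeLength, minBias) {
  var sectionStart = array.length * freqRangeStart;
  var sectionLength = array.length * freqRangeLength;
  var sum = 0;
  for (var i = 0; i < array.length; i++) {
    if (i >= sectionStart && i < sectionStart + sectionLength) {
      // linear bias: 1 for the first point in the section, decreasing as i grows
      var bias = (sectionLength / minBias - (i - sectionStart)) /
                 (sectionLength / minBias);
      sum += bias * array[i];
    }
  }
  return sum / sectionLength * (minBias * 3 / 2);
}

// Four flat amplitudes of 1.0 over the whole range with minBias = 0.5
// get biases 1, 0.875, 0.75, 0.625:
// velocityModifier([1, 1, 1, 1], 0, 1, 0.5) → 0.609375
```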

Overall, I had a lot of fun with this project. At the very least, it gave me an excuse to play around with WebGL, which was nice for a change after spending the entire previous week writing a Java bytecode decompiler. Anyhow, I'm pretty happy with the overall result, although I may do a bit more tweaking at some point in the future.