TrackPerformer update

I’ve added some exciting new features to my TrackPerformer project, along with three new performances: We Three Kings, Carol of the Bells, and Joy to the World.


It’s now possible to add filters, or effects, into the processing chain. These filters can be applied before any performers (pre-filters) or after all performers (post-filters). At the moment, the included filters are:

  • FPS: Calculates the average framerate across the performance, optionally displaying it in a DOM element.
  • Grid: Draws a grid to the canvas. The grid may be a simple intersection grid (points), or lines. Both the X and Y axes may be independently configured.
  • Pick: Probably the most interesting (and processor-intensive) part of TrackPerformer. The Pick filter will randomly swap a pixel with one of its neighbours. It’s used at full intensity on both We Three Kings and Joy to the World, and, when toned down a little for Carol of the Bells, provides a softening, organic effect.

I’ve got some ideas for more filters down the track… The only difficulty is keeping performance acceptable: manipulation of the canvas pixel by pixel is quite slow in current browsers.
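To illustrate, the neighbour-swapping at the heart of the Pick filter could look something like this. It’s a hypothetical sketch of the technique, not TrackPerformer’s actual code: pickSwap and intensity are names I’ve made up, and image is an ImageData-like object ({ width, height, data }) as returned by getImageData().

```javascript
// Hypothetical sketch of a Pick-style filter: each pixel has an
// `intensity` chance (0 to 1) of being swapped with a random neighbour.
function pickSwap(image, intensity) {
  var w = image.width, h = image.height, d = image.data;
  for (var y = 0; y < h; y++) {
    for (var x = 0; x < w; x++) {
      if (Math.random() >= intensity) continue;
      // Pick a random neighbour: dx and dy each in {-1, 0, 1},
      // clamped to the edges of the canvas.
      var nx = Math.min(w - 1, Math.max(0, x + ((Math.random() * 3) | 0) - 1));
      var ny = Math.min(h - 1, Math.max(0, y + ((Math.random() * 3) | 0) - 1));
      var a = (y * w + x) * 4, b = (ny * w + nx) * 4;
      for (var c = 0; c < 4; c++) { // swap all four RGBA bytes
        var tmp = d[a + c]; d[a + c] = d[b + c]; d[b + c] = tmp;
      }
    }
  }
  return image;
}
```

In the browser this would run over ctx.getImageData(0, 0, width, height) each frame and be written back with putImageData(); it’s that per-pixel loop that makes the filter so processor-intensive.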


There are a couple of new performers, and some minor updates to some of the existing ones. The Oscillator performer, in particular, is rapidly becoming the most flexible and useful of the performers.

  • There’s a new ShimmerGrid performer, which is great for adding texture and movement to the entire canvas. You can see it in action particularly well on Joy to the World.
  • The Swarm performer can now draw its particles as knots (like the SignalTracker), as well as dots.
  • The Oscillator now has the ability to draw sustained notes, and to increase the longevity of notes. Take a look at Carol of the Bells to see these new options in use.
  • Notes can now be filtered based not only on their pitch, but also their velocity (volume).

There are a couple of other changes here and there, but these are the main ones.

We Three Kings

The three new example tracks are all taken from We Three Kings, my new Christmas remix album. Why not go and have a listen?


TrackPerformer provides a visual stage for your music, using HTML5 canvas and audio. On that stage, performers “play” the instruments in the music visually. In other words, it’s a visualisation system for music, but based on the notation (the abstract) rather than the audio (the manifestation).

Essentially, you take a piece of music, convert it into a format that TrackPerformer understands (JSON), describe how you want it to be performed, and then watch! You can, of course, write your own performers.

Before going any further, let’s see it in action. The music is “Colony”, a new piece that I wrote about a week ago.

Note: You won’t be able to view the performance linked above in Internet Explorer, due to its over-aggressive script-blocking: the scripts served from GitHub have the wrong MIME type, so IE won’t let them run.

Take a look at the project on GitHub to see how it all fits together. TrackPerformer itself resides in the “Source” directory; in “Examples”, you’ll find the performance of Colony; in “Utilities”, there’s a JavaScript macro for Komodo IDE/Edit that will help you to translate copied-and-pasted OpenMPT pattern data into TrackPerformer’s JSON format.

You can find more information on the TrackPerformer wiki, including an outline of the format, and some basic instructions for getting started. I’ll be adding more information to the wiki over the next few days, and I’ll post updates here too.

Let me know what you think!

Paired sort in JavaScript

Sometimes, you’ll have two arrays of associated data, and you’ll need to sort them. You can’t just call sort() on both arrays, because that will potentially break the associations between them. What you need is a way to sort one of the arrays, and shuffle the elements of the second array to match. Here’s a simple MooTools function that does just that (using insertion sort):

	pairedSort: function(other, reverse) {
		if (this.length === other.length) {
			for (var i = 1, len = this.length; i < len; i++) {
				var curr = this[i];
				var currOth = other[i];
				var j = i - 1;
				while ((j >= 0) && (this[j] > curr)) {
					this[j + 1] = this[j];
					other[j + 1] = other[j];
					j--;
				}
				this[j + 1] = curr;
				other[j + 1] = currOth;
			}
			if (reverse) {
				this.reverse();
				other.reverse();
			}
		}
		return this;
	}

Using this function is quite straightforward:

var alpha = [3,2,1,6,5,4];
var beta = [1,2,3,4,5,6];
alpha.pairedSort(beta);
// alpha == [1,2,3,4,5,6]
// beta == [3,2,1,6,5,4]

When would this actually be useful? Let’s say you’ve got an object which maps numeric assessment scores to an alphabetic grade:

var mapping = {
  'A': 80,
  'B': 60,
  'C': 40,
  'D': 20,
  'E': 0
};

Unfortunately, objects in JavaScript have no intrinsic order on their keys — this is because they’re essentially just hashmaps. To make use of these data, then, we need to create an array of keys and an array of values, and sort them together. The pairedSort() function lets us do this easily. (In fact, it’s for this exact application that I wrote the method!)
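Putting it together, the whole thing looks like this. (pairedSort is re-stated here as a plain Array.prototype method, so the sketch runs without MooTools; gradeFor is a hypothetical helper, not part of the library.)

```javascript
// pairedSort re-stated as a plain Array.prototype method (insertion sort),
// so this snippet is self-contained.
Array.prototype.pairedSort = function(other, reverse) {
  if (this.length === other.length) {
    for (var i = 1, len = this.length; i < len; i++) {
      var curr = this[i], currOth = other[i];
      var j = i - 1;
      while (j >= 0 && this[j] > curr) {
        this[j + 1] = this[j];
        other[j + 1] = other[j];
        j--;
      }
      this[j + 1] = curr;
      other[j + 1] = currOth;
    }
    if (reverse) {
      this.reverse();
      other.reverse();
    }
  }
  return this;
};

var mapping = { 'A': 80, 'B': 60, 'C': 40, 'D': 20, 'E': 0 };

// Unpack the object into two associated arrays...
var grades = [], cutoffs = [];
for (var key in mapping) {
  grades.push(key);
  cutoffs.push(mapping[key]);
}

// ...and sort the cutoffs, shuffling the grades to match.
cutoffs.pairedSort(grades);
// cutoffs == [0, 20, 40, 60, 80]
// grades == ['E', 'D', 'C', 'B', 'A']

// gradeFor is a hypothetical helper: find the grade for a numeric score
// by taking the highest cutoff the score reaches.
function gradeFor(score) {
  var grade = grades[0];
  for (var i = 0; i < cutoffs.length; i++) {
    if (score >= cutoffs[i]) grade = grades[i];
  }
  return grade;
}
```

With the cutoffs sorted ascending and the grades shuffled alongside them, looking up a grade is a simple scan.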

Generative Music

Continuing my HTML5 and canvas experiments, I’ve put together a generative music system. Essentially, a series of particles move across a field, occasionally triggering sounds — the sound triggered depends on their location in the field.

There is, of course, a little bit more to it than that. Under the hood, I’ve got a series of HTML5 Audio objects that are used to provide polyphonic audio using a simple round-robin algorithm (I encoded the audio in OGG, so you’ll need to use an OGG-friendly browser, like Firefox). The particles are much simpler than those in my previous canvas dalliance, in that they don’t swarm, and their motion is more linear.
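That round-robin idea can be sketched as follows. The names here (VoicePool, trigger, makeVoice, "note.ogg") are hypothetical, not the project’s actual code; in the browser, makeVoice would return a new Audio element.

```javascript
// A minimal round-robin pool of audio "voices" for polyphony. Each trigger
// takes the next voice in rotation, rewinds it, and plays it.
function VoicePool(makeVoice, size) {
  this.voices = [];
  this.next = 0;
  for (var i = 0; i < size; i++) {
    this.voices.push(makeVoice());
  }
}

VoicePool.prototype.trigger = function() {
  var voice = this.voices[this.next];
  voice.currentTime = 0; // rewind, in case this voice played recently
  voice.play();
  this.next = (this.next + 1) % this.voices.length;
  return voice;
};

// In the browser ("note.ogg" is a placeholder filename):
// var pool = new VoicePool(function() { return new Audio('note.ogg'); }, 6);
// pool.trigger();
```

With enough voices in the pool, triggering a new note never has to cut off one that’s still ringing, which is all the polyphony a simple generative piece needs.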