Thursday, May 29, 2014

JavaScript's Final Frontier - MIDI

JavaScript has had an amazing last few years. Node.js has taken server-side development by storm. First-person shooter games are being built with HTML and JavaScript in the browser. Natural language processing and machine learning are being implemented in minimalist JavaScript libraries. It seems there's no area in which JavaScript isn't set to blow away preconceptions about what it can't do and become a major player.

There is, however, one area in which JavaScript - or more accurately the web stack and the engines that implement it - has only made a few tentative forays.  For me this represents a final frontier; the one area where JavaScript has yet to show that it can compete with native applications. That frontier is MIDI.

I know what you're probably thinking: cheesy video game soundtracks on your SoundBlaster sound card, web pages with blink tags and bad music tracks on autoplay. Those represent uses of MIDI outside its original intent. MIDI was made for connecting electronic musical instruments, and it is still very much alive and well. From lighting control systems to professional recording studios to GarageBand, MIDI is a key component of performance and production in the arts. MIDI connects the sequencers, hardware and software synthesizers, and drum machines that create the music many people listen to every day. The specification, though aging, shows no signs of going away anytime soon. It's simple, effective, and well crafted.
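To give a sense of just how simple the protocol is: a complete Note On message is only three bytes, a status byte, a note number, and a velocity. A rough sketch in JavaScript:

    // Note On, channel 1: status 0x90, note 60 (middle C), velocity 127
    var noteOn  = [0x90, 60, 0x7f];
    // The matching Note Off: status 0x80, same note, a release velocity
    var noteOff = [0x80, 60, 0x40];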

It had to be well crafted. Of all applications, music may be the most demanding. That's because in most applications, even realtime ones, the exact timing of event processing is flexible within certain limits. Interactive web applications can tolerate latency on their network connections. 3D video games can scale down their frames per second and still provide a decent user experience; at 30 frames per second, the illusion of continuous motion holds up. The human ear, on the other hand, is capable of detecting delays as small as 6 milliseconds. For a musician, a latency of 20ms between striking a key and hearing a sound would be a show-stopper. Accurate timing is essential for music performance and production.

There's been a lot of interest in the Web Audio API and some amazing demos of its functionality. The Web MIDI API, on the other hand, hasn't gotten much support. Support for Web MIDI has landed in Chrome Canary, but that's it for now. A few people have begun to look at the possibility of adding support for it in Firefox. Until the Web MIDI API is widely supported, interested people will have to make do with the JazzSoft MIDI plugin and Chris Wilson's Web MIDI API shim.
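In the meantime, here's a minimal sketch of what working with the API looks like, assuming navigator.requestMIDIAccess is provided either natively (Canary) or by the shim. The exact shape of the inputs and outputs collections has shifted between spec drafts, so treat the details loosely:

    navigator.requestMIDIAccess().then(function (midi) {
      // Log the raw bytes of every incoming message from every input port.
      midi.inputs.forEach(function (input) {
        input.onmidimessage = function (event) {
          console.log('MIDI in:', event.data); // Uint8Array, e.g. [144, 60, 127]
        };
      });

      // Send a Note On (middle C, full velocity) to the first output port, if any.
      var output = midi.outputs.values().next().value;
      if (output) {
        output.send([0x90, 60, 0x7f]);
      }
    }, function (error) {
      console.error('MIDI access failed:', error);
    });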

I remain hopeful that support for this API will grow, because it will open up doors for some truly great new creative and artistic initiatives.
