organ hacking 101

For several months throughout 2013, Virtual Proximity (AKA James Annelsey), Tristan Courtney and I would hang out in the Melbourne Town Hall devising new ways of using computers to pump vast amounts of air through three-storey-high steel pipes.  My particular role in all of this was to find a convenient way for the other two to send sysex (the nerdiest of MIDI commands) to the organ via Ableton Live.  This is not something Live usually likes to do and, compounding the issue, building-sized MIDI-controlled instruments can be temperamental.

The Melbourne Town Hall Organ is the largest instrument in the southern hemisphere, with almost 8,000 pipes plus bells and drums.  The instrument was retrofitted with a MIDI system by SSOS.  This system can send and receive note information on a separate channel for each manual, as well as CCs for controlling swell.  The stops are opened and closed using sysex messages, which is the complicating factor when working from within Live.  The system I ended up devising was a combination of Max for Live devices recreating the stops of each manual, which would send sysex messages over to a separate Max patch via UDP/OSC. That patch would then forward the messages to the instrument.  In this way we were able to step around Live's sysex limitations.
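If you're curious what that bridge looks like, here's a minimal Python sketch of the same idea.  The real bridge was a Max patch, and the OSC address, port and sysex byte layout below are made up for illustration – the actual SSOS message format is different.

```python
# Rough Python sketch of the OSC-to-sysex bridge idea (the real one
# was a Max patch). The /stop address, port number and sysex bytes
# are all illustrative -- the actual SSOS format differs.
import mido
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

midi_out = mido.open_output()  # pick the port wired to the organ

def on_stop(address, stop_number, state):
    # Wrap the stop change in a sysex message and send it on.
    data = [0x7D, int(stop_number), int(state)]  # made-up byte layout
    midi_out.send(mido.Message('sysex', data=data))

dispatcher = Dispatcher()
dispatcher.map("/stop", on_stop)

# Listen for the Max for Live devices on UDP port 9000.
BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher).serve_forever()
```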


The two musicians were both running Live: James using his EWI and a live MIDI looping setup to control the four top manuals of the organ, and Tristan controlling the pedal manual with a Rock Band keytar while also running drums and percussion out to the PA.  The note information from the two musicians was passed over to me, also running Live, plus VIZZable, where I was able to visualise the note information, mix it with a live camera feed and send the resulting video out to two projectors, piping through MadMapper and Quartz Composer along the way.

[organ setup diagram]

The biggest hurdle we had to overcome was the organ getting stuck notes.  The organ would stick on a particular note and we'd need to turn it off and on again to get it going, a process that took five to ten minutes.  This turned out to be caused by flooding the organ with too many sysex messages in too short a time.  By queueing the messages and then emptying the queue at a steady rate we were able to open and close stops reliably.
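A minimal sketch of that fix, assuming a mido-style output port – the real version lives in the Max patch, and the 20 ms drain interval stands in for a value we tuned by ear:

```python
# Instead of sending sysex the moment it's generated, push messages
# onto a queue and drain them at a steady rate so the organ is never
# flooded. Interval is illustrative; the real value was tuned by ear.
import queue
import threading
import time

import mido

midi_out = mido.open_output()     # the port wired to the organ
sysex_queue = queue.Queue()

def send_stop_change(msg):
    sysex_queue.put(msg)          # callers never touch the port directly

def drain(interval=0.02):
    while True:
        midi_out.send(sysex_queue.get())  # blocks until a message arrives
        time.sleep(interval)              # steady rate, no bursts

threading.Thread(target=drain, daemon=True).start()
```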

The end result of all this coercing of old and new technology was a 40-minute improvised audio-visual set intended to bemuse and perplex.  If you have a spare 40 minutes you can view the entire set and judge for yourself whether we were successful.

luminesce.

[Luminesce poster]

For the past few weeks I've been working with Gian Slater and her choir Invenio to produce a new audio-visual work called Luminesce.  The concept for the show is an extension of my Concerto for Light Sculpture piece: seven singers are arranged in a line and projected onto.  Each singer's voice controls exactly what is projected onto them, and Gian has arranged the music in such a way as to create emergent patterns of light across the singers.  The show debuts next week at The Guild Theater, Melbourne.  If you're a Melbournite and like pretty music, shiny lights and/or technical technology you really should come!

Book tickets here
Facebook event is here

For the nerds who like to know what’s happening under the hood, read on.

The overall architecture for the show is something like this: each of the singers has a microphone that's fed into a MacBook running Ableton.  Each Ableton track has a Max for Live device measuring the input's amplitude, converting it into a float between 0 and 1, and beaming that via OSC over to a Windows machine running Derivative's TouchDesigner.

[architecture diagram]
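For a rough idea of what those devices do, here's a Python sketch of the same job using sounddevice and python-osc.  The real amplitude followers are Max for Live devices, not Python, and the target address, gain scaling and /voice/n OSC scheme are assumptions.

```python
# Per-channel amplitude follower: RMS level per voice, squeezed into
# 0..1 and sent over OSC. Address, port and scaling are illustrative.
import numpy as np
import sounddevice as sd
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("192.168.1.10", 8000)  # the TouchDesigner machine

def callback(indata, frames, time_info, status):
    for ch in range(indata.shape[1]):
        # RMS amplitude, roughly scaled then clipped into 0..1.
        level = float(min(1.0, np.sqrt(np.mean(indata[:, ch] ** 2)) * 4))
        client.send_message(f"/voice/{ch}", level)

# Seven singers, seven input channels.
with sd.InputStream(channels=7, callback=callback):
    sd.sleep(10_000_000)
```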

Luminesce is the first project I've used Touch for, and coming from Max/MSP I've found it super easy to pick up.  Each node in Touch lets you see exactly what it's doing, and the environment is text-based where it's better to be.  It's probably most similar to Quartz Composer, but more mature, flexible and usable.  It also feels nice and sci-fi, the way you zoom in and out of nodes – very Minority Report.  I think I'll be using it a lot more in the future.

The onscreen UI I've built for the show has controls for selecting colour schemes, controlling colour levels and a soft border, as well as meters showing the amplitude of each voice coming in and another set showing how each voice channel is being processed (boosted, squashed or clamped – see the sketch below).  UI building seems to be a strong suit of Touch too.  It's almost a cross between Max and Java/Swing, with containers and panels but also the sliders and meters you need for media.  Although, no prebuilt piano keyboards.  It is absolutely possible to build that sort of thing in Touch, but for audio and MIDI generation/processing Max still carries the torch.

[UI screenshot]
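In case "boosted, squashed or clamped" sounds hand-wavy, here's a guess at the kind of per-channel shaping involved – not the actual Touch network, just the shape of the maths:

```python
# Hypothetical per-voice level shaping: boost lifts a quiet singer,
# the exponent bends the curve (< 1 lifts quiet values, > 1 pushes
# them down), and the clamp keeps everything in a safe range.
def shape(level, boost=1.0, squash=1.0, floor=0.0, ceil=1.0):
    level *= boost                       # boost: raise a quiet voice
    level = level ** squash              # squash: bend the response curve
    return max(floor, min(ceil, level))  # clamp into [floor, ceil]
```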

For Luminesce I have a dozen or so scenes built from reactive geometry and shader effects that I can fade in and out throughout the show, but other than that everything is driven by the data provided by the singers.
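Inside Touch the glue for that is small – something like a CHOP Execute callback pushing each voice's level onto a scene parameter.  The operator names below are hypothetical; the real network is wired differently.

```python
# TouchDesigner CHOP Execute DAT callback (a sketch, not the real
# network). Each incoming channel is named after a voice, e.g.
# 'voice3', and drives the opacity of that singer's layer.
def onValueChange(channel, sampleIndex, val, prev):
    # 'beam1'..'beam7' are hypothetical Level TOPs, one per singer.
    op('beam' + channel.name[-1]).par.opacity = val
```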



MIDIpress

I discovered there's a shortage of keystroke-to-MIDI converters for Mac, so I made MIDIpress.  To use it…

  1. Download MIDIpress
  2. Select your keyboard from the menu on the left.  This might take some trial and error – when you find it, the yellow button next to the menu will flash whenever you hit a key.  Mine's "Apple Internal Keyboard / Trackpad 4".
  3. Choose your MIDI output settings.  "from midipress 1" is a safe bet.
  4. Make sure your target application (Ableton, FL Studio etc.) has your chosen MIDI port active in its preferences.
  5. Set your velocity and note length.
  6. Do whatever! You can control any MIDI-mappable parameter or jam as you type.

Here's the Max patch for the nerdlingers.
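And if you don't have Max, here's a rough Python equivalent of the idea using pynput and mido.  The a–z note mapping and fixed velocity are arbitrary choices, and note length here comes from key release rather than the patch's note-length setting.

```python
# Keystroke-to-MIDI sketch (the real MIDIpress is a Max patch).
# Opens a virtual MIDI port and plays a note per letter key.
import mido
from pynput import keyboard

out = mido.open_output('from midipress 1', virtual=True)  # mac/linux
VELOCITY = 100

def note_for(key):
    # Map a-z onto MIDI notes 60-85; ignore everything else.
    char = getattr(key, 'char', None)
    if char is None:
        return None
    char = char.lower()
    if 'a' <= char <= 'z':
        return 60 + ord(char) - ord('a')
    return None

def on_press(key):
    # Note: OS key-repeat will retrigger held notes in this sketch.
    n = note_for(key)
    if n is not None:
        out.send(mido.Message('note_on', note=n, velocity=VELOCITY))

def on_release(key):
    n = note_for(key)
    if n is not None:
        out.send(mido.Message('note_off', note=n))

with keyboard.Listener(on_press=on_press, on_release=on_release) as l:
    l.join()
```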