For several months throughout 2013, Virtual Proximity (AKA James Annelsey), Tristan Courtney and I would hang out in the Melbourne Town Hall devising new ways of using computers to pump vast amounts of air through three-storey-high steel pipes. My particular role in all of this was to find a convenient way for the other two to send sysex (the nerdiest of MIDI commands) to the organ via Ableton Live. This is not something Live usually likes to do and, compounding the issue, building-sized MIDI-controlled instruments can be temperamental.
The Melbourne Town Hall Organ is the largest instrument in the southern hemisphere, with almost 8,000 pipes plus bells and drums. The instrument is retrofitted with a MIDI system by SSOS. This system can send and receive note information on separate channels for each manual, as well as CCs for controlling swell. The stops are opened and closed using sysex messages, which is the complicating factor when working from within Live. The system I ended up devising was a combination of Max for Live devices recreating the stops of each manual, which would send sysex messages over to a separate Max patch via UDP/OSC. This patch would then forward the messages to the instrument. In this way we were able to step around Live's sysex limitations.
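The bridge can be sketched in a few lines of code. This is a hypothetical reconstruction, not our actual patches: the OSC address `/organ/sysex`, the sysex byte layout and the stop numbers are all illustrative stand-ins (the real SSOS message format isn't shown here). The idea is simply that sysex bytes are wrapped in an OSC blob and fired over UDP to a helper that unwraps and forwards them.

```python
# Sketch of sending sysex over UDP/OSC (hypothetical addresses and
# message format; the real SSOS layout differs).
import socket
import struct

def osc_blob_message(address: str, data: bytes) -> bytes:
    """Build a minimal OSC message with a single blob argument."""
    def pad(b: bytes) -> bytes:
        return b + b"\x00" * (-len(b) % 4)   # OSC pads everything to 4 bytes
    return (pad(address.encode() + b"\x00")  # address pattern
            + pad(b",b\x00")                 # type tag: one blob
            + struct.pack(">i", len(data)) + pad(data))

def stop_sysex(stop: int, on: bool) -> bytes:
    # Placeholder sysex frame: F0 <manufacturer> <stop> <state> F7.
    # The organ's actual stop message bytes would go here.
    return bytes([0xF0, 0x7D, stop & 0x7F, 0x01 if on else 0x00, 0xF7])

def send_stop(sock, host, port, stop, on):
    sock.sendto(osc_blob_message("/organ/sysex", stop_sysex(stop, on)),
                (host, port))

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_stop(sock, "127.0.0.1", 9000, stop=12, on=True)   # open stop 12
```

On the receiving side, a plain Max patch (which has no trouble with sysex) unpacks the blob and pushes the bytes straight out its MIDI port.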
The two musicians were both running Live: James used his EWI and a live MIDI looping setup to control the four top manuals of the organ, while Tristan controlled the pedal manual with a Rock Band keytar and also ran drums and percussion out to the PA. The note information from the two musicians was passed over to me, also running Live and VIZZable, where I was able to visualise the note information, mix that with a live camera feed and send the resulting video out to two projectors, piping it through MadMapper and Quartz Composer along the way.
The biggest hurdle we had to overcome was the organ getting stuck notes. The organ would stick on a particular note and we'd need to turn it off and on again to get it going, a process that took five to ten minutes. This turned out to be caused by flooding the organ with too many sysex messages in too short a time. By queueing the messages and then emptying the queue at a steady rate we were able to open and close stops reliably.
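The fix amounts to a rate-limited queue between the performers and the organ. Here's a minimal sketch of that idea; the 50 ms default interval is an illustrative value, not the rate we actually settled on:

```python
# Queue sysex messages and drain them at a steady rate so the organ
# never sees a burst it can't keep up with (interval is illustrative).
import queue
import threading
import time

class SysexThrottle:
    def __init__(self, send, interval=0.05):
        self._q = queue.Queue()
        self._send = send          # callable that actually emits the bytes
        self._interval = interval  # seconds between messages
        threading.Thread(target=self._drain, daemon=True).start()

    def push(self, msg: bytes):
        self._q.put(msg)           # cheap; never blocks the performance

    def _drain(self):
        while True:
            self._send(self._q.get())   # blocks until a message arrives
            time.sleep(self._interval)  # pace the output

sent = []
throttle = SysexThrottle(sent.append, interval=0.01)
for stop in range(8):                   # a burst of 8 stop changes...
    throttle.push(bytes([0xF0, 0x7D, stop, 0xF7]))
time.sleep(0.3)                         # ...arrives spread out over time
```

Turning on a whole division at once becomes a burst of pushes that arrives at the instrument as an orderly trickle.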
The end result of all this coercing of old and new technology was a 40-minute improvised audio-visual set intended to bemuse and perplex. If you have a spare 40 minutes you can view the entire set and judge for yourself as to whether we were successful.
Luminesce is a collaboration between myself and Gian Slater's Invenio. It was first performed over two nights in July 2013 at The Guild Theatre, Melbourne, Australia, and we plan to expand the project for new performances later this year.
Video of the performances below and more details here.
For the past few weeks I've been working with Gian Slater and her choir Invenio to produce a new audio-visual work called Luminesce. The concept for the show is an extension of my Concerto for Light Sculpture piece: seven singers are arranged in a line and have light projected onto them. Each singer's voice controls exactly what is projected onto them, and Gian has arranged the music in such a way as to create emergent patterns of light across the singers. The show is debuting next week at The Guild Theatre, Melbourne. If you're a Melbournite and like pretty music, shiny lights and/or technical technology you really should come!
For the nerds who like to know what’s happening under the hood, read on.
The overall architecture for the show is something like this: each of the singers has a microphone that's fed into a MacBook running Ableton. Each Ableton track has a Max for Live device measuring the input's amplitude, converting it into a float between 0 and 1 and beaming that via OSC over to a Windows machine running Derivative's TouchDesigner.
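The per-voice feed could look something like the sketch below. To be clear, this is a reconstruction under assumptions: the OSC address `/voice/1/amp`, the -60 dB floor and the RMS window are all illustrative, not the show's actual values. The shape is the same, though: reduce a block of samples to a level, map it into 0..1, send it as an OSC float.

```python
# Sketch of one voice channel: RMS level -> 0..1 float -> OSC over UDP
# (address and dB range are assumptions, not the show's real settings).
import math
import socket
import struct

def rms(samples):
    """Root-mean-square level of a block of samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def to_unit(level, floor_db=-60.0):
    """Map a linear level to 0..1 over a fixed dB range."""
    if level <= 0.0:
        return 0.0
    db = 20.0 * math.log10(level)          # 1.0 -> 0 dB, quieter -> negative
    return min(1.0, max(0.0, 1.0 - db / floor_db))

def osc_float(address: str, value: float) -> bytes:
    """Minimal OSC message carrying a single float argument."""
    def pad(b): return b + b"\x00" * (-len(b) % 4)
    return (pad(address.encode() + b"\x00") + pad(b",f\x00")
            + struct.pack(">f", value))

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
amp = to_unit(rms([0.1, -0.2, 0.15, -0.05]))   # one block of samples
sock.sendto(osc_float("/voice/1/amp", amp), ("127.0.0.1", 7000))
```

On the TouchDesigner side, an OSC In CHOP picks these floats up by address and they drive the projected geometry directly.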
Luminesce is the first project I've used Touch for and, coming from Max/MSP, I've found it super easy to pick up. Each node in Touch lets you see exactly what it's doing, and the environment is text-based where it's better to be. It's probably most similar to Quartz Composer but more mature, flexible and usable. It also feels nice and sci-fi, the way you zoom in and out of nodes – very Minority Report. I think I'll be using it a lot more in the future.
The onscreen UI I've built for the show has controls for selecting colour schemes, controlling colour levels and a soft border, as well as meters showing the amplitude of each incoming voice and another set showing how each voice channel is being processed (boosted, squashed or clamped). UI building seems to be a strong suit of Touch too. It's almost a cross between Max and Java/Swing, with containers and panels but also sliders and meters, the sort of things you need for media. There are no prebuilt piano keyboards, though. It is absolutely possible to build that sort of thing in Touch, but for audio and MIDI generation/processing Max still carries the torch.
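For the curious, boost/squash/clamp can be expressed as one small shaping function. The parameter names and curves below are illustrative guesses at that kind of per-channel processing, not the show's actual settings:

```python
# One plausible shape for per-channel processing of a 0..1 amplitude
# (illustrative parameters, not the show's real values).
def shape(x, gain=1.0, squash=1.0, clamp=1.0):
    """gain: linear boost; squash: exponent < 1 lifts quiet signals;
    clamp: hard ceiling on the output."""
    x = max(0.0, x) * gain       # boost
    x = x ** squash              # squash (compress) the response curve
    return min(x, clamp)        # clamp to a ceiling

# e.g. lift a quiet voice and cap it below full brightness:
shape(0.1, gain=2.0, squash=0.5, clamp=0.8)
```

The meters in the UI then show both the raw input and the shaped value, so it's obvious at a glance which stage is doing the work on each voice.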
For Luminesce I have a dozen or so scenes built from reactive geometry and shader effects that I can fade in and out throughout the show, but other than that everything is driven by the data provided by the singers.
… and we're back. While I was blog-absent some fun things have happened. VIZZable was the 10th most-read story on CDM last year. It's exciting news for a little set of plugins in the relatively obscure Max for Live video scene, and encouraging that others are as interested in using Ableton as an audio-visual environment as I am. Big shout-out to the great folk on the Jitter in Max for Live group!
Sound 2 Light
Some documentation of last year's Sound 2 Light has surfaced. Sound 2 Light was a tech/media-arts event held in Hobart where artists were partnered up and given a short period of time to come up with an installation or performance. I was partnered with venerable techno producer Sam Gregory and we created a 20-minute AV piece revolving around strange hybrid animals, all mapped onto four large rectangular panels. I also put together two smaller installations: "Homunculus II", a teleoperated robot visible at the start of the video, and a colourful, kaleidoscopic, audio-reactive projected piece that evolved throughout the night. Watch the video below and see if you can spot them. You can also listen to the live broadcast that went out on the night.
Last year I was also lucky enough to VJ the Bass Arena at Melbourne's Stereosonic. It was the longest set I've played so far – about 9 hours – and I used the opportunity to try out some new masking and feedback techniques as well as the usual fare of live camera feeds, glitches, datamoshing and rainbow meltiness. While I was playing for Danish electropop singstress Lucy Love, her manager approached me and asked if there was anything wrong with the video signal. I gestured to my PS3 controller as if to say, nope, it's OK, it's supposed to be like that, and before I could launch into a deep discussion of how visuals, when tightly synced, can reinforce sound and give it more impact, he said "Don't do that". I quickly pigeonholed him as a square, the kind I usually enjoy confusing, but to keep him happy I kept the rest of the set less interesting/confusing. Anyhoo, some videos of the day have popped up on YouTube. Datsik was particularly awesome.
A few Zeal shows are popping up in Radelaide in the coming weeks as well as some remix projects to get back into the music side of things. I’m building another controller similar to Archaeopteryx but this time for feet:
I’m also exploring Augmented Reality with a view to put together an immersive AR installation at some stage in the future. I hope I’m not alienating all you normal people as this is possibly the geekiest video I’ve uploaded to date:
Alrighty, you’re up to date. I’m back to blogging and tweeting so sit tight, there’s more soon.