Luminesce is a collaboration between myself and Gian Slater’s Invenio. It was first performed over two nights in July 2013 at The Guild Theatre, Melbourne, Australia, and there are plans to expand the project for new performances later this year.
Video of the performances below and more details here.
Visualising four channels of monophonic MIDI with VIZZable 2.1
VIZZable has a new home!
For more info and to download VIZZable, go to
For the past few weeks I’ve been working with Gian Slater and her choir Invenio to produce a new audio-visual work called Luminesce. The concept for the show is an extension of my Concerto for Light Sculpture piece: seven singers are arranged in a line and are projected onto. Each singer’s voice controls exactly what is projected onto them, and Gian has arranged the music in such a way as to create emergent patterns of light across the singers. The show debuts next week at The Guild Theatre, Melbourne. If you’re a Melbournite and like pretty music, shiny lights and/or technical technology, you really should come!
For the nerds who like to know what’s happening under the hood, read on.
The overall architecture for the show is something like this: each singer has a microphone that’s fed into a MacBook running Ableton. Each Ableton track has a Max for Live device measuring the input’s amplitude, converting it into a float between 0 and 1 and beaming it via OSC over to a Windows machine running Derivative’s TouchDesigner.
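To make the per-track step concrete, here’s a minimal Python sketch of the maths: a linear level mapped onto a 0–1 float on a decibel scale, paired with an OSC-style address. The real device is built in Max for Live, and the `-60 dB` floor and the `/voice/N/amp` address pattern are my assumptions, not the show’s actual values.

```python
import math

def level_to_unit(rms, floor_db=-60.0):
    """Convert a linear RMS level to a 0-1 float on a decibel scale.
    Anything at or below floor_db (assumed -60 dB here) clamps to 0."""
    if rms <= 0.0:
        return 0.0
    db = 20.0 * math.log10(rms)
    return min(max((db - floor_db) / -floor_db, 0.0), 1.0)

def osc_message(channel, rms):
    """Pair an OSC-style address (address pattern is made up) with the value."""
    return ("/voice/%d/amp" % channel, level_to_unit(rms))

print(osc_message(1, 1.0))  # full scale maps to 1.0
```

In TouchDesigner the incoming floats can then be read straight off an OSC input and fanned out to whichever parameters each singer’s channel should drive.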
Luminesce is the first project I’ve used Touch for, and coming from Max/MSP I’ve found it super easy to pick up. Each node in Touch lets you see exactly what it’s doing, and the environment is text-based where it’s better to be. It’s probably most similar to Quartz Composer, but more mature, flexible and usable. It also feels nice and sci-fi, the way you zoom in and out of nodes – very Minority Report. I think I’ll be using it a lot more in the future.
The onscreen UI I’ve built for the show has controls for selecting colour schemes, controlling colour levels and a soft border, as well as meters showing the amplitude of each incoming voice and how each voice channel is being processed (boosted, squashed or clamped). UI building seems to be a strong suit of Touch too. It’s almost a cross between Max and Java/Swing, with containers and panels but also sliders and meters – the sort of things you need for media. There are no prebuilt piano keyboards, though. It is absolutely possible to build that sort of thing in Touch, but for audio and MIDI generation/processing Max still carries the torch.
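The three processing modes the meters display could be sketched like this. The names (boosted, squashed, clamped) come from the post; the actual curves used in the show are unknown, so these are plausible stand-ins on 0–1 floats, with the gain, exponent and window values picked purely for illustration.

```python
def boost(x, gain=2.0):
    """Multiply and cap at 1.0 -- lifts quiet voices."""
    return min(x * gain, 1.0)

def squash(x, exponent=0.5):
    """Power curve that compresses the range (square root by default),
    pulling mid-level values up toward the top of the meter."""
    return x ** exponent

def clamp(x, lo=0.1, hi=0.9):
    """Hard-limit the value to a window so it never fully disappears
    or fully saturates."""
    return min(max(x, lo), hi)

print(boost(0.25), squash(0.25), clamp(0.95))
```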
For Luminesce I have a dozen or so scenes built from reactive geometry and shader effects that I can fade in and out throughout the show, but other than that everything is driven by the data provided by the singers.
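The mix of hand-controlled fades and singer-driven data can be illustrated with a toy sketch: a linear fade level gates a voice-driven parameter. The linear ramp and the function names are assumptions for illustration, not the show’s actual logic.

```python
def fade_level(t, start, length=2.0):
    """Linear 0-1 ramp beginning at time `start` (seconds), lasting
    `length` seconds; flat at 0 before and 1 after."""
    return min(max((t - start) / length, 0.0), 1.0)

def scene_param(t, fade_start, voice_amp):
    """A scene parameter: the singer's live amplitude, gated by the
    operator-controlled fade."""
    return fade_level(t, fade_start) * voice_amp

print(scene_param(3.0, 1.0, 0.5))  # fade fully open -> passes 0.5 through
```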
Shredding with a Guitar Hero axe:
The clip I put together for Agnes Kain a few months ago has just gone live. It’s for their track “Still Grey”, lifted from their new record “Before We Finally Meet”, out November 23rd.
This was a really fun project – I’ve become quite fond of the lonely moon-bot. The whole thing took about two months of steady work and it’s my first primarily computer-animated music video – quite the time saver, but I think I still prefer the imperfections and unpredictability that come from physical techniques.