What happens when an orchestra sends its virtual clone on the road and lets punters remix its performance?
Text: Mark Davie
The big money is on the road. It’s what everyone’s saying: that albums are just an audio press release for the tour and the t-shirt. But night after night of playing the same repertoire can be a drag, and you can only ever be in one place at a time, which severely limits your earning potential. If only there were a way you could multiply yourself, be in more than one place at a time.
The Australian Chamber Orchestra (ACO) and Mod Productions have gone one better. They’ve figured out how to play in multiple places at once, without physically being in any one of them.
The ACO first got in contact with Michela Ledwidge, Director of Mod Productions, after she gave a presentation at an ABC event on her history with re-mixable film and interactive storytelling. It was exactly what the ACO was after, having been researching ways it could expand its audience.
Michela: “I was commissioned to spend a week looking to see if there was an opportunity to leverage the kind of interactive film projects I’d done in the past. Our framework was all about how to get the audience involved in a high-end audiovisual experience where you don’t have to interact, but if you do interact, there’s a real re-mixable potential.”
There were several criteria the solution had to satisfy. The obvious one was broadening the reach of an orchestra that can only ever be in one place at one time. The costs of touring an ensemble are also enormous, so a more cost-effective way of distributing its performances was high on the list. Lastly, the ACO’s audience is generally longer in the tooth, so any means of bringing younger audiences into play wouldn’t be sniffed at either.
The result of all that research is the Virtual ACO roadshow: a travelling kit of seven video projectors (two performers per projector), a 7.1 (minimum) surround speaker system, two high-spec PCs, and an iPad on a stand that lets attendees shine a virtual spotlight on specific performers, as well as offering detail about the four-piece repertoire of Bach, Grieg, Smalley and Piazzolla.
The numbers are promising. A typical 10-day ACO tour might ring up costs in excess of $100,000, whereas Michela estimates the total cost of putting together the open-ended Virtual ACO at about $400,000. Mod Productions absorbed a lot of the labour costs because it now owns the technological intellectual property, which means Mod can use the staging format for other, similar shows in the future.
So far, the Virtual ACO has performed at the Gold Coast Regional Arts Centre and Swan Hill Gallery. But the plan is to keep the touring package out on the road for at least a couple of years, with plenty more venues showing interest in staging the concept. To stage the show, the venue hires the kit off the ACO, and Mod Productions has got it down to a one-day bump-in and a three-hour bump-out. If it keeps going well, and funding permits, a second kit might be pressed into action, and more repertoire recorded, to keep the Virtual ACO ticking over for years to come.
UNDER THE SPOTLIGHT
When the Virtual ACO rolls into town, the performance is projected on the walls of a room with the speaker system distributed between each projection. The trick is precisely overlaying the phantom audio image on each player. Michela explains how the system integrates with the projections on the gallery walls: “In the centre of the space is a music stand with an embedded tablet. As each piece starts, a full body image of each player in that piece appears on the tablet screen. By swiping your finger across the screen you toggle one or more of the players to be spot-lit on the walls.
“When a player is spot-lit, their lighting comes up on the main displays and the audio is remixed automatically so the audio from the spot-lit players is brought to the fore. It’s quite a subtle effect because it’s not simply turning the volume from the video up to 100%. It takes into account different weightings depending on what instruments you’re selecting and the number of players. It’s quite a sophisticated, interactive audio patch that is doing the dynamic changes.”
It’s quite a big risk, putting the control of your mix in the hands of laypeople, but Michela has a vision for the future, and re-mixable performances are a big part of growing interaction with audiences. Michela: “The ACO is an adventurous bunch that took a leap of faith to allow its recordings to be remixed by punters. We’ve put parameters around what the remix experience is, but it’s exposing members of the ensemble in ways that they would never otherwise be exposed. So there’s still a degree of bravery required to enter this space and it’s not been the easiest vision to sell. I’m biased, but from my perspective, it’s definitely the next phase.”
THE ACO IN A BOX
Michela: “The show is currently running off two very high-spec PCs, each with an RME MADI card and an Nvidia Quadro K5000 4 x HD video card. At the core of the engine is a program called TouchDesigner, which manages up to 39 different videos’ worth of material and triggers all the audio as required. We built layers of software on top of that for our logic, and the show management is all controlled by a web service.
“On the second machine we’ve got Plogue Bidule with a range of plug-ins for sweetening and acoustic modelling tuned for each venue. The show control messages come to Plogue Bidule via TouchDesigner, and TouchDesigner gets its messages from the audience via a custom-built iPad app.
“There’s a huge amount of traffic going between the audio and video PCs. A MADI connection transfers all the audio, and some additional logic inside Plogue manages all the OSC messages coming from the audience interface to figure out how a particular mix can occur without drowning out the violins with the bass, for instance.”
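The article doesn’t spell out the wire format of those OSC messages, but OSC itself is a simple binary protocol. As a rough illustration of what the video PC might be unpacking from the audience interface, here’s a minimal parser for an assumed `/spotlight` message carrying a player index and an on/off flag (the address scheme is hypothetical, not Mod’s actual one):

```python
import struct

def _osc_string(data, offset):
    """Read a null-terminated OSC string padded to a 4-byte boundary."""
    end = data.index(b"\x00", offset)
    text = data[offset:end].decode("ascii")
    return text, (end + 4) & ~3  # skip padding to the next 4-byte boundary

def parse_osc(data):
    """Parse a basic OSC message containing only int32 ('i') arguments."""
    address, offset = _osc_string(data, 0)
    tags, offset = _osc_string(data, offset)
    args = []
    for tag in tags.lstrip(","):
        if tag != "i":
            raise ValueError(f"unsupported OSC type tag: {tag}")
        args.append(struct.unpack_from(">i", data, offset)[0])  # big-endian int32
        offset += 4
    return address, args

# e.g. the iPad toggling player 3 into the spotlight:
msg = b"/spotlight\x00\x00" + b",ii\x00" + struct.pack(">ii", 3, 1)
```

In practice a library would handle this, but the point stands: each tap on the tablet is just a tiny datagram the show-control logic can route on.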
Simon Lear, the ACO’s head engineer, picks up the trail: “Plogue Bidule is like Max/MSP but at a higher level. If you want an eight-channel mixer you don’t have to build it from scratch – just grab a device and drop it in. It’s a visual, modular audio-programming interface. It takes the MADI input, and, based on the conditions, it does various things with it. The real hub of the audio side is a plug-in from Flux called IRCAM Spat – a surround spatialisation and reverb processor that generates the localisation and the reverb in real time. It’s hosted in Bidule as a VST plug-in, and it comes out of the MADI card to an SSL D/A converter and goes straight to the speakers.”
Each time the kit is rolled out in a different gallery or performance venue, the system has to be re-tuned to make sure the audio localisation matches the new positioning of the projections and the players within them. Simon: “The audio is a phantom image between the speakers – it’s a virtual position so we have to tune that localisation to each space and physical setup. At that point I hand over to sound technician Felix Abrahams; that’s his department. He installs it and does a fantastic job of tuning the system and its localisation.”
The automatic mix is not just a matter of turning one player up and another down; it fluidly transitions between selections, and figures out what to do when the full orchestra gets switched back in. Simon: “If you’re going from a full orchestra to one player spotlight, there’s going to be a natural drop in level which could make it feel less impressive. So when you go from a full mix to a single person, you need a level boost, and then if you add people you still want a boost but less of a one for each player. There’s no compression or limiting going on, it’s just manipulating the levels based on conditions, and all the ambience is generated in real-time. As more people are added, it dynamically adapts – it’s like a matrix of gain settings.”
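The article doesn’t publish the actual gain matrix, but the behaviour Simon describes – a full boost for a lone spotlit player, shrinking as more are added – can be sketched with a simple logarithmic taper. The 6dB ceiling and the 17-player ensemble size here are illustrative assumptions, not the ACO’s real values:

```python
import math

def spotlight_boost_db(n_selected, n_players=17, max_boost_db=6.0):
    """Boost (in dB) applied to spotlit players.

    A single spotlit player gets the full boost; each extra player
    shrinks it, tapering back to unity gain for the full ensemble.
    """
    if n_selected <= 0 or n_selected >= n_players:
        return 0.0  # nobody (or everybody) spotlit: plain full mix
    taper = 1 - math.log(n_selected) / math.log(n_players)
    return max_boost_db * taper

def db_to_gain(db):
    """Convert decibels to a linear gain factor."""
    return 10 ** (db / 20)
```

Whether Bidule uses a curve like this or a hand-tuned lookup table isn’t stated; the point is that the boost is conditional on the selection, with no compression or limiting in the path.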
SPLITTING UP THE BAND
The ability to manipulate individual recorded parts in an orchestra must set off warning bells for the classical engineers out there. It’s probably obvious by now, but the recording process didn’t follow any of the standard orchestral miking techniques – no Decca trees fanning out above the ensemble. The orchestra had to be captured individually for the system to work, and not only that, each performance had to be filmed against a green screen.
It was a conundrum, because orchestral players aren’t used to operating in isolation, and using a click track would be a leap too far in the wrong direction. In the end, the orchestra was set up on a film sound stage. Each player was stationed on a plinth a couple of metres apart, with the two main desks of violins and violas facing each other, while the cellos and bass rounded out the horseshoe configuration.
Simon: “The sound stage was a nice, big, open room with quite a bit of absorption built into it, but it didn’t sound like a great hall or studio. Much more than that, it was a real challenge for the players to play physically separated from each other because they’re accustomed to standing next to each other.”
The players weren’t the only ones in an unusual situation. Simon had to figure out how he was going to capture each instrument without sacrificing overall tone and dynamics.
Simon: “I individually miked them with Schoeps MK4 small diaphragm cardioid condensers, and DPA 4061 miniature mics on Shure radio packs, which we really didn’t end up using. They were a bit of an aid in post-production and I just wanted them for a backup. I had a few Beyer M160 hypercardioid ribbon mics on principals, that were more of a friendly frequency response I could use as a reference. But most of what you hear in the installation is just a straight single Schoeps per player.
“I knew from doing tests from a pilot with the quartet that I’d get about 20dB of separation between players. And that worked with the brief because the players didn’t want to be 100% isolated in the final installation and for some of the pieces that wouldn’t work at all anyhow.
“There was a settling period working out how much isolation there would be, because different departments had different views. But it found its own way, and in the end the spill from the mics fell into the groove. It wasn’t too destructive.
“We hired a redundant recording system based on API 8MX2 preamps split to dual Tascam X48 [digital multitrack] recorders, and we used a Midas console as a monitoring hub.”
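Simon’s 20dB separation figure is worth unpacking. In linear terms it means spill from a neighbouring player arrives at a tenth of the amplitude – a hundredth of the power – of the player’s own signal. A quick check of the arithmetic:

```python
def db_to_amplitude_ratio(db):
    """Linear amplitude ratio for a given attenuation in dB."""
    return 10 ** (-db / 20)

def db_to_power_ratio(db):
    """Linear power ratio for a given attenuation in dB."""
    return 10 ** (-db / 10)

# 20dB of separation between adjacent players:
spill_amplitude = db_to_amplitude_ratio(20)  # one tenth the amplitude
spill_power = db_to_power_ratio(20)          # one hundredth the power
```

Low enough that the remaining bleed reads as natural ensemble ambience rather than a smeared image, which is why the spill could “fall into the groove” instead of fighting the spotlight mix.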
Once the best take was chosen, Lear did a bit of post-production cleanup in Magix Sequoia – getting rid of noises, thumps and clicks – but didn’t use much EQ on the individual instruments. The main EQ’ing happens during setup to attain the right room balance. And because the balance can be remixed on the fly by punters, a lot of the effort went into setting rules for the automix logic.
It’s a strange way to record an orchestra. But allowing punters to participate in a performance by spotlighting players and following along with the score has brought down the barriers for people unlikely to set foot inside a concert hall. But has it replaced the ACO’s touring schedule? Is everyone now sitting at home without a job? Simon Lear says the orchestra is going “120%”. Better than ever. The Virtual ACO is just another string to the bow, a way of doubling the coverage of an orchestra already spread thin. Now if only we could all do that.