Imagine attending a live concert where the musicians aren’t even in the performance space. Instead, they’re 500 miles away, playing their hearts out, while a holographic sound image of the music is mapped and then recreated in the performance space for your listening pleasure. It sounds like science fiction, but it’s not, thanks to technology being developed by Austrian acousticians.
For Franz Zotter and Matthias Frank of the Institute of Electronic Music and Acoustics (IEM) in Graz, Austria, it amounts to something akin to teleporting a musical instrument. They use large spherical microphone arrays that surround the instrument to record holographic sound images — a kind of 3D aural fingerprint — of any musical instrument.
Such an image captures not just tone, frequency, and other sound properties, but also directionality. Sound engineers can take that image, feed it into a corresponding compact spherical loudspeaker array (dubbed the icosahedron, after its 20-sided shape), and recreate the sound exactly, right down to how the sound waves reflect off the walls of the performance space. “The new kind of holographic sound imagery is a key technology used to reproduce a fully convincing experience of a musical instrument within arbitrary rooms it is played in,” Zotter explained.
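To get a feel for how a directional sound image can be captured and replayed, here is a minimal sketch of the general idea behind this kind of system: measurements from many microphone directions are fitted to a spherical-harmonic expansion (the math underlying Ambisonics), and the captured pattern is then re-evaluated at loudspeaker directions. The array geometries, the order-3 expansion, and the least-squares encoding are illustrative assumptions, not IEM's actual processing chain.

```python
import numpy as np
from scipy.special import sph_harm

def sh_matrix(order, theta, phi):
    """Real-valued spherical-harmonic basis evaluated at directions
    (theta = azimuth, phi = colatitude); one row per direction."""
    cols = []
    for n in range(order + 1):
        for m in range(-n, n + 1):
            Y = sph_harm(abs(m), n, theta, phi)
            if m < 0:
                cols.append(np.sqrt(2) * Y.imag)
            elif m == 0:
                cols.append(Y.real)
            else:
                cols.append(np.sqrt(2) * Y.real)
    return np.stack(cols, axis=1)

rng = np.random.default_rng(0)

# 64 pseudo-random microphone directions on a sphere
# (a stand-in for the surrounding microphone array)
mic_theta = rng.uniform(0, 2 * np.pi, 64)
mic_phi = np.arccos(rng.uniform(-1, 1, 64))

# A made-up instrument directivity: the pressure each microphone measures,
# synthesized here from an order-3 expansion with (3+1)^2 = 16 terms
true_coeffs = rng.standard_normal(16)
Y_mic = sh_matrix(3, mic_theta, mic_phi)
pressures = Y_mic @ true_coeffs

# Encode: least-squares fit of spherical-harmonic coefficients
# to the microphone measurements — the "holographic sound image"
coeffs, *_ = np.linalg.lstsq(Y_mic, pressures, rcond=None)

# Decode: evaluate the captured pattern at 20 loudspeaker directions
# (a stand-in for the icosahedral loudspeaker array)
spk_theta = rng.uniform(0, 2 * np.pi, 20)
spk_phi = np.arccos(rng.uniform(-1, 1, 20))
gains = sh_matrix(3, spk_theta, spk_phi) @ coeffs
```

With more microphones (64) than coefficients (16), the fit is overdetermined, so a pattern that genuinely lies within the expansion order is recovered essentially exactly; real systems must also contend with noise, array imperfections, and energy outside the chosen order.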
It also takes into account the posture of the player. This is particularly important for unplugged instruments, which can produce sounds with very different timbres depending on the orientation of the player. That can potentially throw even an experienced musician off his or her game.
For instance, Miles Davis famously used to play his trumpet facing away from the audience (this had more to do with personal preference than anything else). But directionality wasn’t a problem in Davis’ case, because a microphone was attached to the bell of the instrument, and a sound engineer would mix the output as it was reinforced through loudspeakers. His posture might have made a big difference in overall sound quality, however, had he been playing unplugged, with no sound reinforcement at all, according to Zotter.
It certainly makes for a nifty form of acoustical augmented reality.
“We once had a demo concert where a trombone player was playing inside our surrounding microphone sphere in Graz, and the holographic sound image was reproduced live in Paris,” Zotter told Gizmodo via email. That was back in 2010, for a technical audience at the 2nd International Symposium on Ambisonics and Spherical Acoustics.
Their technology is currently being used as a tool in computer music to project music into rooms. It could also be used to improve concert hall design, or to reduce aircraft noise in urban planning. But the most promising near-term application could be in virtual reality and gaming. “I watched a friend of mine playing GTA on PS4 and I was impressed by how many acoustical things were involved,” Zotter said. “Tire noise on different grounds, passing vehicles, birds, and insects.”
Ultimately, this work will help Zotter and Frank build better maps of the directionality of sounds from many different instruments (particularly in those tricky unplugged environments), letting musicians experience how their playing will sound in a virtual version of an unfamiliar room or performance hall. This is a goal that is also being pursued by other research groups, notably the Institute of Technical Acoustics in Aachen, Germany, according to Zotter.
So future trumpet players hoping to emulate Miles Davis could experience how their playing would sound in virtual recreations of all kinds of different performance spaces — odd postures and all.
Images: (top) A surrounding sphere of 64 microphones to record holographic sound images. (bottom) The icosahedron, a compact spherical array of 20 loudspeakers. Both images courtesy of Franz Zotter.