Tracer: Spatial Sound Composer
A VR composition tool for the 4DSOUND System

Fascinated by the intersection of cutting-edge technology and music-making, I’ve long been wondering what VR could bring to the process of sound design. How can it enhance that process as a new design tool?

After being invited to join the artist residents of 4DSOUND’s Spatial Sound Institute, Gábor Pribék and I have been working on a new tool that explores the relationship between the tangible process of drawing and spatial composition on the 4DSOUND System.

[Image: the 4DSOUND system inside MONOM]

What is 4DSOUND?

"4DSOUND is a collective exploring spatial sound as a medium. Since 2007, they have developed an integrated hardware and software system that provides a fully omnidirectional sound environment. In 2015, 4DSOUND founded the Spatial Sound Institute in Budapest, a permanent facility dedicated to the research and development of spatial sound."

In practice, the 4DSOUND system is a volumetric grid of omnidirectional speakers. Together, the grid creates a consistent sonic environment in which artists create site-specific sound pieces and performances. Driven by dedicated software and hardware, it lets anyone in the space perceive sound physically moving through the whole area while walking around freely, creating a unique experience from every point in the space.

Context

After my first-hand experience of the 4DSOUND system in Berlin in 2013, I was amazed by how immersive the environment is – much like virtual reality. Given the opportunity to participate in the residency program, we chose to approach our project differently than most residents: instead of creating a sound piece, we built a design tool that connects the artist (user) and the 4DS system.

We’ve been particularly interested in what mixed reality could bring to the process of creation on the 4DS, and what a visual layer would add to the whole experience. Our project is therefore a different take on the performer’s workflow.

[Image: Spatial Sound Institute listening experience. Photo: Georg Schroll]

Note: we opted for VR rather than mixed reality because of the current stage of headset development. Getting rid of the cables and allowing the user to navigate freely would be a huge plus, but a headset able to deliver the desired user experience is not yet available.

Current workflow

The most common interaction workflow is based on manually writing animations – XYZ coordinates or mathematical functions – as the names of MIDI clips in Ableton Live, on a track containing the 4D Max for Live (M4L) plugin. Most artists use Ableton for their live performances. Additionally, an iPad is available for real-time control of the engine.
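To make the clip-name idea concrete, here is a purely hypothetical illustration of how such a name could encode a position – the actual syntax the 4D M4L plugin expects is not reproduced here.

```python
# HYPOTHETICAL illustration only: an XYZ position encoded in a MIDI clip's
# name, e.g. "0.5 1.2 -0.3". The real 4D M4L naming syntax may differ.
def parse_clip_name(name: str) -> tuple[float, float, float]:
    """Parse a clip name like "0.5 1.2 -0.3" into an XYZ position."""
    x, y, z = (float(part) for part in name.split())
    return (x, y, z)
```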

Performers can also develop their own software and control surfaces that drive the 4DEngine directly – and that’s what we’ve done too, as sketched below.
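As an illustration of what driving the engine directly can look like, this minimal Python sketch sends a sound-source position over OSC. The host, port and address pattern are hypothetical placeholders, not the actual 4DEngine control protocol.

```python
# Minimal sketch of controlling a sound source's position over OSC.
# ASSUMPTIONS: the engine host/port and the "/source/1/xyz" address
# pattern are hypothetical placeholders, not the real 4DEngine API.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("192.168.1.10", 9000)  # hypothetical engine address

def set_source_position(source_id: int, x: float, y: float, z: float) -> None:
    """Send one position update for a sound source."""
    client.send_message(f"/source/{source_id}/xyz", [x, y, z])

set_source_position(1, 0.0, 2.5, 1.0)
```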

 

We observed that the current workflow lacks an immediate connection between the composer and the space. We wanted the artist to have tangible control over the process, so we chose a concept almost everyone is familiar with – drawing.

A drawing functions as a visual cue – the missing link between the performer and the sound travelling through space. It serves as an additional anchor, provides visual feedback and extends the human perception of sound.

 

How does it work?

Tracer consists of two parts: an Ableton live set that provides the sound, and a VR interface developed in Unity that translates movement in 3D space into messages for the 4D Engine. In VR, users draw paths in 3D and assign a sound track directly from their Ableton set to a Tracer that follows each path. The artist can then control various parameters of the playback.
Tracer therefore presents an extended, immersive interface for composers planning their composition on the 4DS System – a prototyping tool that lets artists design a spatial composition using freely drawn paths, as sketched below.
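Under the hood, a Tracer can be thought of as a point advanced along the drawn polyline at a user-set speed, with its position streamed to the engine on every frame. Here is a minimal Python sketch of that idea, assuming a loop play mode; it simplifies Tracer’s actual Unity implementation.

```python
import math

# A drawn path is a polyline: a list of (x, y, z) points captured from the
# controller while the user draws. Assumes at least two distinct points.
Path = list[tuple[float, float, float]]

def position_at(path: Path, distance: float) -> tuple[float, float, float]:
    """Point at a given arc-length distance along the path,
    wrapping around at the end (loop play mode)."""
    lengths = [math.dist(a, b) for a, b in zip(path, path[1:])]
    d = distance % sum(lengths)
    for (a, b), seg in zip(zip(path, path[1:]), lengths):
        if d <= seg:
            t = d / seg  # linear interpolation within this segment
            return tuple(a[i] + t * (b[i] - a[i]) for i in range(3))
        d -= seg
    return path[-1]  # numerical safety net

def tracer_position(path: Path, elapsed: float, speed: float):
    """A Tracer's position after `elapsed` seconds at a user-set speed."""
    return position_at(path, elapsed * speed)
```

On every frame, the app would then stream the result to the engine – e.g. `set_source_position(1, *tracer_position(path, elapsed, speed))`, using the hypothetical helper from the earlier sketch.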

 

Design process

We’ve been working on Tracer for the past six months. During that time we’ve iterated on multiple concepts and interaction principles. Here you can see the first alpha version, which we used without the 4D System – only with Unity and its built-in audio sources and physical sound emulation (which is incomparable to the actual system). Our aim was to prove the concept of drawing and its link to spatial sound perception. In the first version, you could only draw, assign sounds, and control speed and playback mode.

 

Interfaces

In the first prototype, we used purely 2D interfaces, and the whole interaction somewhat resembled using a smartphone or a tablet, always stuck to your secondary hand. A laser-pointer mechanic on the primary hand was used to control it.

We found the 2D interface limiting and not precise enough for controlling faders, so we progressed towards a spatial, physical interface that you can move and pin in physical space. We got rid of multi-screen navigation for assigning sounds, and reworked the faders to resemble the actual physical faders you know from hardware control surfaces – a sketch of the underlying mapping follows.
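Conceptually, such a spatial fader maps the position of the user’s grabbing hand onto the fader’s track in 3D. A minimal sketch of that mapping, with purely illustrative geometry – not Tracer’s actual implementation:

```python
# Sketch of a spatial fader: project the controller's grab point onto the
# fader's track (a 3D line segment) and normalise it to a 0..1 value.
Vec3 = tuple[float, float, float]

def fader_value(grab: Vec3, base: Vec3, tip: Vec3) -> float:
    """0.0 at the fader's base, 1.0 at its tip, clamped to the ends."""
    track = tuple(t - b for t, b in zip(tip, base))
    rel = tuple(g - b for g, b in zip(grab, base))
    t = sum(r * c for r, c in zip(rel, track)) / sum(c * c for c in track)
    return min(1.0, max(0.0, t))
```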

Navigation and interaction

After putting the VR headset on, users can draw paths in 3D space and then attach Tracers – sound-emitting objects – to these paths. Each Tracer plays the sound the user has assigned to it, and that sound is perceived as physically present in the space, relative to the user’s position.

[Image: Viewport]

We’ve also added more control over sonic and spatial parameters. It’s possible to control the volume of the track, its play state, the speed of the Tracer following the path, its play mode along the path, and the physical dimensions of the sound in space, as well as send the track through an effect chain.
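As an illustration, streaming these parameters to the engine could look like the sketch below; every address pattern and parameter name is a hypothetical placeholder, not Tracer’s or the 4DEngine’s actual API.

```python
# Illustrative parameter messages for one Tracer. All addresses and
# parameter names below are HYPOTHETICAL placeholders.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("192.168.1.10", 9000)  # hypothetical engine address

def set_tracer_param(tracer_id: int, param: str, value) -> None:
    """Send one parameter update for a Tracer."""
    client.send_message(f"/tracer/{tracer_id}/{param}", value)

set_tracer_param(1, "volume", 0.8)           # track volume
set_tracer_param(1, "speed", 1.5)            # speed along the path
set_tracer_param(1, "playmode", "pingpong")  # play mode along the path
set_tracer_param(1, "size", 2.0)             # physical dimensions of the sound
set_tracer_param(1, "send", 0.3)             # level into the effect chain
```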

During the design process, we looked at how other apps integrate interfaces and spatial navigation. Those of Tilt Brush and Gravity Sketch feel very natural, intuitive and straightforward. We decided not to reinvent the wheel and included similar navigation solutions in Tracer, making it more intuitive for users already acquainted with VR.

Presentation

We presented Tracer on the 2nd of September at the Spatial Sound Institute, during a full day dedicated to the project, when the public was invited to come and try Tracer on their own. We collected a lot of feedback from the participants, mostly regarding UX and potential features.

 

We're working on the documentation of Tracer at the moment. Check back soon for more content!

Credits

Interaction design & UX: Filip Ruisl, Gábor Pribék
Visual design: Filip Ruisl
Code: Gábor Pribék
Sound design, music: Filip Ruisl