Case Study | Animated Short Film
The process behind the work — Back to the Moon

Back to the Moon

Client | Google Doodle, Google Spotlight Stories, Google Arts & Culture, Cinémathèque Française and Nexus Studios

Overview

Back to the Moon is the first-ever Virtual Reality (VR), Augmented Reality (AR), and 360-degree interactive Google Doodle. It celebrates the life and artistry of late 19th century French illusionist and film director Georges Méliès. The Emmy-nominated film was a collaboration between Google Doodle, Google Spotlight Stories, Google Arts & Culture, Cinémathèque Française, Nexus Studios and Fonic.

The complexity that Georges Méliès achieved in those early years of filmmaking was incredible. Méliès pioneered numerous technical and narrative film techniques, primarily in the use of special effects and the creation of some of the earliest films in the science fiction genre. ‘Back to the Moon’ pays tribute to these techniques by bringing them to life throughout the film.

Back to the Moon was to be showcased worldwide on the Google homepage for 48 hours, so Google, Nexus and Fonic undertook an ambitious project, which was delivered across multiple platforms:

  • VR Google Spotlight Stories (GSS) (2018)
    6 DoF Virtual Reality Interactive Experience
    Non-Linear Timeline
  • VR YouTube360 (2018)
    First-Order Ambisonics B-format
    Linear Timeline
  • Theatrical Film 5.1 & Stereo 2.0 (2018)
    Traditional Linear Film
  • AR Google Spotlight Stories (2020)
    Augmented Reality Linear Film 

The project followed two different timeline structures: the traditional linear film, and a non-linear interactive experience. The VR YouTube360, Cinematic Film and AR Google Spotlight Stories all followed a strict linear timeline. VR Google Spotlight Stories has a non-linear timeline structure where the viewer becomes the camera operator, editor and director of the film. This article focuses on the audio journey Fonic undertook for the various platforms, from early planning through to the final delivery.

YouTube360 & First-Order Ambisonics B-format

Our focus first turned to the ambisonic VR YouTube360 version with its strict linear timeline. Back in 2018, YouTube only supported the First-Order Ambisonics B-format. The B-format standard has two conventions: AmbiX and the Furse-Malham standard (FuMa). The 4-channel content is identical across both standards but differs in the sequence in which the four channels are arranged: AmbiX, for example, orders them WYZX instead of FuMa's WXYZ. Each of the four channels represents a different directionality in the 360-degree sphere: omnidirectional pressure (W), front-back (X), left-right (Y) and up-down (Z), as seen in this diagram. At this point in time, YouTube360 only accepted the AmbiX standard and no additional channels.
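
To make the two conventions concrete, here is a minimal Python sketch of the first-order conversion. Besides the WXYZ-to-WYZX reorder described above, FuMa also attenuates W by 3 dB relative to AmbiX's SN3D normalisation, so W is scaled back up on conversion. This is a generic illustration, not part of the production toolchain.

```python
import numpy as np

def fuma_to_ambix(b_format: np.ndarray) -> np.ndarray:
    """Convert first-order B-format from FuMa (W, X, Y, Z) to AmbiX (W, Y, Z, X).

    `b_format` is an (n_samples, 4) array. In addition to the channel
    reorder, FuMa's W sits 3 dB below AmbiX's SN3D convention, so W is
    scaled by sqrt(2) on the way out.
    """
    w, x, y, z = b_format.T
    return np.stack([w * np.sqrt(2.0), y, z, x], axis=1)
```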

Planning

The project planning and architecture of the Pro Tools session was crucial. Keeping the various audio formats consistent across the many versions was a challenge, but our detailed planning achieved precise and seamless audio experiences. We integrated two different VR toolsets within Pro Tools: Facebook's 'FB360 Spatial Workstation' and AudioEase's '360pan Suite'. After meticulous trials, we felt that the 360pan Suite offered a broader array of features than Facebook's offering. One feature bundled within the 360pan Suite that turned our heads was the powerful 360reverb, an ambisonic convolution reverb. It accepts ambisonic input, delivers ambisonic output, and comes with over 50 impulse responses recorded with ambisonic microphones. In addition, the suite offers head tracking (via the Bluetooth Head-Tracker HC-06, purchased separately), which can steer the 360monitor plugin. That plugin decodes the 4-channel First-Order Ambisonics mix into 2-channel binaural for playback over headphones. This enabled us to monitor our ambisonic mix directly within Pro Tools, giving us a sense of how the mix would translate for the end user.
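
As an illustration of what head tracking drives under the hood, the sketch below rotates a first-order AmbiX sound field to counter a listener's head yaw. An actual binaural decoder such as 360monitor would then apply HRTF filtering, which is omitted here.

```python
import numpy as np

def rotate_foa_yaw(ambix: np.ndarray, yaw_deg: float) -> np.ndarray:
    """Rotate an AmbiX (W, Y, Z, X) sound field against a listener head yaw,
    as driven by a head tracker. W and Z are invariant under yaw; the
    horizontal dipoles X and Y rotate against the head movement.
    """
    a = np.radians(yaw_deg)
    w, y, z, x = ambix.T
    x_r = x * np.cos(a) + y * np.sin(a)
    y_r = y * np.cos(a) - x * np.sin(a)
    return np.stack([w, y_r, z, x_r], axis=1)
```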

Clip Groups

We had early production meetings with Google and Nexus to discuss the filmic intentions and overall feel of the project. We began by spotting the session within Pro Tools, using Clip Groups to identify the areas for our sound design. The visual nature of Clip Groups enabled us to pinpoint and organise our SFX throughout the whole project, and they proved a vital reference when delivering all the individual SFX for the VR Google Spotlight Stories.

Objects 

We started by breaking the story down into Hero/Main Characters, Main Action SFX and Atmospheres. Each of these elements was assigned its own object aux track: one for each character, including Méliès, the Queen, the Musicians and Evil Méliès. Each character was built up from dialogue, foley and SFX tracks, enabling us to cluster their specific sounds within a single object.

Tracking Objects

The placement of each object within the virtual space was an arduous but very important task. Using the AudioEase 360pan Suite, the 360pan plugin was inserted onto each object aux track. The plugin creates a puck icon inside the Pro Tools video window, and its location corresponds to the object's ambisonic placement. Automation enables the puck to glide seamlessly throughout the immersive arena. We discovered that writing the automation (using the puck icon) at half-speed playback (Pro Tools shortcut: Shift+Spacebar) enabled us to track the objects more accurately. This video illustrates the 360pan plugin (top right corner) working within the Pro Tools session, focusing purely on Hero/Main Character 1 (C1 Méliès Main). The video displays the character's SFX track-lay, which the blue puck icon tracks across the lower right video window.
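
Conceptually, the puck's position automation is an ambisonic panning law. The hypothetical helper below encodes a mono SFX into first-order AmbiX from per-sample azimuth/elevation automation; the 360pan plugin's actual internals are not documented here.

```python
import numpy as np

def encode_mono_foa(signal: np.ndarray, azimuth_deg: np.ndarray,
                    elevation_deg: np.ndarray) -> np.ndarray:
    """Pan a mono signal into first-order AmbiX (SN3D) using per-sample
    azimuth/elevation automation, conceptually what the puck writes.

    All three inputs are length-n arrays; azimuth 0 is straight ahead
    (positive to the left), elevation +90 is straight up.
    """
    az = np.radians(azimuth_deg)
    el = np.radians(elevation_deg)
    w = signal                              # omnidirectional
    x = signal * np.cos(az) * np.cos(el)    # front-back
    y = signal * np.sin(az) * np.cos(el)    # left-right
    z = signal * np.sin(el)                 # up-down
    return np.stack([w, y, z, x], axis=1)   # AmbiX channel order: W, Y, Z, X
```

Automating elevation from +90 degrees downwards with this kind of encoder is, in essence, the ceiling-to-floor drainage move described in the next section.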

Focused SFX

One area in which the sound design enhanced the immersive feel was the underwater sequence, as this virtual platform gave us a unique ability to push and magnify the three-dimensional arena. We submerged the listener by treating the music and SFX together: we carefully constructed the underwater sound design elements, applied a low-pass filter across the music score and added a touch of 360reverb to gel them together. To emphasise the water drainage transition, we used the puck to track the draining animation from the ambisonic ceiling to the floor, which felt incredibly immersive. This video displays the Main Action SFX puck icons across the underwater sequence. The 'SFX 1 Ob…' blue puck icon tracks the immersion of the fishbowl into the underwater scene; the same puck is then used to track the water drainage transition towards the end of the sequence. Throughout the video, the 'TRAIN P…' red puck icon can also be seen tracking the toy train's journey around the stage.
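
As a rough stand-in for the 'submerged' music treatment, the sketch below applies a low-pass filter to a music stem. The 800 Hz cutoff and filter order are illustrative assumptions, not the production settings.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def underwater_treatment(music: np.ndarray, sr: int = 48000,
                         cutoff_hz: float = 800.0) -> np.ndarray:
    """Low-pass a music stem to 'submerge' it underwater.

    `music` is an (n_samples, n_channels) array; zero-phase filtering
    keeps the stem time-aligned with the untreated SFX around it.
    """
    sos = butter(4, cutoff_hz, btype="lowpass", fs=sr, output="sos")
    return sosfiltfilt(sos, music, axis=0)
```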


Mixing the Action

We completed a comprehensive SFX track-lay for Back to the Moon. In this instance, more was better than less: the Pro Tools session covered many layers, tracking each element in detail throughout the film. This gave us the flexibility to deliver a rich and immersive sound bed for both the YouTube360 and Google Spotlight Stories versions.

Our focus then turned to the YouTube360 VR mix. With its linear timeline, this was a tricky one to get right, as we needed to guide the listener through Méliès's journey. Visually, the VR world was incredibly detailed, so we couldn't reveal every sound aspect in this delivery. We broke the story stage down into three zones, shown in this early production sketch from Nexus:

1. Main Action   2. Secondary Action   3. Back Action  

The Main Action zone follows the story and Méliès's journey throughout the film. The Secondary Action focuses on the outer ring, and the Back Action is everything behind the viewer's front focus. Our first mix pass was constructed from all the Main Action elements. With each pass thereafter we carefully introduced the Secondary Action elements, being mindful not to distract the listener or muddy the mix. The toy train is a great example: it continuously tracks around the stage, but we only hear it once it enters the Main Action zone, as sketched below. All Back Action SFX were disregarded at this point in production.
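
A hypothetical sketch of that zone-based gating: a secondary object such as the train is faded in only while its azimuth sits inside an assumed Main Action zone. The zone width and fade values are illustrative, not Nexus's actual boundaries.

```python
import numpy as np

def main_action_gain(azimuth_deg: np.ndarray, zone_width_deg: float = 120.0,
                     fade_deg: float = 20.0) -> np.ndarray:
    """Per-sample gain for a secondary-action object (e.g. the toy train):
    full level inside a Main Action zone centred at 0 degrees, silent
    outside, with a short linear fade across the boundary.
    """
    # Wrap azimuth to [-180, 180) and measure distance beyond the zone edge.
    az = (azimuth_deg + 180.0) % 360.0 - 180.0
    edge = zone_width_deg / 2.0
    outside = np.abs(az) - edge            # negative while inside the zone
    return np.clip(1.0 - outside / fade_deg, 0.0, 1.0)
```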

Google Spotlight Story Track-Lay

Production then turned to assembling the Google Spotlight Story (GSS). Nexus developed an ambitious and complex structure for Back to the Moon: the GSS was based on a non-linear timeline, allowing viewers to control and direct their own journey. Break points were inserted throughout the film, enabling them to pause the main story. Once the viewer stopped following the Main Action zone, these break points were triggered and looped. Nexus delivered each break point as a separate .mov file, which allowed us to create new sound design for these sections.
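
We don't know the Story Editor's internal looping mechanics, but the generic sketch below shows how a break-point clip can be repeated with an equal-power crossfade at each join, avoiding clicks while the viewer lingers.

```python
import numpy as np

def loop_with_crossfade(clip: np.ndarray, n_loops: int,
                        xfade: int = 4800) -> np.ndarray:
    """Repeat a break-point clip with an equal-power crossfade at each join.

    `clip` is an (n_samples, n_channels) array longer than `xfade`;
    4800 samples is 100 ms at 48 kHz.
    """
    t = np.linspace(0.0, np.pi / 2.0, xfade)
    fade_in, fade_out = np.sin(t), np.cos(t)
    out = clip.copy()
    for _ in range(n_loops - 1):
        head, tail = out[:-xfade], out[-xfade:]
        join = tail * fade_out[:, None] + clip[:xfade] * fade_in[:, None]
        out = np.concatenate([head, join, clip[xfade:]])
    return out
```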

The music was carefully composed to adapt to these changeable break points, enabling seamless ins and outs throughout the score. These two videos (created by Mark Davies, Project Lead at Nexus) illustrate the complexity of two such instances: the first shows the linear timeline, and the second demonstrates the long break point. Mark offered some additional context:

“There’s a complex one at the beginning where there is actually an alternate section of score that would allow the music to continue if a viewer chose that moment to look around. This was all done in musical time within the Google Spotlight software so we didn’t drop a beat.”

In addition to these break sections, the GSS introduced more complex Ambient-Set SFX. These SFX are triggered once the viewer visually interacts with specific objects; the jack-in-the-box, for example, could be triggered at any point within the story. Two versions were sound designed to cover the changeable ambience for both the room and underwater sequences.

Music Session & Sennheiser AMBEO 3D VR Microphone

Mathieu Alvado composed an elegant score for Back to the Moon, which was recorded at the legendary AIR Studios London by the London Symphony Orchestra. The session was engineered by the highly acclaimed Geoff Foster, who did an amazing job and welcomed us to experiment with the Sennheiser AMBEO 3D VR microphone. The pseudo-alien-looking microphone was placed just above the conductor's head, capturing the incredible and dynamic Lyndhurst Hall live room. We later processed the recordings using Sennheiser's dedicated decoder plugin, the AMBEO A-B converter, which converts the raw A-format recordings into B-format (FuMa or AmbiX). These recordings added spatial depth and sonic width to the orchestral bed. Unfortunately, due to the musician layout inside Lyndhurst Hall, the 4-capsule Sennheiser microphone ultimately wasn't a viable option for this production.
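
For context on what the A-B conversion involves, the sketch below shows the classic tetrahedral matrixing from four A-format capsule signals to FuMa B-format. Sennheiser's actual plugin also applies per-capsule correction filtering, which is omitted here, as is any output normalisation.

```python
import numpy as np

def a_to_b_format(flu: np.ndarray, frd: np.ndarray,
                  bld: np.ndarray, bru: np.ndarray) -> np.ndarray:
    """Basic tetrahedral A-format to B-format (FuMa W, X, Y, Z) matrixing
    for four capsules: front-left-up, front-right-down, back-left-down,
    back-right-up.
    """
    w = flu + frd + bld + bru
    x = flu + frd - bld - bru   # front-back
    y = flu - frd + bld - bru   # left-right
    z = flu - frd - bld + bru   # up-down
    return np.stack([w, x, y, z], axis=-1)
```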

Music Score

The music score for the YouTube360 version was delivered in stereo. Unfortunately, the stereo score proved a challenge to spatialise accurately within ambisonics, and each attempt at upmixing it fell short. Our first approach used 'position blur' within the AudioEase 360pan plugin. This fixed the slight dips and dropouts heard when the score was tracked around the full spherical arena, but introduced phasing and imaging issues within the score. Our next approach was Nugen Halo Upmix (with the 3D extension). Overall this improved the frequency balance and imaging, but still wasn't to the liking of the composer or the Google and Nexus teams.
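
To see why a stereo score resists ambisonic spatialisation, consider the naive upmix below, which simply encodes the left and right channels as point sources at plus and minus 30 degrees. Correlated content between the two channels then produces the phasey, unstable imaging described above. The angle and method are illustrative, not what either plugin does internally.

```python
import numpy as np

def stereo_to_foa(left: np.ndarray, right: np.ndarray,
                  spread_deg: float = 30.0) -> np.ndarray:
    """Naive stereo-to-AmbiX upmix: L and R as point sources at +/-30 deg.

    Both inputs are length-n arrays; output is (n, 4) in AmbiX order.
    """
    az = np.radians(spread_deg)
    w = left + right
    x = (left + right) * np.cos(az)     # both sources share cos(az)
    y = (left - right) * np.sin(az)     # left positive, right negative
    z = np.zeros_like(w)
    return np.stack([w, y, z, x], axis=1)
```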

As already mentioned, this was the limitation of YouTube360 in early 2018, and the crossroads at which we regrettably had to drop the First-Order Ambisonics: YouTube supported the four ambisonic channels and nothing else. YouTube has since added support for First-Order Ambisonics with an additional head-locked stereo pair; a head-locked stereo track would have resolved these music score issues on Back to the Moon.

At this point in the YouTube360 project we had completed 95% of the sound design and VR mixing. Although our delivery had changed from First-Order Ambisonics to stereo, we kept the ambisonic Pro Tools session structure in place, partly for the deliveries of the individual objects, but also to monitor the ambisonic translations between Fonic and Nexus for the Google Spotlight Stories.

6DoF Delivery 

Once the YouTube360 was signed off and delivered, our focus turned to the Six Degrees of Freedom (6DoF) Google Spotlight Stories. The 6DoF virtual reality interactive experience allows the listener to move in three rotational and three translational degrees of freedom. Google Spotlight Stories had its own bespoke 'Story Editor' SDK, which handled the six degrees of freedom. Mark Davies headed up the meticulous programming, implementation and compiling of the whole project.

Our Back to the Moon Pro Tools session was set up with 24-bit 48 kHz WAV files, so we kept this optimum format when delivering to Mark. The final size of the Google Spotlight Stories VR package 'on device build' was 304 MB, of which 162 MB was dedicated to sound, so audio compression was vital. The 'Story Editor' SDK used .ogg audio compression, which squeezed a 4 MB WAV file down to around 200 KB.
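
The Story Editor's actual encoder settings aren't documented here, but as a rough stand-in, batch Ogg Vorbis encoding via ffmpeg at a mid quality setting lands in the same ballpark as the roughly 20:1 ratio implied by 4 MB down to 200 KB:

```python
import subprocess
from pathlib import Path

def compress_delivery(wav_dir: str, ogg_dir: str, quality: int = 3) -> None:
    """Batch-compress WAV deliverables to Ogg Vorbis using ffmpeg's
    libvorbis encoder (-q:a sets variable-bitrate quality)."""
    Path(ogg_dir).mkdir(parents=True, exist_ok=True)
    for wav in sorted(Path(wav_dir).glob("*.wav")):
        ogg = Path(ogg_dir) / (wav.stem + ".ogg")
        subprocess.run(
            ["ffmpeg", "-y", "-i", str(wav),
             "-c:a", "libvorbis", "-q:a", str(quality), str(ogg)],
            check=True,
        )
```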

We compiled and delivered just shy of 200 individual audio clips. Each clip had its own time stamp corresponding to its place on the timeline. To allocate these time stamps, a continuous frame counter was burnt into each production video file, with the entire film running from 0 to 3,901 frames. Due to the multi-platform delivery across linear and non-linear timelines, the use of timecode wasn't a viable option for this production.
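
The frame-to-time mapping itself is trivial. The film's frame rate isn't stated above, so the 24 fps default below is an assumption; at that rate the full 0 to 3,901 frame range runs to roughly 162.5 seconds.

```python
def frame_to_seconds(frame: int, fps: float = 24.0) -> float:
    """Map a burnt-in frame count to a timeline position in seconds."""
    return frame / fps
```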

The majority of the audio clips were mono, but we also delivered a handful of First-Order Ambisonics files, which the 'Story Editor' SDK ingested in B-format FuMa order. These ambisonic files were assembled from the large, complex SFX groups in our Pro Tools session; consolidating them enabled a direct capture, which helped cut down the processing and programming in the 'Story Editor' SDK. The ambisonic files were: fishbowl atmospherics, underwater bubbles, water draining effect, underwater music, birds during the dance, the falling book (right before the fight sequence), throne fire, and the moon impact hit.

A shared Google Sheets document created a seamless workflow between Nexus and Fonic for these deliverables. Mark had the arduous task of implementing and assembling each individual audio file, mapping them to their objects and roles inside the Google Spotlight Stories. A few updates were needed throughout this process, but overall the ambisonic translations from Pro Tools to the 'Story Editor' SDK were virtually identical.

5.1 & 2.0 Stereo Version

The theatrical film version is a traditional cut by the director FX Goby. The film was edited within the 'Story Editor' SDK, which gave FX the flexibility to capture any angle or viewpoint within the three-dimensional stage. For this traditional linear film we stepped away from our ambisonic Pro Tools session, as the ambisonic routing was incompatible; we extracted and reassembled it into a surround sound session, stripping out any SFX that didn't relate to the theatrical cut. The theatrical release allowed for a larger dynamic range, letting us enhance and bolster some of the larger SFX, and with the addition of the LFE channel, sub sweeteners were used to add depth to the film. Once the panning was reworked into the surround sound field, we completed and signed off the 5.1 mix and created a 2.0 stereo fold-down, as sketched below.
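
Fonic's exact fold-down coefficients aren't specified, but a common ITU-style 5.1-to-2.0 fold-down looks like the sketch below, with the centre and surrounds mixed in at minus 3 dB and the LFE discarded:

```python
import numpy as np

def fold_down_51_to_stereo(l, r, c, lfe, ls, rs):
    """ITU-style 5.1 to 2.0 fold-down. All inputs are length-n arrays;
    the LFE is accepted but, as is common practice, discarded."""
    g = 10 ** (-3.0 / 20.0)            # -3 dB, roughly 0.707
    left = l + g * c + g * ls
    right = r + g * c + g * rs
    return np.stack([left, right], axis=-1)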

On 2nd May 2018, Back to the Moon premiered on the Google homepage for 48 hours and was viewed by billions of people around the world. This groundbreaking production featured across various awards ceremonies, picking up a nomination at the Primetime Emmy Awards for Outstanding Original Interactive Program, and winning a Clio Award for Best Animation and the Lovie Award for Best Internet Video – Animation.

Through this project, Fonic broke new ground in multiple-format audio production. The seamless integration between Avid Pro Tools and the AudioEase 360pan Suite enabled us to realise our creative ambitions for the audio of Back to the Moon. While we were unable to offer the end listener our fully fledged YouTube360 ambisonics mix, we look forward to utilising the additional 2-channel head-locked stereo alongside the 4-channel First-Order Ambisonics on future projects. Finally, our dynamic partnership with Google and Nexus helped us deliver a high-quality production across these complex platforms, one that serves as a fitting tribute to the pioneering filmmaker Georges Méliès.

 

Credits

Producer

Mariano Melman Carrara

Project Leads

Mark Davies
Dave Hunt

Audio Post Production

Fonic

Dubbing and VR Mixer

JM Finch

Sound Design

Barnaby Templer
JM Finch

Foley Edit

Chris Swaine

Foley Artist

Sue Harding

Sound Assistant

Rory Hunter

Music Composed by

Mathieu Alvado

Performed by

The London Symphony Orchestra

Recorded by

Geoff Foster at Air Studios

Client Testimonial

In the end Fonic created an incredible tapestry of sound that worked equally well on all versions of the film, and brought our story to life in a way that top quality audio design can do. It was a pleasure to collaborate with the team at Fonic who embraced this project with all its complexities. I look forward to the next opportunity to collaborate with them!
Mark Davies, Project Lead

Client Testimonial

I am extremely proud of the result, because I think the sound is definitely a huge part of the experience. It is what truly anchors the world we have created and makes it “real”. It adds to the whole sensory journey thanks to refined work and attention to detail. However you experience this unique film, as a video, a VR or an AR experience, the sound has been thought through and crafted to make it truly charming and magical.
FX Goby, Director