Developing VR Content

We have developed a tried & tested, methods-based & iterative approach to constructing narrative journeys over the life cycle of hundreds of diverse moving-image projects, each one having a key requirement to convey a message, an idea, or both.

We'd sat at our desks in front of screens for years, so we were keen to explore the new, intuitive ways of working that VR offers. Spatial computing took on a new meaning when we decided to craft PROMENADE with our hands.

To get away from our desks, the mouse & the Wacom pen, we dived head first into Gravity Sketch on the Quest platform to begin our journey. Gravity Sketch is used by product designers to produce prototypes, and we saw an opportunity to use it for blocking out & spatially testing environment design. We used VR controllers as multi-functional 3D tools; it was quite a sensation for the team to extrude objects with volume from virtual thin air.

New Ways of Working

After scanning hundreds of sketchbook images & having Mike's large-format lino prints meticulously photographed at high resolution, we jumped into a shared (remote) studio space with our headsets on and controllers at the ready, to trace out Mike's artwork using 3D strokes.

Being together in a room when we were sometimes hundreds of kilometres apart was a novel & fulfilling experience. We built the environments through a process similar to constructing theatrical sets, except that we were already on stage, able to create & refine elements in real time.

We built at 1:1 scale, using a life-size artist's mannequin for reference. This allowed us to fine-tune the overall scale of the audience experience, something that is challenging to replicate in 2D on screen.

As a team, we could scale & move objects around together and discuss which elements made for a convincing 'place to be'.

Once all of the environment elements were built, they were exported as OBJs & then optimised in Blender to make sure we got the most visual detail out of the headset while leaving headroom within the Unity scene limit. We brought the models, animation, archive video projections, spatial audio, music & dialogue into Unity for programming.
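
We won't reproduce our exact clean-up pass here, but as a rough, hedged sketch of one common approach (not necessarily the one we used), Blender's Python API (bpy) can batch-import the OBJs, apply a Decimate modifier to reduce polygon counts, and re-export for Unity. The file paths, ratio and importer operator below are illustrative assumptions.

```python
# Illustrative Blender (bpy) sketch: batch-import OBJ exports, reduce
# polygon counts with a Decimate modifier, then re-export for Unity.
# Paths and the decimation ratio are placeholders, not production values.
# Note: the OBJ importer is bpy.ops.wm.obj_import in Blender 4.x;
# older versions use bpy.ops.import_scene.obj instead.
import glob
import bpy

DECIMATE_RATIO = 0.5  # keep roughly half of the original faces; tune per asset

for path in glob.glob("/exports/gravity_sketch/*.obj"):
    bpy.ops.wm.obj_import(filepath=path)
    for obj in bpy.context.selected_objects:
        if obj.type != 'MESH':
            continue
        mod = obj.modifiers.new(name="Decimate", type='DECIMATE')
        mod.ratio = DECIMATE_RATIO
        bpy.context.view_layer.objects.active = obj
        bpy.ops.object.modifier_apply(modifier=mod.name)

# Export the whole optimised scene as a single FBX for the Unity project.
bpy.ops.export_scene.fbx(filepath="/exports/optimised/promenade_env.fbx")
```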

Experimenting with phasing the environment's model elements in over several stages helped buffer the content in without overwhelming the audience or causing excessive loading times.

Adding various interactive mechanics helped drive the narrative, giving the audience a sense of agency. Gaze initiation was used as a subtle way to drive the story & deliver information.
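
Our trigger logic lives in the Unity project, so the snippet below is only an engine-agnostic sketch of the underlying idea: gaze initiation as a dwell timer fed by a per-frame gaze raycast. The GazeTrigger class, the two-second dwell and the stand-in frame loop are illustrative assumptions, not production code.

```python
# Engine-agnostic sketch of gaze initiation as a dwell timer.
class GazeTrigger:
    def __init__(self, on_trigger, dwell_seconds=2.0):
        self.on_trigger = on_trigger        # callback fired once dwell completes
        self.dwell_seconds = dwell_seconds
        self.elapsed = 0.0
        self.fired = False

    def update(self, gaze_on_target: bool, dt: float):
        """Call once per frame with the engine's gaze-raycast result."""
        if self.fired:
            return
        if gaze_on_target:
            self.elapsed += dt
            if self.elapsed >= self.dwell_seconds:
                self.fired = True
                self.on_trigger()
        else:
            self.elapsed = 0.0  # gaze broke away: reset the dwell timer


# Example: start a narration clip once the viewer has looked at a print
# for two continuous seconds (stand-in values, not production content).
trigger = GazeTrigger(lambda: print("play narration clip"))
for frame in range(300):
    looking_at_print = 100 <= frame < 250   # stand-in for the real raycast
    trigger.update(looking_at_print, dt=1 / 60)
```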

Touch was used to add a sense of embodiment & agency. AI was used to rapidly colourise B&W archive photography, helping to create a sense of time & place in the context of the narrative.
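
We won't single out a specific colourisation pipeline here, but as one illustrative possibility (an assumption, not necessarily the tool we used), the publicly available Zhang et al. model can be run through OpenCV's DNN module to colourise a black & white archive photo:

```python
import cv2
import numpy as np

# Load the pretrained colourisation network via OpenCV's DNN module.
# The three model files come from the public colorization repo; the
# paths used here are placeholders.
net = cv2.dnn.readNetFromCaffe("colorization_deploy_v2.prototxt",
                               "colorization_release_v2.caffemodel")
pts = np.load("pts_in_hull.npy").transpose().reshape(2, 313, 1, 1)
net.getLayer(net.getLayerId("class8_ab")).blobs = [pts.astype(np.float32)]
net.getLayer(net.getLayerId("conv8_313_rh")).blobs = [
    np.full([1, 313], 2.606, dtype=np.float32)]

# Read a black & white archive photo and extract its lightness (L) channel.
img = cv2.imread("archive_bw.jpg")
lab = cv2.cvtColor(img.astype(np.float32) / 255.0, cv2.COLOR_BGR2LAB)
L = cv2.resize(lab[:, :, 0], (224, 224)) - 50  # network expects mean-centred L

# Predict the a/b colour channels and resize them back to the photo's size.
net.setInput(cv2.dnn.blobFromImage(L))
ab = net.forward()[0].transpose((1, 2, 0))
ab = cv2.resize(ab, (img.shape[1], img.shape[0]))

# Recombine predicted colour with the original lightness and save the result.
colourised = cv2.cvtColor(
    np.concatenate((lab[:, :, 0:1], ab), axis=2), cv2.COLOR_LAB2BGR)
cv2.imwrite("archive_colour.jpg",
            (255 * np.clip(colourised, 0, 1)).astype("uint8"))
```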



Headset Gravity Sketch view, scene setup

One of Mike's scanned sketchbook pages