In March 2018 a multidisciplinary team, formed by staff from Designworks and Mixt studio as well as students from Victoria University of Wellington and Massey University, gathered to create a VR experience based on the “Value of Design to New Zealand” research document. The experience would be powered by Microsoft Windows Mixed Reality headsets and showcased at Creative Realities, New Zealand's biggest tech event.
The research was launched in July 2017 by Hon. Steven Joyce, Minister of Finance, undertaken by PwC and commissioned by DesignCo. It provided remarkable insights; for example, design contributes $10.1 billion to the New Zealand economy, approximately 4.2% of New Zealand's GDP.
Design is dynamic and broad, extending to all industries and occupations. Our VR experience aimed to show the huge impact it has on New Zealand's growth.

Design Stacks Up at Creative Realities

Project goals

• To explore VR as an educational tool, one that offers new and exciting ways to learn. This goal leads to the concept of edutainment, which refers to media designed to educate through entertainment.
• To be a learning process for all parties involved, in which established research could be visualised in VR using clever experience design and storytelling.
• To generate high engagement and feedback from users.
• To encourage further investment into VR educational programs in New Zealand; and investment into design from New Zealand companies.
• To produce a great case study showing the power of edutainment.

DSU team

Gameplay

After many meetings the experience started to take shape. The gameplay is divided into three scenes, framed by an introduction and a conclusion. Here is an overview.
The experience starts with an animated introduction to design in New Zealand. After the introduction finishes, multiple blocks (representing the GDP contribution of design-related activity to each industry) fall and stack around the user. When the blocks finish falling, the magnitude of each data set becomes visible, and the user can get this information by gazing at each data column.
The user is then presented with a slider representing investment in design. As the user moves the slider, the data columns increase in height and the GDP figures change accordingly.
The second scene deals with the case study of Good Nature, an amazing New Zealand pest control company. The user is presented with an interactable Good Nature trap that can be picked up and thrown. Every time the trap hits the ground, trees grow around the user and facts about the company are presented. As the interaction continues the space becomes greener, showing the impact Good Nature has on the environment.
The third scene is about All Birds, an extremely successful New Zealand shoe manufacturer. The scene starts by presenting the user with an All Birds shoe. As the user interacts with the shoe, they ascend towards the clouds while facts about the company and hundreds of shoes float around them.
The experience finishes with the user looking down from the sky at a map of New Zealand. Finally, an animated conclusion about the impact of design is presented and the credits roll.
Software development 

The journey starts
The journey was exciting, full of learning, and challenging from the first day. My professional background is in visual effects, video post-production and general CGI tasks; however, this was my first approach to virtual reality and game development as an engineer rather than an artist. For the first weeks I had to familiarise myself with VR technologies and workflows, and learn tools such as Unity and .NET.
The first couple of weeks were a bit overwhelming; I often felt discouraged and frustrated. I felt like I was not making any progress and that learning everything I needed within the timeframe was impossible. I suffered from impostor syndrome, the feeling of not being qualified for the job.
Around the start of my third week I realised that VR development is challenging for everyone, that industry professionals must experience similar feelings, and that I was not alone. So I decided to stop worrying and focus on learning at my own pace. I picked up a couple of Pluralsight tutorials about developing VR applications for Google Cardboard. The tutorials were good, and after completing them I was able to deploy my first VR app on my phone.
Mixt Studio provided a supportive environment. I was given complete freedom to experiment with Unity and VR. The staff gave me tips and guidance when I was stuck, yet they also let me figure things out by myself.
I had to implement early concepts, experimenting with lighting effects or visualising the data as points floating in space. I presented and justified these concepts to the Design Stacks Up team in general meetings. Communicating complex ideas openly and receiving feedback from a range of professionals was the most enriching part of the internship, and one of my main learning goals.
Some starting challenges
• Managing uncertainty and risk; requirements could change at short notice.
• Overall development time was short; the WMR headsets arrived five weeks before showcasing.
• Only two developers with limited VR development experience.

DSU team at early meetings

Technologies 

Unity API
Unity is a multipurpose, cross-platform game engine that supports 2D and 3D graphics and scripting in C#. A game engine is a software development environment designed to build video games; it can include a rendering engine for 2D or 3D graphics, a physics engine, sound, scripting, animation, artificial intelligence, networking, video, etc.
Independent game developers often use Unity because it is free, easy to use and compatible with many platforms. Also, most VR SDKs are developed for Unity and ship with sample scenes.
The Unity API is extensive, but I will briefly explain the most critical concepts; a short sketch after this list illustrates several of them together.
• GameObject: Every object in the application is a GameObject; this includes lights, cameras, effects, characters, sounds, etc. It is the base class for all entities in Unity scenes. A GameObject is made of components, which can hold scripts, materials, physics, etc.
• MonoBehaviour: The base class from which every Unity script derives. It provides event functions such as Start, Update and OnDestroy.
• Time: A class that exposes time information from Unity. This is handy for many things; in this project we used it for animation, for example.
• Vectors: A structure used to pass 3D positions around. It provides vector maths operations.
• Transforming objects: There are two ways to move, rotate or scale objects. One is to give the GameObject a Rigidbody component and apply forces to it; the physics engine then calculates its transformation over time (for example, the object can react to gravity). The other is to modify the GameObject's Transform component, which holds its position, rotation and scale, directly at runtime.
• Accessing components: To access a component we use the GetComponent method, for example to change a material or disable a script.
• Collisions: These can be used to trigger events, for example displaying a guideline when the user touches a specific object in the game. Collisions provide callbacks such as OnCollisionEnter and OnTriggerExit.
• Instantiate: This allows cloning objects. It is often used with prefabs, which are templates that store a GameObject with its components and properties. When you change a prefab, all its clones change, which makes this workflow very convenient.
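To make these concepts concrete, here is a minimal MonoBehaviour sketch bringing several of them together (the class, fields and behaviour are hypothetical illustrations, not code from the DSU project):

```csharp
using UnityEngine;

// Hypothetical example: a data column that changes colour at startup,
// rotates every frame, and spawns a marker when something hits it.
public class DataColumn : MonoBehaviour
{
    public GameObject markerPrefab;      // a prefab assigned in the Inspector
    public float degreesPerSecond = 45f;

    void Start()
    {
        // Access another component on this GameObject via GetComponent.
        var rend = GetComponent<Renderer>();
        rend.material.color = Color.cyan;
    }

    void Update()
    {
        // Time.deltaTime makes the rotation frame-rate independent.
        transform.Rotate(Vector3.up, degreesPerSecond * Time.deltaTime);
    }

    void OnCollisionEnter(Collision collision)
    {
        // Instantiate clones the prefab at the first contact point.
        Instantiate(markerPrefab, collision.contacts[0].point, Quaternion.identity);
    }
}
```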

Unity API UML Diagram

Windows Mixed Reality (WMR)
WMR is a platform that provides holographic and mixed reality experiences with compatible head-mounted displays.
What is unique about the WMR headsets (compared to the Oculus Rift and HTC Vive) is that they feature inside-out tracking: the two cameras on the front detect the movement of the headset, so no external sensors are required.

Design student testing the experience on WMR

To develop for WMR we used the Mixed Reality Toolkit (MRTK).
Here are some highlights of developing with this toolkit:
• It allows handling various types of input and sending them to any game object currently being gazed at.
• It uses Unity’s default EventSystem.
• It can be easily extended.
• Each input source implements an IInputSource interface. The interface defines events that the input sources can trigger. The input sources rely on the InputManager, whose role is to forward input to the appropriate game objects.
• The interface we used the most was IFocusable, which triggers events when the user's gaze enters or exits an object (see the sketch below).
• We used sample scenes, which gave us a DefaultCursor, InputManager and a MixedRealityCamera.
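As an illustration, here is a minimal sketch of a gaze-reactive object using IFocusable. The interface and its two callbacks match the HoloToolkit-era MRTK we describe above; treat the exact namespace as an assumption, and the class itself is hypothetical:

```csharp
using UnityEngine;
using HoloToolkit.Unity.InputModule; // HoloToolkit-era MRTK namespace (assumption for this sketch)

// Highlights an object while the user's gaze is on it.
public class GazeHighlighter : MonoBehaviour, IFocusable
{
    private Renderer rend;
    private Color originalColor;

    void Awake()
    {
        rend = GetComponent<Renderer>();
        originalColor = rend.material.color;
    }

    // Called by the InputManager when the gaze cursor enters this object.
    public void OnFocusEnter()
    {
        rend.material.color = Color.yellow;
    }

    // Called by the InputManager when the gaze leaves this object.
    public void OnFocusExit()
    {
        rend.material.color = originalColor;
    }
}
```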

MRTK Input System Diagram 

Software practices

• The experience consisted of 34 classes written from scratch by Ryan Sumner and me. In addition to using the MRTK, we also used plenty of Unity packages for tasks such as playing video, creating text and lighting effects.
• We used “Manager” classes to control, for example, sounds, scene transitions and the spawning of objects. The Manager classes used the Singleton pattern so each class was instantiated as only one object, and we followed the Single Responsibility principle. For example, the ShoeSpawnerManager was instantiated once when the application first launched; it listened for user events and instantiated shoe game objects accordingly. The shoe game object had its own script attached, which controlled the shoe's properties and behaviours (see the sketches after this list).
• We also implemented an “object pool” for application performance. All the objects were instantiated when the app started, then disabled and placed in a Queue, ready to be used when requested by the Manager classes.
• Another common practice was using coroutines. Instead of running to completion before returning, a coroutine can pause execution and return control to Unity, continuing where it left off on the next frame. These special functions allowed us to run multiple actions simultaneously, or to wait a specific number of seconds before triggering an event.
• We also wrote a lot of event-based code using delegates, which allowed us to create complex and dynamic functionality. For example, we would only spawn the shoe when the user moved a slider to a certain position: delegates let us raise an event, subscribe to it and spawn the shoe accordingly, without any of the classes involved knowing about each other. This approach helps keep the code clean and extensible.
• Using GitHub was critical. We used it as our main project management and communication tool. The workflow consisted of raising issues, assigning a person to solve them, opening a pull request and merging into our main branch.
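Here is a simplified sketch of the singleton manager plus object pool combination described above. The ShoeSpawnerManager name comes from the write-up, but the fields, pool size and method names are my own illustrative assumptions; the real DSU code differs:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Simplified singleton manager owning an object pool of pre-instantiated shoes.
public class ShoeSpawnerManager : MonoBehaviour
{
    public static ShoeSpawnerManager Instance { get; private set; }

    public GameObject shoePrefab;
    public int poolSize = 50; // assumed size, for illustration only

    private readonly Queue<GameObject> pool = new Queue<GameObject>();

    void Awake()
    {
        // Singleton: keep exactly one instance alive.
        if (Instance != null && Instance != this) { Destroy(gameObject); return; }
        Instance = this;

        // Object pool: instantiate everything up front, then disable and queue.
        for (int i = 0; i < poolSize; i++)
        {
            var shoe = Instantiate(shoePrefab);
            shoe.SetActive(false);
            pool.Enqueue(shoe);
        }
    }

    public GameObject SpawnShoe(Vector3 position)
    {
        if (pool.Count == 0) return null; // pool exhausted
        var shoe = pool.Dequeue();
        shoe.transform.position = position;
        shoe.SetActive(true);
        return shoe;
    }

    public void ReturnShoe(GameObject shoe)
    {
        shoe.SetActive(false);
        pool.Enqueue(shoe);
    }
}
```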
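And a sketch of the delegate-based events and coroutines working together, reusing the ShoeSpawnerManager from the previous sketch. Again, the class names, threshold and delay are hypothetical examples of the pattern, not the project's actual values:

```csharp
using System;
using System.Collections;
using UnityEngine;

// Hypothetical slider that raises a C# event when it reaches a threshold.
public class InvestmentSlider : MonoBehaviour
{
    public static event Action OnThresholdReached;

    public float value;            // set by the interaction code
    public float threshold = 0.8f; // assumed trigger position

    private bool raised;

    void Update()
    {
        if (!raised && value >= threshold)
        {
            raised = true;
            // Subscribers react without this class knowing who they are.
            var handler = OnThresholdReached;
            if (handler != null) handler();
        }
    }
}

// Subscribes to the event and uses a coroutine to delay the spawn.
public class ShoeSpawnListener : MonoBehaviour
{
    void OnEnable()  { InvestmentSlider.OnThresholdReached += HandleThreshold; }
    void OnDisable() { InvestmentSlider.OnThresholdReached -= HandleThreshold; }

    void HandleThreshold()
    {
        StartCoroutine(SpawnAfterDelay(2f));
    }

    IEnumerator SpawnAfterDelay(float seconds)
    {
        // Control returns to Unity each frame; execution resumes after the wait.
        yield return new WaitForSeconds(seconds);
        ShoeSpawnerManager.Instance.SpawnShoe(transform.position);
    }
}
```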

Testing experience with Design students

Engineering process

The engineering process chosen was Rapid Application Development (RAD) for the following reasons:
• The project needed to be able to adapt promptly to unexpected changes.
• The development times were short.
• Our main concern was prototyping and doing plenty of iterations to show to designers and other collaborators.
• While writing the software we also did the planning required for developing the experience.
• We focused on gathering requirements from the designers. The designers used our prototypes to come up with new ideas, so we required continuous integration and rapid delivery.

Rapid Application Development diagram

Asset creation

The environments in Unity were created procedurally, while the Good Nature trap and the All Birds shoe were created using a range of 3D techniques.
The main challenge with the Good Nature trap was retopologising the object so it could be used in a VR environment. The model we originally received came from a CAD file with 11,106,333 vertices and a file size of 172 MB. The task was tricky, as the mesh had intricate hard surfaces with heaps of detail. To tackle the problem I used the beautiful Autodesk Maya 2018 Modeling Toolkit and techniques such as making the mesh a Live Surface. After a couple of days of work I had an exact copy of the trap with only 4,563 vertices and a size of 325 KB.

GN trap original and clean mesh

Regarding the All Birds shoe, the process was unique and exciting. The client never sent a 3D model of the shoe; instead they sent a real pair. Because of this I used a process known as photogrammetry, which consists of taking hundreds of high-resolution pictures of an object and using specialised software (Reality Capture, in this case) to recreate it in 3D space. It was an interesting process that gave amazing results; however, I again had to clean up the mesh and project the normal map onto the clean mesh, which was once more done in Maya.

All Birds mesh from Reality Capture and clean mesh

UX

Good UX was critical to the success of the experience.
• We had to make a friendly and intuitive experience.
• Most people are completely new to VR. Without proper guidelines users can easily get lost, not knowing what to do next and making the experience very frustrating.
• Every action needs user feedback; for example, a specific sound should play when the user is doing the right thing.
• Designing for VR requires new ways of thinking about space, dimension, immersion, interaction, and navigation.
• Depth cues must be used, for example monocular depth, motion parallax and curvilinear perspective.
• Studies have found that placing GUIs about 1 metre away from the user is the most comfortable (see the sketch after this list).
• The UX should let the user define the session duration.
• VR is still an uncharted medium. 
• UX is an ongoing challenge; its full potential can only be unlocked through continuous experimentation by artists and developers.
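As a small illustration of the 1-metre guideline above, here is a minimal "tag-along" sketch that keeps a world-space UI panel one metre in front of the user. The class name and behaviour are my own illustrative assumptions, not code from the DSU project:

```csharp
using UnityEngine;

// Keeps a world-space UI panel one metre in front of the camera, facing the user.
public class ComfortablePanel : MonoBehaviour
{
    public float distance = 1f; // ~1 m is commonly cited as comfortable for VR GUIs

    void LateUpdate()
    {
        Transform cam = Camera.main.transform;
        transform.position = cam.position + cam.forward * distance;
        // Point the panel's forward axis away from the camera so the UI reads correctly.
        transform.rotation = Quaternion.LookRotation(transform.position - cam.position);
    }
}
```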
Outcomes

“Design stacks up” was an amazing first approach into VR development.
I faced plenty of challenges throughout the journey, yet the practical nature of the course had already prepared me to do well in a real work environment. The classes on industry projects and software methodologies were particularly useful for me.
Technically, the internship was an intensive C# and game development course. My coding skills improved greatly, and I also discovered the areas where I need improvement.
What I found most enriching was collaborating with a range of professionals, on real scenarios and in a creative environment. I developed my communication skills: I often had to explain complex ideas in meetings, receive and apply feedback, or have technical discussions with fellow developers. Furthermore, I learned to be less stressed and to manage uncertainty, adapting promptly to a changing environment.
The project was showcased successfully at the Creative Realities event, and the audience feedback was positive, with amazing reactions from the public.
I am looking forward to my next steps in VR development. I am particularly interested in the creation of interactive video, and this internship was an amazing foundation on which to continue my learning journey.
The DSU team

Gemma Hoskins — Project Manager
Designworks
Kate Mathews — Account Director
Nick Hughes — Technical Director
Tasmin Fraser — Designer
Massey University
Anna Brown — Director, Partnership and Projects, College of Creative Arts
Brian Lucid — Head of School of Design, College of Creative Arts
James Weeks — Experience designer, School of Design
Mitchel Kirk — Sound designer, School of Music
Oscar Keys — Experience designer, School of Design
Radek Rudnicki — Lecturer and Major Coordinator of Music Practice
Mixt
Jessica Manins — Producer, CEO Mixt
Mauricio Hernandez — Developer & CG artist, Mixt
Ryan Sumner — Developer, Mixt
Taylor Carrasco — CTO, Mixt