“Bafang is the last thing I see”
The Unity Build
Most of my work is integrated into virtual reality. Especially in the past semester, I have found that integrating different technologies results in a more immersive, more compelling experience for the user. Thus, when our class was given our final animation assignment, I knew that I wanted to combine two mediums I enjoy creating in, animation and virtual reality, and see what would come out.
My favorite animation technique is drawing frame-by-frame with a computer mouse in Adobe Animate. My artwork generally lacks polish but retains a certain charm, which lends itself well to my preferred style of animation. I wanted to stray from my usual VR environment design style: low-poly 3D models, brightly colored textures, and relatively high detail meant to increase realism. Instead, I decided to use a minimalistic black-and-white 2D style for my loops and to place the animations around the player at differing distances to give a sense of depth.
The narrative is as follows: I am on the roof of my apartment, 27 stories up. When I look down, I see and hear the city of Shanghai below me. When I look up, I see twinkling stars. Timed with the music, the stars begin to form constellations in the shape of flowers. The flowers become more and more dense, until the sky is filled.
To achieve this, I used Unity3D, C#, HTC Vive, Adobe Animate, and Adobe Photoshop.
The technical aspect of the project changed quite a bit, as I found that many of the techniques I thought would work looked absolutely horrendous.
To add the animated loops, I converted my video clips into .ogg format, imported them into Unity, and created a movie texture from each. Then, I added a plane gameObject to the scene and placed the movie texture on the plane as its material. A short script is required to play the video, which I attached to each plane.
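A minimal sketch of that play script, assuming the plane's material already uses the imported MovieTexture (Unity's legacy video type at the time, since replaced by VideoPlayer) as its main texture:

```csharp
using UnityEngine;

// Attached to each plane that displays an animated loop.
// Assumes the plane's material uses the imported .ogg clip
// (a MovieTexture) as its main texture.
[RequireComponent(typeof(Renderer))]
public class PlayLoop : MonoBehaviour
{
    void Start()
    {
        MovieTexture movie = GetComponent<Renderer>().material.mainTexture as MovieTexture;
        if (movie != null)
        {
            movie.loop = true;  // keep the clip cycling for the whole experience
            movie.Play();
        }
    }
}
```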
My original idea was to create three different scenes in Unity3D: one for the city, one for the stars, and one for the flowers. However, I ran into problems with scene loading and audio. When the user is in the headset and the scene changes, they see a navy loading screen that breaks the immersion. Furthermore, the audio would cut out abruptly. To combat this, I put all the animations into a single scene and instantiated them with a timer.
Now that I was only dealing with one scene, I wrote a C# script to manage the instantiation of the different clips. The script handled a few jobs. I wanted the user to begin by looking at the city; then, once they turned their head up to look at the stars or turned around, it would trigger the instantiation of the music and the flowers, synced with the audio. To achieve this, I (with the help of Sean) turned the user into a unicorn. (This is because I'm not good at scripting and I didn't want to learn how to raycast properly.) I parented an elongated rectangular prism onto the user's HTC Vive headset and set it as a trigger. Then, I placed cubes above and behind the player. Because the objects were colliders, once the player's unicorn horn hit either cube, the timer would begin.
Once the timer begins to run, objects are instantiated at certain points in time. To ensure that the script did not instantiate objects infinitely at a given time, I created a boolean for each object that is set to true after it instantiates.
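A sketch of the horn trigger and the timed sequencer described above. The prefab names, the "StartCube" tag, and the timing offsets are all illustrative assumptions, not the project's real values, and in Unity the two classes would live in separate files:

```csharp
using UnityEngine;

// Timed sequencer: once started, it plays the music and instantiates
// each prefab at a fixed offset, with one boolean per prefab so
// nothing is ever instantiated more than once.
public class SequenceManager : MonoBehaviour
{
    public GameObject starsPrefab;   // assigned in the Inspector
    public GameObject flowersPrefab;
    public AudioSource music;

    bool running;
    bool starsSpawned;
    bool flowersSpawned;
    float elapsed;

    public void StartSequence()
    {
        if (running) return;   // ignore repeat triggers from the horn
        running = true;
        music.Play();          // everything else is timed against this track
    }

    void Update()
    {
        if (!running) return;
        elapsed += Time.deltaTime;

        if (!starsSpawned && elapsed >= 5f)   // example offset
        {
            Instantiate(starsPrefab, Vector3.zero, Quaternion.identity);
            starsSpawned = true;   // guard against infinite instantiation
        }
        if (!flowersSpawned && elapsed >= 20f)  // example offset
        {
            Instantiate(flowersPrefab, Vector3.zero, Quaternion.identity);
            flowersSpawned = true;
        }
    }
}

// The "unicorn horn": a long trigger collider parented to the headset.
// When it enters one of the cubes placed above and behind the player,
// the sequence starts.
public class HornTrigger : MonoBehaviour
{
    public SequenceManager manager;  // assigned in the Inspector

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("StartCube"))  // hypothetical tag on the cubes
        {
            manager.StartSequence();
        }
    }
}
```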
For the city animation, I thought I would be able to take a photo from my rooftop, trace over it, and have a somewhat 3D-looking scene, as the perspective would be correct. Oh boy, was I wrong. Once I placed the image onto a plane in Unity, it was clear that the city was a flat image. After a consultation with Tim, we decided to pull the four foreground buildings out of the image onto separate planes as transparent images. This worked really nicely: user testers were no longer distracted by the obvious 2D-ness of my city.
At this point, I was facing a major problem that took three weeks to resolve. My original idea was to use transparent videos for my animated loops, so I could give a sense of depth to my stars and place the flowers on top of the stars without obscuring them. However, Unity3D Personal Edition does not support transparent videos (the software does support transparent images), so I attempted to chroma key the black out of my videos, leaving just the white and resulting in a transparent video. This proved very difficult, as I was unfamiliar with using shaders in Unity. I created a standard shader, put in my chroma key shader script, and proceeded to attach the shader in the wrong menu for three weeks. It was only two days before the project was due that I realized my error; after that, the rest of the project was smooth sailing.
Did I achieve what I set out to?
Through the process of creating a 2D, frame-by-frame animation in virtual reality, I learned a lot about Unity, designing for virtual reality, using layers and animation loops, and C# scripting, which was my goal. I wanted to test whether 2D animation placed in 3D space could give a sense of immersive 3D. In the end, the feedback I received gave the impression that the project was successful.
What would I do differently?
If I had more time, I would place more stars in the scene, set them further away, and draw a greater variety of constellations. I would also position the stars and flowers more carefully around the player to minimize overlap errors. I see now why Unity is trying so hard to create a platform for creating VR in VR. It is tedious to constantly move an object, play the scene, and put on the headset to check if it's placed correctly.
I would also like to focus more on the transitions between the elements in the scene, fading from the city to the stars and giving the user more indication that the narrative is changing.
Next steps for the concept: Playing around with having animations as the textures of 3D modeled objects; removing the unicorn and recording a 360 video in Unity for the animation to be viewable on mobile VR; using the loops I have now to create a traditional animation.
Special thanks to Lei Ban for the incredible soundtrack.
Sean Kelly for endless technical support.
Tim Szetela for fixing nearly all my non-scripting problems.