The Story of NYU Shanghai as told through Mushrooms, brought to you by Guille and Marj

Collaborators: Marjorie Wang and Guillermo Carrasquerro
Tech: Google Tango, Lenovo Phab, Unity3D, MagicaVoxel, Blender
Design Doc PPT
Update PPT

The premise: NYU Shanghai collapsed many years ago, in the year 2019. You have come to visit the ruins of the university, now overgrown with plants and mushrooms. To hear the story of NYU Shanghai, you must plant a mushroom and wait for it to grow. When the mushroom reaches full size, you can experience the story of NYU Shanghai.

The process:
Guillermo and I are both quite terrible at scripting so we wanted a premise that was more heavily focused on the visuals. So, we began by creating mushroom assets with the incredible free software MagicaVoxel. With this simple tool, we were able to easily create the environment, as well as the post-mushroom visuals, in a consistent design scheme.

We wrote a script (dialogue) read by beloved professor Clay Shirky. The script:
Clay: Welcome to the Interactive Media Arts floor. IMA was the former creative hub of NYU Shanghai.

(Instantiate Mushroom House)
My name is Clay and I will be your trip guide today.

IMA was founded as a sister program of the ITP graduate program in NYC. Its primary focus was integrating technology with the arts. IMA was born in 2013 alongside NYU Shanghai’s first-ever class. At the beginning, IMA had only 4 students.

(transform 4 to 300)
A department of 4 students quickly grew, with over 300 students taking IMA classes in 2017. In these years, the link between art and technology was shown to be a vital aspect of professional development.

For example this project, developed by one of the very greatest of all IMA students ever, Marjorie Wang, bridged Virtual Reality with Artificial Intelligence. This project was adopted by Google to enhance the interview process.

(mini oasis)
In 2017 the lab for Augmented and Virtual reality was created. This lab has become the place to explore and develop technological advancements in AR and VR, without boundaries or guidelines, as a place for self expression and innovation.

At the peak of IMA’s success the unimaginable happened, the great technological apocalypse descended from heaven and destroyed all electronics. Due to the diseased soul of society and its reliance on technology, in 2019, NYU Shanghai collapsed and vanished.

How the script works: Christian, you already know this, but I’m very proud that I scripted everything, so I will explain my beautiful C# script. Big thanks to Sean G. Kelly for being there for me, for emotional support and scripting support. Keep in mind that I’m terrible at scripting, so there may be unnecessary lines.
With my script, I wanted the player to be able to plant mushrooms. After a certain amount of time, the mushrooms would “grow large enough” and a mushroom house instantiates alongside a Clay 3D model and AudioClip.
To do this, I created public Transforms for every gameObject and AudioClip I wanted to instantiate at certain times. Then, I set a boolean for each instantiation. This let me instantiate each object exactly once, so there wouldn’t be infinite clones spawning every frame after its set time. Next, I created mytime, the timer that begins after the player plants the first mushroom and counts up with mytime += Time.deltaTime. For each event, once mytime was greater than or equal to the target time and the boolean was still false, I instantiated the Transform or AudioClip and then set the boolean to true, so only one copy of each object would ever be instantiated.
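The pattern above can be sketched roughly like this. This is a minimal reconstruction, not the actual project script: the field names, the ten-second threshold, and the OnMushroomPlanted hook are all hypothetical stand-ins, and the real script had one Transform, AudioClip, and boolean per event.

```csharp
using UnityEngine;

// A minimal sketch of the timer-and-boolean pattern described above.
public class MushroomTimeline : MonoBehaviour
{
    public Transform mushroomHouse;   // assigned in the Inspector
    public AudioClip clayAudio;
    public float houseTime = 10f;     // seconds after the first mushroom (hypothetical value)

    private float mytime = 0f;
    private bool planted = false;
    private bool houseSpawned = false;

    // Called when the player plants the first mushroom; starts the timer.
    public void OnMushroomPlanted()
    {
        planted = true;
    }

    void Update()
    {
        if (!planted) return;
        mytime += Time.deltaTime;

        // The boolean guard makes each event fire exactly once, so no
        // infinite clones spawn on every frame after the threshold.
        if (mytime >= houseTime && !houseSpawned)
        {
            Instantiate(mushroomHouse, transform.position, Quaternion.identity);
            AudioSource.PlayClipAtPoint(clayAudio, transform.position);
            houseSpawned = true;
        }
    }
}
```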
For some mysterious reason, I was unable to destroy the objects, so I just created animations for the objects and set the last two keyframes as non-active.

For the splashscreen, I created three GUI canvases, with a series of 2D images created using MagicaVoxel, to make animated GUIs. I created a 1 voxel tall plane and replaced different colors with each new frame.



Bafang is the last thing I see–Marjorie’s Final Animation

“Bafang is the last thing I see”
The Unity Build
The Idea

Most of my work is integrated into virtual reality. Especially in the past semester, I have found that integrating different technologies results in a more immersive, more compelling experience for the user. Thus, when our class was given our final animation assignment, I knew that I wanted to combine two mediums I enjoy creating in, animation and virtual reality, and see what would come out.
My favorite animation technique is drawing frame-by-frame with a computer mouse in Adobe Animate. My artwork generally lacks polish but retains some sort of charm, which lends itself well to my preferred style of animation. I wanted to stray from my usual VR environment design style: low-poly 3D models, brightly colored textures, and relatively high detail to heighten realism for the user. Thus, I decided to use a minimalistic black-and-white 2D style for my loops and to place the animations around the player at differing distances, to give a sense of depth.
The narrative is as follows: I am on the roof of my apartment, 27 stories up. When I look down, I see and hear the city of Shanghai below me. When I look up, I see twinkling stars. Timed with the music, the stars begin to form constellations in the shape of flowers. The flowers become more and more dense, until the sky is filled.

The Process
Movie Texture

To achieve this, I used Unity3D, C#, HTC Vive, Adobe Animate, and Adobe Photoshop.
The technical aspect of the project changed quite a bit, as I found that many techniques I thought would work looked absolutely horrendous.
To add the animated loops, I converted my video clips into .ogg format, imported them into Unity, and created a movie texture. Then, I added a plane gameObject to the scene and set the movie texture as the plane’s material. A short script is required to play the video, which I attached to each plane.
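The playback script can be as short as the sketch below. This assumes the Unity 5-era MovieTexture API (the version this project appears to use); the class name is hypothetical.

```csharp
using UnityEngine;

// A sketch of the short playback script attached to each plane whose
// material holds an imported .ogg movie texture.
[RequireComponent(typeof(Renderer))]
public class PlayLoop : MonoBehaviour
{
    void Start()
    {
        Renderer r = GetComponent<Renderer>();
        // The movie texture was assigned as the material's main texture.
        MovieTexture movie = (MovieTexture)r.material.mainTexture;
        movie.loop = true;   // the animations are seamless loops
        movie.Play();
    }
}
```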

My original idea was to create three different scenes in Unity3D: one for the city, one for the stars, and one for the flowers. However, I ran into problems with scene loading and audio. When the user is in the headset and the scene changes, they see a navy loading screen that breaks the immersion. Furthermore, the audio would cut abruptly. To combat this, I put all the animations into a single scene, instantiated on a timer.

C# Script
Now that I was only dealing with one scene, I wrote a C# script to manage the instantiation of the different clips. The script did a few jobs. I wanted the user to begin by looking at the city; then, once they turned their head up to look at the stars, or turned around, it would trigger the instantiation of the music and the flowers, synced with the audio. To achieve this, I (with the help of Sean) turned the user into a unicorn. (This is because I’m not good at scripting and I didn’t want to learn how to raycast properly.) I parented an elongated rectangular prism onto the user’s HTC Vive headset and set it as a trigger. Then, I placed one cube above and one behind the player. Because the cubes were colliders, once the player’s unicorn horn hit either cube, the timer would begin.
Once the timer is running, transform objects are instantiated at certain intervals of time. To ensure that the script did not instantiate objects infinitely at a given time, I created a boolean that is set to true after each object instantiates.
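Combining the unicorn-horn trigger with the interval timer gives something like the sketch below. This is a reconstruction under the same one-boolean-per-object pattern, not the original script; the StartZone tag, field names, and eight-second threshold are hypothetical.

```csharp
using UnityEngine;

// A sketch of the horn-trigger plus interval timer. The "horn" is an
// elongated prism parented to the headset with Is Trigger checked; the
// cubes above and behind the player carry a hypothetical StartZone tag.
public class StarTimeline : MonoBehaviour
{
    public Transform flowerLoop;      // one of the animated planes
    public AudioClip music;
    public float flowerTime = 8f;     // seconds into the music (hypothetical)

    private float timer = 0f;
    private bool started = false;
    private bool flowersSpawned = false;

    void OnTriggerEnter(Collider other)
    {
        // The horn entering either cube starts the music and the timer.
        if (!started && other.CompareTag("StartZone"))
        {
            started = true;
            AudioSource.PlayClipAtPoint(music, transform.position);
        }
    }

    void Update()
    {
        if (!started) return;
        timer += Time.deltaTime;

        // One boolean per object keeps each instantiation to a single copy.
        if (timer >= flowerTime && !flowersSpawned)
        {
            Instantiate(flowerLoop);
            flowersSpawned = true;
        }
    }
}
```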

For the city animation, I thought I would be able to take a photo from my rooftop, trace over it, and have a somewhat 3D-looking scene, as the perspective would be correct. Oh boy was I wrong. Once I placed the image onto a plane into Unity, it was clear that the city was a flat image. After a consultation with Tim, we decided to try to take the four foreground buildings out of the image onto separate planes and have them be transparent images. This worked really nicely, as user testers were no longer distracted by the obvious 2D-ness of my city.

Chroma Key
At this point, I was facing a major problem that took three weeks to resolve. My original idea was to use transparent videos for my animated loops, so I could give a sense of depth to my stars and place the flowers on top of the stars without obscuring them. However, Unity3D Personal Edition does not support transparent videos (though it does support transparent images), so I attempted to chroma key out the black in my videos, leaving just the white, resulting in an effectively transparent video. This proved very difficult, as I was unfamiliar with using shaders in Unity. I created a standard shader, put in my chroma key shader script, and proceeded to attach the shader in the wrong menu for three weeks. It was only two days before the project was due that I realized my error; after that, the rest of the project was smooth sailing.

Did I achieve what I set out to?
Through this process, I learned a lot about Unity, designing for virtual reality, using layers and animation loops, and C# scripting, which was my goal in creating a 2D, frame-by-frame animation in virtual reality. I wanted to test whether 2D placed in 3D could give a sense of immersive 3D. In the end, the feedback I received gives the impression that the project was successful.

What would I do differently?
If I had more time, I would place more stars in the scene, positioning them further away and drawing a greater variety of constellations. I would arrange the stars and flowers more carefully around the player, to minimize overlap errors. I see now why Unity is trying so hard to create a platform for creating VR in VR. It is tedious to constantly move an object, play the scene, and put on the headset to check whether it’s placed correctly.
I would also like to focus more on the transitions between the elements on the scene, fading away from the city to the stars, and giving more indication to the user that the narrative is changing.
Next steps for the concept: Playing around with having animations as the textures of 3D modeled objects; removing the unicorn and recording a 360 video in Unity for the animation to be viewable on mobile VR; using the loops I have now to create a traditional animation.

Thanks to:
Lei Ban for the incredible soundtrack.
Sean Kelly for endless technical support.
Tim Szetela for fixing nearly all my non-scripting problems.

Bafang Womansions–Marjorie Wang’s Capstone Project

Bafang Womansions
Bafang Womansions is a virtual reality time capsule of my home life in the last year of university (Fall 2016 to Spring 2017). Virtual Reality is a wonderful way to capture the essence of my apartment, my friends and roommates, and our memories, as the virtual space can be revisited at any time. To document my senior year, I 3D modeled the space and furniture to scale, included as many personal details as time would allow, in the form of objects that represent moments in the past year, and added life-size 3D models of my roommates.
The Problem with the Concept
Last semester, I worked mostly on creating virtual reality experiences and games, with an emphasis on improving the ease of user interaction with a virtual, and thus unfamiliar, space. In collaboration with Baaria Chaudhary, I created Hyperspace VR, an investigation into the use of sound as a trigger to direct user interaction. In our second project, The Last Star System, we incorporated more game-like elements into a space exploration experience, where the user travels to surrealist planets of our imagination in search of life. By the end of the semester, we had cultivated a good intuition for designing virtual reality worlds, and I was completely comfortable quickly creating low-poly 3D assets in Blender for prototyping in Unity3D for the HTC Vive.
The problem was, Baaria and I work so well together because I am the 3D modeler, the environment designer, the artist behind the visuals, while she is the programmer behind the C# scripts running the interactions. Thus, I attempted to formulate a project that would allow me to focus purely on the work I love to do, which resulted in concept one: creating a series of nature CGI scenes of a single alien world with Cycles, Blender’s render engine, viewable in Google Cardboard. I discovered that the most recent update of Blender supported stereo equirectangular rendering, which easily allowed me to preview my renders in virtual reality. However, I began to run into problems when I had capstone meetings with professors and advisors. I kept talking about my passion for 3D modeling, for creating the world, and pushing the user experience to the side. Concept one went through several iterations. One was a puzzle game, where the user would attempt to cross a lake by placing objects as a walkway. Another was an exploratory experience playing with a sense of scale and detail. Yet no idea stuck with me, and I spent the first few weeks of the capstone process doing several YouTube tutorials on rendering natural objects in Cycles.
The change happened when I scrapped my first idea and began 3D modeling my apartment. Prior to this point, I found myself avoiding opening Blender, as I was uninspired by my concept. After a consultation with capstone advisor AJ Levine, I was able to conceptualize my new idea and to articulate why I enjoyed 3D modeling my apartment so much more than creating a relatively impersonal nature scene. The idea became: to document my senior year, I would 3D model the space and furniture to scale, include as many personal details as time would allow, in the form of objects that represent moments in the past year, and add life-size 3D models of my roommates. Once I began describing my project in this way (as a VR time capsule, built for my roommates and me), I found myself happily modeling the objects in my apartment in as much detail as I could. Every time I modeled a new object, I considered the stories it tells, and in discussing my project with friends, we recounted our different memories surrounding the same object.
The 3D Modeling
To create a VR time capsule, I needed to retain a level of realism in my models. Although I was no longer aiming for photorealism, such as that of CGI, I needed the space to retain the essence of my apartment. To do this, I used several techniques of photorealism in the modeling work. Although I began by eyeballing the dimensions of the space (windows, doors, walls, and furniture), once I began to measure every object I was modeling, it was much easier to set up a space that seemed realistic, no matter the placement of the individual objects. Another technique I used is beveling. In real life, no object has a perfect edge. Once I began to bevel the larger pieces of furniture, the effect was perhaps imperceptible, but significant. The last technique is adding seemingly extraneous details to achieve a greater overall sense of realism. For the apartment to look lived in, I needed to add the ugly AC unit, the paintings on the walls, the books in the shelves, and the empty water bottles. I would say that this aspect of the project gave me the most trouble, as it was the most constrained by time. This is where I will continue to work as I expand and improve the project.
Settlers of Catan Board, a game we play several times per week.

An overview of the final model, with enough information to give the illusion of a full apartment.

Detail of the models.

The mantle.

Detail of dining room table.

Our incredible oven, which has produced hundreds of cinnamon rolls.

Detail of table.

Interacting with the different objects in the scene.

Photoscanned model of Kate and a photoscanned model of me.
SketchFab Link to an earlier version of the Apartment Model
The Photoscanning
The project allowed me to become much more comfortable with photoscanning. I photoscanned myself and two roommates (out of the five people I wanted to include) with the Structure Sensor and the Skanect application on the iPad mini (with invaluable help from Kyle Greenberg). I experimented with scanning using both Skanect and itSeez3D. Whereas Skanect models are lower resolution, they are better equipped to be rigged and animated with Mixamo; itSeez3D models were higher quality and great for static poses.
The Bafang Womansions.

What Changed?
What changed between concept one and concept two? The time capsule allowed me to design interaction for myself and the people closest to me. The way we interact in a virtual version of a space we’ve taken a year to personalize will be far different from the way players would have interacted with my concept one project. With my capstone, I wanted to further explore the medium that I love to create in, 3D modeling, and the capsule allowed me to focus on the process as well as the end product I presented in class. Most importantly, Bafang Womansions became a personal project to remember what I love in a medium that I love.
Tech Overview
I used Blender for 3D modeling; Unity3D to put the scene together and add interaction; and the Structure Sensor, iPad, Skanect, and itSeez3D to photoscan the bodies, all developed for the HTC Vive with help from the SteamVR plugin and VRTK.
Successful? Next Steps
The next steps are to continue modeling and placing objects into the virtual space to help the Bafang Womansions remember our time in Bafang Mansion. I would also love to begin recording my friends and me as we hang out together around our dining room table, and to place these audio clips into my time capsule, which will give a deeper sense of presence for the eventual player.
I believe that I was successful in finding a project that represents me quite well. However painful the journey of discovery was, I am satisfied with the final concept. As for whether the project itself is successful, only time will tell. The questions that remain unanswered are: will the virtual space allow my roommates and me to remember our senior year? Will we even use it? As Matt pointed out during my presentation, will the technology the project is hosted on become obsolete, rendering the project inaccessible? My hope is for the five of us, Baaria Chaudhary, Katarzyna Olszewska, Saphya Council, Efae Nicholson, and me, Marjorie Wang, to remember the wonderful times we have had in the past four years of our friendship and to continue to make memories together.
Throughout the capstone process, I began to see the value in the process of ideating a project and choosing to go forward or to go in a different direction. Clay Shirky’s advice during the first week held true throughout: when there are two paths to take, don’t spend time deciding which is better; just try one. For this realization, I would like to thank the entire capstone faculty. Thanks to Owen Byron Roberts for being an incredible professor during the Fall 2016 semester, kickstarting my love of Blender, and giving me a Unity3D foundation. I’ve never had a professor go so above and beyond in helping our projects achieve higher potential. Thanks to Kyle Greenberg for inspiring my usage of photoscanned models and sharing his knowledge with me. Thanks to Christian Grewell for creating a space for uninhibited, stupid creativity, and providing the technical support needed to get our projects working. My biggest thanks goes to AJ for being a wonderful capstone advisor, for sitting down with me time and time again to brainstorm.

Final Animation Mood Board–Marjorie Wang

The Concept
My final animation is a sound visualization exploring the making of a 2D frame-by-frame animation in Virtual Reality. The experience places the user in a dark void with twinkling stars. As the sound changes, constellations begin to form, creating a world of my imagination that becomes overwhelmingly crowded and dense, resulting in the contraction of the universe by gravity, and subsequently, the Big Bang.
The mood board depicts the change of my concept. At first, I wanted to depict zero gravity, and have the user experience what zero gravity may feel like, even though they are standing in a room with their feet planted on the ground. However, after visiting the Studio Ghibli museum in Tokyo, I revised my concept to incorporate a more whimsical, imaginative experience.
I will be working in Adobe Animate to create transparent animations, which will be imported into Unity and layered together to create a 3D feel. Developed for the HTC Vive.
The Tumblr

Sound Visualization Animation–Marjorie Wang

My sound visualization animation began as an exercise in procrastination. I struggled with the class exercise–visualizing the sounds of others in an abstract manner–and found myself gravitating towards figurative drawings of aliens, UFOs, and toilets. My sounds were: keys clanging, pencil tapping, eggshell crushing, a Sadako impersonation, and an eagle impersonation. I had no clue how to produce a cohesive animation and was getting quite uninspired with the assignment. However, once I began animating in Adobe Animate CC, I found that by corresponding the sounds with changes in the graphics, I could essentially make any animation I chose. Thus, in the final piece, there is heavy allusion to 2001: A Space Odyssey, the solar system, Sputnik, the ISS, dragonfruit, body parts, and my favorite, the (hoax?) phenomenon of the double iris. As I animated more and more, the images became a reflection of aspects of my life, my friends, and the things we discover together.


Five Sounds

Five sounds:
Keys clang on the ring of a wallet, some at a higher pitch than others, providing a cacophony of obtrusive sound.
The uncertain pitch of The Grudge’s Kayako stutters and falls before it rises again in terrifying power and mounting suspense. Elevator music plays in the background, a contrast to a sound so evocative of the Japanese horror genre.
A table creaks with the shuffling of a pencil on paper as background noise.
An egg is crushed with audible force, dryer than the chew of a potato chip, and more aggressive than the crumple of paper.
A rhythm of knocking gives way to the shriek of a bald eagle, evoking an almost old-western-cinema influence.

Walk Cycle — Marjorie Wang

For my walk cycle, I used Zotter Chocolate wrappers as my material and inspiration. I used the inner and outer wrappers of chocolates from the Handscooped line, namely Candied Ginger, Apples and Carrots with Ginger, Lemon Curd and Orange Lord, For the Best Mother in the World (Almond Roses), Caipirinha, and Cheese, Walnut and Raisin. I was inspired by the packaging artwork of Andreas H. Gratze, whose fun characters and scenes bring Zotter chocolates to life. I wanted to maintain the same sense of liveliness and movement in my animation. Thus, I made one skipping walk cycle, one cartwheel cycle, two muscular men dancing to Thriller, falling limes, flying carrot sticks, and a few static characters in dynamic poses.

Pauline Oliveros Response — Marjorie Wang

Some Sound Observations by Pauline Oliveros is an incredible piece. She speaks about audio with such care and respect, presenting sound as constant inspiration. Personally, I find myself talking about tea in the same tone, using similar descriptors, finding the same awe in something that a majority of people around me disregard. To me, drinking tea is like going on a vacation; I’m taken away from whatever I’m doing in the moment, enveloping my sense of smell and taste completely with something almost purely good. When I close my eyes, nothing is lacking.
Oliveros successfully brings out the wonder that she finds in sound, something I’ve been trying to inspire in the casual tea community. The similarities are endless. The nuances of tea can vary from picking day to picking day. For the same tea, let’s say a Huang Shan Mao Feng (a savory, bacon-y, unshaped green tea), a one-day difference in picking date means a more flavorful, clean tea versus a less refined, rougher tea. Oliveros brings us to the sound of the bulldozer outside her window, to the way the sound meshes with an airplane drone, and to the way the sound reverberates in her eardrum. When I drink tea, I love to notice how the way I brew affects the flavor and aroma. I can brew tea to suit my tastes, to bring out the flavors that I prefer in the tea. I can brew the same tea to suit the tastes of two completely different people. Oliveros manipulated sound to hear only the combination tones that she so loved.
In the past, sound has never been my strong suit, though a majority of my aversion is due to my lack of interest in developing skills in the area. And yet, I was always frustrated by the common lack of interest in Chinese tea. Some Sound Observations has made me rethink my relationship with sound and audio, forcing me to consider the world as Oliveros does: through sound.