The Story of NYU Shanghai as told through Mushrooms, brought to you by Guille and Marj

The Story of NYU Shanghai as told through Mushrooms
Collaborators: Marjorie Wang and Guillermo Carrasquerro
Tech: Google Tango, Lenovo Phab, Unity3D, MagicaVoxel, Blender
Design Doc PPT
Update PPT

The premise: NYU Shanghai collapsed many years ago, in the year 2019. You have come to visit the ruins of the university, now overgrown with plants and mushrooms. To hear the story of NYU Shanghai, you must plant a mushroom and wait for it to grow. When the mushroom is fully grown, you can experience the story of NYU Shanghai.

The process:
Guillermo and I are both quite terrible at scripting, so we wanted a premise that was more heavily focused on the visuals. So, we began by creating mushroom assets with the incredible free software MagicaVoxel. With this simple tool, we were easily able to create the environment, as well as the post-mushroom visuals, in a consistent design scheme.

We wrote a script (dialogue) read by beloved professor Clay Shirky. The script:
Clay: Welcome to the Interactive Media Arts floor. IMA was the former creative hub of NYU Shanghai.

(Instantiate Mushroom House)
My name is Clay and I will be your trip guide today.

IMA was founded as a sister program of the ITP graduate program in NYC. Its primary focus was integrating technology with the arts. IMA was born in 2013 alongside NYU Shanghai’s first-ever class. At the beginning, IMA had only 4 students.

(transform 4 to 300)
A department of 4 students quickly grew, with over 300 students taking IMA classes in 2017. In these years, the link between art and technology was shown to be a vital aspect of professional development.

(visual)
For example this project, developed by one of the very greatest of all IMA students ever, Marjorie Wang, bridged Virtual Reality with Artificial Intelligence. This project was adopted by Google to enhance the interview process.

(mini oasis)
In 2017 the lab for Augmented and Virtual Reality was created. This lab became the place to explore and develop technological advancements in AR and VR, without boundaries or guidelines, a place for self-expression and innovation.

At the peak of IMA’s success, the unimaginable happened: the great technological apocalypse descended from heaven and destroyed all electronics. Due to the diseased soul of society and its reliance on technology, NYU Shanghai collapsed and vanished in 2019.

How the script works: Christian, you already know this, but I’m very proud that I scripted everything, so I will explain my beautiful C# script. Big thanks to Sean G. Kelly for being there for me, for emotional support and scripting support. Keep in mind that I’m terrible at scripting, so there may be unnecessary lines.
With my script, I wanted the player to be able to plant mushrooms. After a certain amount of time, the mushrooms would “grow large enough” and a mushroom house would be instantiated alongside a Clay 3D model and an AudioClip.
To do this, I created public Transforms for every GameObject and AudioClip I wanted to instantiate at certain times. Then, I set a boolean for each instantiation. This allowed me to instantiate each object only once, so there wouldn’t be infinite clones of an object trying to instantiate at each time I set. Then, I created myTime, the timer that begins after the player plants the first mushroom; myTime increases by Time.deltaTime every frame. For each moment I wanted to instantiate something, I checked whether myTime was greater than or equal to that time and whether the corresponding boolean was still false. If so, I instantiated my Transform or AudioClip and then set the boolean to true, so only one copy of the object would ever be instantiated.
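A minimal sketch of that pattern (hypothetical names and times; the real script has one Transform, AudioClip, and boolean per event):

    using UnityEngine;

    // Sketch of the timer-plus-boolean instantiation pattern described above.
    public class MushroomStory : MonoBehaviour
    {
        public Transform mushroomHouse;    // prefab spawned when the mushroom has "grown"
        public AudioClip clayIntro;        // Clay's first line
        public AudioSource voice;          // plays the AudioClips

        private float myTime = 0f;         // timer started by planting the first mushroom
        private bool planted = false;
        private bool houseSpawned = false; // guards against infinite clones

        // Called by whatever handles the player's planting action.
        public void PlantMushroom()
        {
            planted = true;
        }

        void Update()
        {
            if (!planted) return;
            myTime += Time.deltaTime;

            // Each event checks its own time and boolean, then flips the
            // boolean so the object is instantiated exactly once.
            if (myTime >= 10f && !houseSpawned)
            {
                Instantiate(mushroomHouse, transform.position, Quaternion.identity);
                voice.PlayOneShot(clayIntro);
                houseSpawned = true;
            }
        }
    }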
For some mysterious reason, I was unable to destroy the objects, so I just created animations for the objects and set the last two keyframes as non-active.

For the splash screen, I created three GUI canvases, with a series of 2D images created using MagicaVoxel, to make animated GUIs. I created a plane one voxel tall and replaced different colors with each new frame (one way to drive the animation is sketched below the frames).
(Animated GUI frames: guione, guithree, guitwo)
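A minimal sketch of one way to drive that kind of frame animation on a canvas, by swapping sprites in a coroutine (hypothetical names; not necessarily how ours was wired up):

    using System.Collections;
    using UnityEngine;
    using UnityEngine.UI;

    // Cycles a UI Image through the exported MagicaVoxel frames.
    public class SplashAnimation : MonoBehaviour
    {
        public Image target;           // Image component on the GUI canvas
        public Sprite[] frames;        // the 2D frames, in order
        public float frameTime = 0.2f;

        void Start()
        {
            StartCoroutine(Loop());
        }

        IEnumerator Loop()
        {
            int i = 0;
            while (true)
            {
                target.sprite = frames[i];
                i = (i + 1) % frames.Length; // wrap around so the animation loops
                yield return new WaitForSeconds(frameTime);
            }
        }
    }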

Augmented Reality Storytelling by Tyler Rhorick

Reading Response

http://ima.nyu.sh/documentation/2017/02/15/mixed-reality-story-and-response-by-tyler-rhorick/

Blippar Intervention

http://ima.nyu.sh/documentation/2017/02/22/i-am-limitless-animation-ar-tyler-rhorick/

Live Broadcast AR

For the Live Broadcast AR assignment, I was part of the team that tried to augment the IMA equipment room to tell the story of a student who was murdered by a cat for turning in equipment late.

Personal Reflection: Overall, the process of converting the space to tell our story with the green screen went pretty well. I would say that our biggest challenge was in creating accurate scale, perspective, and lighting. As for the scale and perspective, we were able to achieve believable enough positioning of the “victim” student after moving the camera angle and Diana multiple times, but the lighting was one thing we could never remedy. I think this means that in the future we should pay better attention to lighting conditions to give our final image a better overall effect. I think we could have figured it out given more time, however, so I am not too sad walking away from this assignment.

Your Photogrammetry

Photogrammetry was one of the most difficult assignments of the semester for me, for reasons I still cannot fully explain. What I was trying to do in this project was create a 3D model of the meowspace to further the story we created in the Live Broadcast, but this proved to be more difficult than anticipated because of the following challenges:

  • The real MeowSpace couldn’t be used: Because the meowspace was under modification when this project was assigned, my original plan failed. I was lucky to find a 3D model of the meowspace in the lab that I ended up using, but this did cause some slight panic in the beginning.
  • Creating an accurate scan: The biggest problem with creating a photogrammetry model was capturing images that the program could successfully use. I think I had a difficult time because the object I was trying to scan was pretty uniform in texture, and lighting was hard to control against the surface of the structure.
  • Software: Another big problem I had was with the software itself. Even accounting for bad pictures, I could never figure out why the program never showed me a model after I followed the steps in the tech template. I showed this problem to Christian, but we still couldn’t figure out what was happening.

Here is a folder containing all of the countless cat pictures I took trying to do this assignment.

Your Game Design Document

Here is our Game Design powerpoint.

Your Core Mechanic Documentation

Here is our Core Mechanic powerpoint.

Your MR interview with Storyboard and Scan

Because there was a misunderstanding when the groups were making their way into the green screen room, Matuez and I got split from our larger group, where we had made a storyboard playing off the idea of the Sims. Because of this, we had to make a new model and storyline on the spot. To make the figure, we chose Adobe Fuse because it was quick and simple. We decided to make Vladimir Putin wearing makeup because of the recent ban on such imagery in Russia. As for the interview, it was decided that I would interview Matuez, acting as Vladimir, about topics like Russia, Ukraine, and his makeup.

Here you can see the video and Matuez’s perspective of the experience.

Immersive Sound

For the immersive sound project I decided to use Unity because my Max MSP trial gave out. To do this in Unity I watched several tutorials on YouTube. The lesson of these tutorials was basically that you need to turn on 3D sound by changing the spatial blend. Using this technique I created a simple player and an audio track of the NYU Shanghai alma mater. The player walks around the scene and the sound gets fainter as the player walks away. Here is a screenshot of what I changed in the audio settings to make the sound 3D, followed by a sketch of the same change in code.

(Screenshot: the Audio Source settings used to make the sound 3D)
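The same change can be made from a script; a minimal sketch, assuming it is attached to the GameObject holding the AudioSource:

    using UnityEngine;

    // Turning on 3D sound: spatialBlend of 1 makes the AudioSource fully 3D,
    // so its volume falls off as the listener moves away.
    public class Make3DSound : MonoBehaviour
    {
        void Start()
        {
            AudioSource source = GetComponent<AudioSource>();
            source.spatialBlend = 1f;                          // 0 = 2D, 1 = fully 3D
            source.rolloffMode = AudioRolloffMode.Logarithmic; // fade with distance
            source.loop = true;
            source.Play();
        }
    }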

Final Project

Project Title: Shanghai Storylines

Partner: Mateuz

Elevator Brief: Shanghai Storylines is an Augmented Reality history experience that communicates the history of Shanghai’s Pudong area. Using their phone, the user can explore the Pudong area of Shanghai and learn more about the often-untold history of the area. The project imagines what Shanghai would have looked like from the NYU Shanghai academic building throughout history.

Extended Description: Shanghai Storylines is made technically possible using Unity and Vuforia. To start the experience, the user walks up to an old Shanghai-style window. Upon scanning the Vuforia marker, the user is introduced to the experience. The first view the user sees is our imagined view of Shanghai from the NYU Shanghai Academic Building in the early 1900s. The landscape was made in Unity.

Technology: Unity, Smart Phone, Vuforia

Development: Before we could start the project, we had to do a great deal of research concerning the history of Shanghai. To start this research, I met with Anna Greenspan, a professor who has focused on the urbanization of China. She shared with me very interesting texts about the historic foliage of Shanghai, research we used in the final project when we chose to make all of the trees broadleaf evergreen models, in accordance with Shanghai’s historic ecosystem. After this research was done, we needed a better idea of what was actually built in Pudong. Though we had the idea that it was just going to be fishing villages based on widespread “knowledge,” we still decided to do the research to make sure that was the correct narrative.

The first lead we got on the prior history of Shanghai came from finding a map of the old Shanghai area on Google Images. After researching this map further, we found that it was one of very few maps of the area from that period and is widely considered one of the most reliable records of it. Here is that map below:

Shanghai 1945

This map proved to be monumental in moving forward because it told us that Pudong was not always called Pudong; it was formerly called Pootung. This helped us find much more information about the area, because Pootung is the name scholars have historically used for it. In searching for Pootung, we came across a book by Paul French called Old Shanghai. With its very detailed description of the Pootung area, we decided to base our project on his research, which painted a very colorful history of old Shanghai: a foreign cemetery for those who died at sea, foreign occupation of land controlled by the Chinese government, and animal warehouses that doubled as sites of the nightly trade in prostitution.

After we finished all of the research, which took up most of our time in the first weeks (to make sure we were telling a compelling narrative), we began working on the technical side of the project. In hindsight, we should have started this part of the project way sooner, because Matuez and I both had no experience in Unity or 3D modeling.

Because of this, we decided to split up the Unity work. I worked on getting the core mechanic of Vuforia working, while Matuez worked on getting the landscapes started. When it came to getting Vuforia working, we first wanted to use markerless tracking, but this proved to be more difficult than we anticipated, so we went back to using a marker. I also worked on the core mechanic of buttons and text boxes so that we could communicate the story of Shanghai (a sketch of this mechanism appears after the screenshot below). While I was doing this, Matuez was learning how to make terrains in Unity. He sent me a working model with the terrains started, and then I watched the same tutorials to finish up the models. To modify what he gave me, I decided to shape the terrain like the actual shape of Pudong. He had given me a square terrain, but to be truer to the history, I decided we should get the right shape to the terrain. To do this, I layered maps new and old onto a plane and sculpted the landscape. Here is how that process looked.

(Screenshot: sculpting the terrain from layered maps)
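As for the button-and-text-box mechanism mentioned above, a minimal sketch of that kind of setup might look like this (hypothetical names, not the project’s actual script):

    using UnityEngine;
    using UnityEngine.UI;

    // A button-driven text box for stepping through the story of Pudong.
    public class StoryPanel : MonoBehaviour
    {
        public Text storyText;            // UI Text box shown over the scene
        [TextArea] public string[] pages; // one entry per piece of history

        private int index = 0;

        void Start()
        {
            storyText.text = pages[index];
        }

        // Hook this up to the "next" button's OnClick event in the Inspector.
        public void NextPage()
        {
            index = (index + 1) % pages.Length;
            storyText.text = pages[index];
        }
    }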

 

I also decided to add water to the scene, to which I attached a script to make the water move (a sketch of that kind of script follows the screenshots below). In addition, I fleshed out some areas of the experience to give it a better sense of history, like the docks, graveyard, and factory. Here are some screen grabs of the finalized look of some of these areas.

(Screenshots: the docks, graveyard, and factory areas)
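A common way to script that kind of water movement is to scroll the material’s texture offset each frame; a minimal sketch (speed values are hypothetical):

    using UnityEngine;

    // Drifts the water texture so the plane appears to flow.
    public class WaterScroll : MonoBehaviour
    {
        public float speedX = 0.02f;
        public float speedY = 0.01f;

        private Renderer rend;

        void Start()
        {
            rend = GetComponent<Renderer>();
        }

        void Update()
        {
            // Texture offsets wrap, so the scrolling continues indefinitely.
            rend.material.mainTextureOffset += new Vector2(speedX, speedY) * Time.deltaTime;
        }
    }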

All of the assets were made from other elements of the free Asset Store. For instance, I made docks out of extremely large and distorted pallets from a warehouse collection on the Unity Asset Store.

In the end, our mechanics definitely worked, and I think we gave a great history of the region with the time we had. In the future, I would like to expand the historical content of the project and work to make the buttons and menus feel more integrated into the experience.

Here is a video of the mechanisms working:

The Immersive Soundscape of HATCH/宝贝 (Nicole + Saphya)

Finding Sound


“Coins 1” by ProjectsU012 aka Happy Noise

“8-Bit Wrong 2” by TheDweebMan aka Wrong Noise

“8bit-harmony-lowcutoff-envelope” by DirtyJewbs aka Theme Music

We stuck with chiptune and 8-bit audio to go with the retro feel of our game.

In-game Music


1. Sad Noise: this sound is activated whenever a heart is removed

2. Happy Noise: this sound is activated every time the player satisfies the qilin by scanning the Vuforia markers, giving the qilin pizza and/or coffee

3. Theme Music: this sound is constant and plays on awake, on loop

 

Google Tango Spatial Sound Experiment

Using the same audio, Sean helped us make a Unity scene for the Tango in which users can walk around a 3D soundscape and approach orbs with an AudioSource attached. You would hear a sound more clearly the closer you got to its orb and more faintly the farther away you were.


InnerChaos – Mixed Reality Storytelling Final Project – Zeyao, Shirley and Collin

Project Name:

Inner Chaos

Description:

Inner Chaos is an augmented reality iOS game developed by Zeyao, Shirley and Collin. The player uses the phone camera to scan common objects to equip their backpack, then uses the equipment in the backpack to fight the boss in the inner world. The player also needs to find the key item in our school so that they can get into the inner world and save the school.

Project Demo Video:

 

Project detail:

(Screenshots of the project)

 

HW10 Immersive Soundscape by David, Reine and Diana

The IMA site doesn’t allow me to upload the AIFF format, so I just uploaded it to Google Drive.

Link for the sound: https://drive.google.com/a/nyu.edu/file/d/0BzZ6RMX2hG5HbWdXbW1FeS13Smc/view?usp=sharing

The sound is used for the beginning of our final project. First, people will hear the sound of fire. Suddenly, they will hear a bird. Then there will be an explosion. Finally, they will hear some chicken sounds. We wanted to create a scene in which a phoenix tries to wake up from the fire, but something goes wrong and he becomes a chicken. I adjusted the height and the distance of the sounds in Max, but the effect became less obvious after I recorded them in Audacity.

Spacial Sound Documentation – Zeyao, Shirley and Collin

For the last tech template, we used Max 7, AudioHijack and Adobe Audition to make a spatial sound piece. We intended to create the environment of a public classroom. Imagine you are in that classroom: in front of you there is a student who is struggling with homework, wandering around. Behind you there are a bunch of people yelling and chilling out. Some people get shocked by something, so they scream. At the end, the main character is mad, so he screams too. The purpose of our spatial sound is to let the audience experience this chaotic environment.


 

Link: https://drive.google.com/drive/folders/0B9t9c61LjFhkTHctOURMZXEyNjA?usp=sharing

HATCH/宝贝 Core Mechanic (Nicole + Saphya)


The Basics:

The foundation of our app is controlled by four scripts: Time Management, Heart System, Camera Button, and Tracking Event Handler.

Time Management controls the active states of the hearts in the GUI. Using an array, this script sets the active state of each heart in the array to false at thirty-second intervals by calling a function in the HeartSystem script named HeartDeletion().

https://github.com/saphya-council/hatch-spr17/blob/master/timemanagement.cs
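A minimal sketch of that timing logic (the linked script is the real version; names here are simplified, and the inline removal stands in for the HeartDeletion() call):

    using UnityEngine;

    // Deactivates one heart icon every thirty seconds.
    public class TimeManagement : MonoBehaviour
    {
        public GameObject[] hearts; // heart icons in the GUI, in removal order

        private float timer = 0f;
        private int nextHeart = 0;

        void Update()
        {
            timer += Time.deltaTime;
            if (timer >= 30f && nextHeart < hearts.Length)
            {
                timer = 0f;
                hearts[nextHeart].SetActive(false); // stands in for HeartSystem.HeartDeletion()
                nextHeart++;
            }
        }
    }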

Camera Button is connected to the lower button in the GUI that activates the ARCamera. This script is attached to both the coffee button and the pizza button and takes several parameters associated with the button’s identity to pass into the Tracking Event Handler script. If OnTrackingFound() fires for the specified image target, then HeartAddition() from the HeartSystem script is called.

https://github.com/saphya-council/hatch-spr17/blob/master/camerabutton.cs

Tracking Event Handler cross-compares what is being scanned by the ARCamera with the image target the CameraButton script has sent it. This ensures that the camera doesn’t deinitialize after scanning just any image target in the database, and that the qilin receives the correct icon popup in the game.

https://github.com/saphya-council/hatch-spr17/blob/master/trackingeventhandler.cs
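A minimal sketch of that cross-comparison (in the real script the check lives inside Vuforia’s tracking callbacks; this strips it down to the name test):

    using UnityEngine;

    // Only reacts to the image target the CameraButton asked for.
    public class TrackingCheck : MonoBehaviour
    {
        private string expectedTarget; // e.g. "fdpizza" or "fdcoffee"

        // Called by the CameraButton script when a button is pressed.
        public void SetExpectedTarget(string targetName)
        {
            expectedTarget = targetName;
        }

        // Called with the name of whatever image target was actually found;
        // scanning a random target in the database returns false and is ignored.
        public bool IsExpected(string foundName)
        {
            return foundName == expectedTarget;
        }
    }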

Heart System is connected to both the CameraButton and Time Management scripts, and controls the qilin’s animations and appearance. If the qilin’s hearts reach zero, the qilin turns into a pile of bones; otherwise it remains a cute qilin.

https://github.com/saphya-council/hatch-spr17/blob/master/heartsystem.cs
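A minimal sketch of that state change (the model references are hypothetical stand-ins for the real animations):

    using UnityEngine;

    // Tracks the qilin's hearts and swaps its appearance at zero.
    public class HeartSystem : MonoBehaviour
    {
        public GameObject qilinModel; // the cute qilin
        public GameObject bonesModel; // shown when all hearts are gone

        private int hearts = 5;

        public void HeartDeletion()
        {
            hearts = Mathf.Max(0, hearts - 1);
            UpdateAppearance();
        }

        public void HeartAddition()
        {
            hearts++;
            UpdateAppearance();
        }

        void UpdateAppearance()
        {
            bool alive = hearts > 0;
            qilinModel.SetActive(alive);  // cute qilin while hearts remain
            bonesModel.SetActive(!alive); // pile of bones at zero hearts
        }
    }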

Future Plans:

We tried to incorporate multiplayer compatibility so that couples can chat and take care of the same qilin. We first attempted this with the Unity Multiplayer Networking tutorial; however, it did not work between a PC and a mobile phone, because there was no server to connect the two devices. Next, we explored Photon Unity Networking after getting advice from Sean. This was better, because Photon provides a cloud that can be accessed the way an API is used. In the short time that we had, we prioritized perfecting the Vuforia camera and the appearance of our app over the networking component. In the future, we hope to finish our work on that part of the app.

HATCH/宝贝 Final Doc Post (Saphya + Nicole)

PROMOTIONAL VIDEO:

 

WHAT IS HATCH?

(Game Design Document, slide 1)

宝贝/hatch is an augmented reality dating game that encourages “real world” interaction.

(Game Design Document, slide 2)

We realized that the problem with the culture of online dating is that people are less likely to meet in person, which is an issue we hope to rectify through our app.

(Game Design Document, slide 3)

A combination of Neopets and Tinder, the app matches users with each other based on common interests in their profiles, then triggers an event where they must find and foster a pet together.

(Game Design Document, slide 4)

These rules are put in place to establish a serious-minded relationship between two people. We want users to use our app as a means toward physical interaction, not to prolong virtual discourse.

(Game Design Document, slide 5)

Users can obtain resources in different places within the academic building to raise their pet. In the cafeteria one can scan for food; in the cafe one can scan for treats; and at any water cooler one can scan for water. The bare necessities. Users are also urged to visit their pet together every now and then to give their pet a love boost.

(Game Design Document, slides 6 and 7)

 

HOW DOES IT WORK?


The foundation of our app is controlled by four scripts: Time Management, Heart System, Camera Button, and Tracking Event Handler.

Time Management controls the active states of the hearts in the GUI. Using an array, this script sets the active state of each heart in the array to false at thirty-second intervals by calling a function in the HeartSystem script named HeartDeletion().

https://github.com/saphya-council/hatch-spr17/blob/master/timemanagement.cs

Camera Button is connected to the lower button in the GUI that activates the ARCamera. This script is attached to both the coffee button and the pizza button and takes several parameters associated with the button’s identity to pass into the Tracking Event Handler script. If OnTrackingFound() fires for the specified image target, then HeartAddition() from the HeartSystem script is called.

https://github.com/saphya-council/hatch-spr17/blob/master/camerabutton.cs

Tracking Event Handler cross-compares what is being scanned by the ARCamera with the image target the CameraButton script has sent it. This ensures that the camera doesn’t deinitialize after scanning just any image target in the database, and that the qilin receives the correct icon popup in the game.

https://github.com/saphya-council/hatch-spr17/blob/master/trackingeventhandler.cs

Heart System is connected to both the CameraButton and Time Management scripts, and controls the qilin’s animations and appearance. If the qilin’s hearts reach zero, the qilin turns into a pile of bones; otherwise it remains a cute qilin.

https://github.com/saphya-council/hatch-spr17/blob/master/heartsystem.cs

Here are our markers:

(Marker images: fdpizza and fdcoffee)

 

FUTURE PLANS:

We tried to incorporate multiplayer compatibility so that couples can chat and take care of the same qilin. We first attempted this with the Unity Multiplayer Networking tutorial; however, it did not work between a PC and a mobile phone, because there was no server to connect the two devices. Next, we explored Photon Unity Networking after getting advice from Sean. This was better, because Photon provides a cloud that can be accessed the way an API is used. In the short time that we had, we prioritized perfecting the Vuforia camera and the appearance of our app over the networking component. In the future, we hope to finish our work on that part of the app.

 

Baaria and Rewant – Quaternion Identity

PROJECT NAME
Quaternion Identity

PROJECT BY
Baaria Chaudhary & Rewant Prakash
(baaria@nyu.edu, rewantprakash@nyu.edu)

PROJECT DESCRIPTION
Quaternion Identity is a location-based, theatre-style augmented reality game experience. The game is set up in the academic building at NYU Shanghai and takes players on a journey to solve riddles and find clues to reach the next level. Players are given an iPhone with the game configured on it, which they then use to scan different markers and unravel different clues. At different stages of the game, players also interact with live actors to access clues and continue the experience.

The theme of Quaternion Identity is privacy and data security, inspired by a variety of events including fake news in the US presidential election, the Ashley Madison data breach, ransomware attacks, and the recent increase in cyber attacks.

SOFTWARE USED
Unity, Vuforia, IFTTT, OBS, C#, Adobe Premiere, Adobe Photoshop

HARDWARE USED
iPhone, Skanect, Green Screen

PROPS USED
Flash drive, book, tea table + experience, 3 rooms with different experiences (performing stages)

VIDEO
https://youtu.be/gTMtsK519-k