Final Documentation for “Bus Route 638”

Title: Bus Route 638

Director & Cinematographer: Yufeng Zhao

Characters: Driver Xu and an anonymous driver

Logline: Through casual conversations with drivers of the 638, the busiest bus route in Pudong, Shanghai, I hear the heartbeat of the city from another perspective and look closely at how people relate to their daily bus, having been a 638 rider myself for fourteen years.

Synopsis:

638… I have been riding this specific bus for fourteen years, yet I have never looked at it closely, even though it is woven into my whole life path. I met two friendly drivers willing to share their experiences with me: the route, the driving, the salary, and even the changes Shanghai has seen since the route was established thirty years ago. Drivers are generally not allowed to chat while on duty, so they talked a lot when I interviewed them late at night after work. From early morning, they spend over ten hours on the bus every day, until they see the city lights turn on, and sometimes until the lights go off again. Passengers come and go, benefiting from the drivers' hard work almost without realizing it… Maybe the drivers don't mind, or maybe they do, and that is why most of them smoke to unwind. And maybe watching the bus at different times of day, with the drivers talking over it, is a good way for me to present it and for you to perceive it.

Tech:

A website based on HTML5 video and subtitles

Sony A7RII & A7SII, mostly with a Canon EF 24-105mm f/4 lens on a Sigma adapter ring

Sony M10 recorder & Rode VideoMic Pro for field recording

Aputure lavalier mic on an iPhone for interview recording

A Shanghai Transportation IC card, used to ride the 638 for two weeks

Link: Google Drive link for all files

(The website is not online yet; please download the files and open index.html from a localhost server, since some browsers do not load the English subtitles from local files.)

Screenshots:

 

VR/AR : vr title & vive review | Linda Yao

VR/AR : vr title : accounting

This game was so weird, but I loved it. When my classmates were talking about it in class, I literally thought it was a math game. But oh, was I wrong. The game had a lot of dark humor and definitely triggered a lot of emotions. The graphics were well made and the instructions were straightforward. The game did make me a little dizzy, as the world flashed a lot and was never still; I am not sure whether that is because of the platform or the game itself. If the game were hosted on the Oculus, it would be interesting to see how its controllers could further enhance the experience.

Vive Review
The Vive's headset is sturdy and on the heavy side. The setup process was a little tedious, as the Vive has more parts and requires more space. After pairing the controllers, the playing experience on the Vive was great. The controllers themselves have simple functions, and I enjoy that there is a straightforward exit key. Overall, the gaming experience on the Vive is great.

NOC-Final Project(Cyndi)

Date: Dec.14th

Name: Cyndi

Instructor: Moon

Process:

My Nature of Code final project turned out to be very different from what I expected at the very beginning, but I kind of like it.

I got my inspiration from two pictures that were very popular online: when the same picture is viewed by different people, it triggers different effects. So I really wanted to make filters that allow people to see different things behind them.

One big part of my project is circle packing, which is mainly pixel manipulation. First, the sketch reads the pixel values from the image I loaded. Then another function generates shapes matching the pixel color and position of the original image; I can actually choose which shape fills each position. One thing worth noting is that the pixel density should always be set to one. At first, I really wanted to create different scenes of Shanghai, such as a rainy scene or a night scene, so I chose a picture showing the famous attractions of Shanghai and adjusted its colors, one version for the rainy scene and one for the night scene.
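The note about pixel density matters because p5.js stores four values (r, g, b, a) per display pixel, and a density above one multiplies both image dimensions, so the indexing math changes. A minimal sketch of that indexing in plain Java (the function name is mine, not from the project code):

```java
// Index into a p5.js-style pixels[] array. Each logical pixel (x, y) occupies
// 4 slots (r, g, b, a), and a pixel density d scales both axes, which is why
// the post keeps the density at one.
class PixelIndex {
    // Returns the index of the red channel of logical pixel (x, y).
    static int pixelIndex(int x, int y, int width, int density) {
        return 4 * ((y * density) * (width * density) + (x * density));
    }

    public static void main(String[] args) {
        // With density 1, pixel (2, 1) in a 10-wide image starts at 4*(1*10+2) = 48.
        System.out.println(pixelIndex(2, 1, 10, 1)); // 48
        // With density 2 the same logical pixel lands at 4*(2*20+4) = 176.
        System.out.println(pixelIndex(2, 1, 10, 2)); // 176
    }
}
```

With density 1 the familiar `4 * (y * width + x)` formula holds, which is what the circle-packing loop relies on.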

For the night scene, I decided to add stars to the sky. I really wanted to apply a trail effect, but since I already had the pixel version of the Shanghai night, I didn't know how to adjust the transparency of the layer underneath the stars. After talking to Moon, he taught me to create a separate graphics layer for the stars, so I could adjust the alpha value of that layer's background. I also learned more about blend modes in p5.js. The effect was just amazing; everything turned out great.

For the rainy scene, I first added some random grey particles to make it look foggy. Next, I used a flocking system to keep the raindrops parallel to each other. To make it more like a rainy scene, I created a puddle function that draws small circles when raindrops hit the ground. I also used the createSlider function in the p5 DOM library to add filters to the canvas: one slider controls the size of the ellipses and another controls the transparency.

The biggest problem I met in the process was the speed of my system; especially when the raindrops were falling, it was just super slow. I tried to port all my code to openFrameworks to make it smoother, but my code is fairly complicated and I got many errors. After talking to Moon, he suggested splicing out the particles when they reach certain values. When I first learned about splice, I thought it was just for making the scene less messy; only then did I realize that splicing out particles contributes a lot to making the system run faster.
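The "splice out particles" idea boils down to removing dead or off-screen particles from the list so the draw loop no longer iterates over them. A minimal, testable sketch in plain Java (class and field names are my own, not from the project):

```java
import java.util.ArrayList;
import java.util.List;

// Cull particles once their lifespan runs out or they leave the canvas,
// so the per-frame loop stays small and the sketch stays fast.
class ParticleCulling {
    static class Particle {
        float y;
        float lifespan;
        Particle(float y, float lifespan) { this.y = y; this.lifespan = lifespan; }
    }

    // Removes dead or off-screen particles; returns how many were culled.
    static int cull(List<Particle> particles, float canvasHeight) {
        int before = particles.size();
        particles.removeIf(p -> p.lifespan <= 0 || p.y > canvasHeight);
        return before - particles.size();
    }

    public static void main(String[] args) {
        List<Particle> drops = new ArrayList<>();
        drops.add(new Particle(100, 50));  // alive, on screen
        drops.add(new Particle(900, 50));  // fell past the bottom of an 800px canvas
        drops.add(new Particle(100, 0));   // lifespan used up
        System.out.println(cull(drops, 800)); // 2
        System.out.println(drops.size());     // 1
    }
}
```

In p5.js the same effect comes from calling `splice` on the particle array inside a reverse loop; the payoff is the speed-up described above.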

When I was designing this project at the very beginning, I wanted to create the world behind filters: everything would be dark at first, and when the filter is on, different scenes would be revealed only where the filter is. After talking to Moon and Jiwon, I tried the mask function in p5.js and also the color-erase method to imitate the feeling of a filter, but neither of them looked good, so I had to give up. After splicing particles out, I found the effect was surprisingly good, because all the pixels shine and move, and it was even better with the filters on!

I was going to do a third scene of the city, but Moon suggested trying something else, maybe something with really high contrast. So I downloaded a cartoon face from Google and used blobs to create the contrast.

To keep the system running fast, I actually had to do the splicing on almost every image I loaded. It worked well on some pictures, but on others I could not recognize the original image once it was "translated" into pixels.

For further exploration, I really want to do the mask idea: instead of just color filters, I want to apply more to it. I also want the pixel manipulation to work on almost any picture, so I may have to adjust the code in the circle-packing part.

Reflection:

I went through a lot of struggles in making this project. Although it is not what I wanted at the beginning, I still very much like the visuals I created. For the next step, I really want to make a real filter that can reveal and hide scenes from people's eyes. Last but not least, I just want to thank Moon and the other fellows for all the help they provided! They have been really kind and helpful. I learned a lot from Nature of Code!

My code:

 

(IL-Moon) Final Project: Twinkle with Fingers (by Eos)

Date: December 7th

Documented on: December 7th

Documented by: Yu Xu (Eos)

Instructor: Moon Junghyun

Demo

Project Description

Twinkle with Fingers is essentially a music visualizer: users mix music clips with gestures, and the mix is visualized as light art. The visuals consist of a 2-meter RGB LED strip, whose colors change with the music's amplitude, and a projection of the Processing graphics (colorful ellipses that move with the user's fingers, their sizes also changing with the amplitude). There are three music clips to play with: vertical hand movement changes the amplitude, and horizontal movement mixes between the clips.

Conceptual Development

Inspirations & Motivation:

I (and, I believe, everyone) love twinkling, fancy artworks very much! I have also been passionate about creating such things with LEDs since midterm. Here are some specific inspirations and references that contributed to my idea:

1. Leap Motion Apps

There are many apps designed for the Leap Motion that let users play with on-screen visuals using gestures and finger positions. I think using flexible hand movement as an "analog input", rather than forcing users to trigger a sensor, is a good idea: users have the freedom to make their own design instead of following a project's fixed settings. However, one problem with existing Leap Motion apps is that few of them create multi-sensory interaction, such as music (hearing) plus graphics (vision). Another problem is that these apps live on a flat screen, so they have no physical form, which reduces the visual impact and lets people's interest drift away quickly.

Reference: The Leap Motion Controller Hands-On Demo – Best Apps of Airspace HD

2. Music Fountains

I was at the world-famous music fountain show in Lima, Peru this summer. The visual impact is amazing, with generative art projected on the water and everything changing with the music. This is a good example of a physical artwork that combines sound and visuals. If this kind of thing were made interactive, it would be even more popular, because people would become users who engage in the design instead of an audience standing still.

Reference: Peru – Dancing Fountains of Lima

3. DIY LED Bottles

This is the kind of DIY artwork that got me fascinated with LEDs. It is good decoration, but it has the same problem as the music fountain: no interaction. It is also not multi-sensory.

Reference: Bottle Brights – 6th DIY of Christmas!

4. Existing Music Visualizer Projects

There are a lot of simple Arduino music visualizer projects. The problems with them: 1. the LEDs and circuits are exposed, which is not pretty; 2. most of them are not interactive.

Reference: Waren’s L.E.D Music Visualizer – Rainbow Veins by Owl City

Technical Development

1. Design Stage

How my project is going to work

Original Sketch

An acrylic box contains glass marbles, an LED strip, and a foam screen; a projector outside the box projects the graphics. Everything was realized except the fans and the flying foam screen! In practice, I found the foam clings too strongly: when blown by the fans, it sticks to the box instead of flying in the air. So in the end I stuck the foam directly on the box for the projection.

Materials Needed

Acrylic box * 1, glass marbles * 400, foam, 2-meter LED strip * 1, power jack * 1, 12V adapter * 1, LED strip driver * 1, jumper cables, Arduino UNO * 1.

Schematic

The LED controller in picture 1 is actually the driver in picture 2. Since I'm using only one strip, the chainable output is not used.

Real-life schematic

On the driver's input side, CLK, DIO, and GND are used; NC is not.

2. Implementation Stage

Fabrication

Glue glass marbles on the box

Glass marbles are highly reflective and can create super pretty lighting effects. Hot-gluing them onto the box and letting the LEDs shine through them is a good way to create a stronger visual impact.

400 marbles, glued one by one, 3 hours.

Done!

How beautiful they look with LEDs.

The foam I’m using. Pic credit to Taobao.

Assemble the box and make a foam screen with transparent adhesive tape. Fabrication done!

Prototypes & Experiments

Experiments

First, for the Arduino part, make the LEDs cycle through R, G, and B with the example code in the LED strip library (available at ima.nyu.sh → equip → LED strip driver → documentation page).

For the Processing part, use the mouse as a stand-in before switching to finger positions. Divide the canvas into three columns, one music clip per column: moving mouseY changes the amplitude, and moving mouseX switches between clips. (This is discussed further in Lessons Learned – Trials and Errors.)
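The column logic above reduces to two small functions: one picks a clip from the x position, the other maps the y position to an amplitude between 0 and 1. A testable plain-Java sketch (function names are mine; the full Processing sketch later in this post uses the same thresholds):

```java
// Split the canvas into three equal columns, one clip per column;
// x picks the clip, y sets the amplitude.
class ColumnMixer {
    // 0, 1, or 2 depending on which third of the canvas x falls in.
    static int areaFor(float x, float width) {
        if (x < width / 3) return 0;
        if (x <= width * 2 / 3) return 1;
        return 2;
    }

    // Top of the canvas (y = 0) is full volume, bottom is silent.
    static float ampFor(float y, float height) {
        return 1.0f - (y / height);
    }

    public static void main(String[] args) {
        System.out.println(areaFor(100, 600)); // 0: left column
        System.out.println(areaFor(300, 600)); // 1: middle column
        System.out.println(areaFor(500, 600)); // 2: right column
        System.out.println(ampFor(0, 800));    // 1.0 at the top
    }
}
```

Swapping mouseX/mouseY for the Leap Motion finger coordinates later leaves this logic unchanged.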

Sound Clips credit to:

  1. a sweet place: A sweet place
  2. violin: Beautiful violin music
  3. symphony: Symphony Sound

Then I used the sample AmplitudeRMS code in the Sound library to test amplitude analysis [rms.analyze()] and set up serial communication between Processing and Arduino. Note: this experiment uses one fixed music clip.

However, in this experiment the LED strip blinked and did not change with the volume. To address this: 1. double-check the serial communication code; 2. smooth the volume; 3. give the LED less delay; 4. introduce a sin function and an ampOffset. (These are discussed in detail in Lessons Learned – Trials & Errors.)

After making the adjustments, the LED visualizer works like this:

The effect after sticking the LEDs on glass marbles:

OK, the visualizer basically works! Then I started to consider what kind of graphics to project. I had thought of a galaxy simulation, but I found it a little beyond me and did not want to copy anyone else's code. So I thought of colorful ellipses with trails, inspired by the random color and opacity functions discussed in class! The colorful ellipses also echo the glass marbles.

However, when I tried to adjust the transparency of the background as we did in class, I found the background would not stay pure white after the ellipses moved. I turned to fellow Jiwon, and we found the "easing" example on Processing.org; I then created an array of ellipses. (To be discussed in Lessons Learned.)

I thought of tying the ellipses' diameters to the volume, but the effect was bad when the music was too quiet or too loud. So I asked some peers to do user testing, and they agreed that fixed-diameter ellipses are more interesting to play with: they can be in love or in a fight depending on how you move your fingers.

Once the colorful moving ellipses were in place, I tested the projection. It looks cool on a smooth surface such as paper, so I then tried to make my foam screen equally smooth and evenly distributed.

 

Lessons Learned

Trials & Errors and Know-hows

1. Why can't my Processing sketch loop, and why does it say "could not run the sketch" when I try to edit music clips?

Even with no problem in the code, this error still occurred, and my project was once stuck because of this bug. I asked the fellows for help, and after searching Processing.org, we found that we could restart the music in code like this:

In the global scope, declare these variables:

int duration;

float startTime = 0;

int duration1;

float startTime1 = 0;

int duration2;

float startTime2 = 0;

Then, in setup(), we introduce a function for timing effects called millis() (reference page here):

 duration = int(asweetplace.duration());

  println(duration);

  startTime = millis();

  duration1 = int(symphony.duration());

  println(duration1);

  startTime1 = millis();

  duration2 = int(violin.duration());

  println(duration2);

  startTime2 = millis();

In draw(), we use the timing values to restart each sound when it ends:

if (startTime + duration * 1000 < millis()) {

    println("start again!");

    asweetplace.play();

    startTime = millis();

  }

  if (startTime1 + duration1 * 1000 < millis()) {

    println("start again!");

    symphony.play();

    startTime1 = millis();

  }

  if (startTime2 + duration2 * 1000 < millis()) {

    println("start again!");

    violin.play();

    startTime2 = millis();

  }

If you only want to play three sound files, this method is fine. But it didn't work for me once I planned to edit the sound files (change their amplitudes). So trial 1 failed.
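The restart condition used above reduces to a simple timestamp check: a clip is due to restart once its start time plus its duration (converted to milliseconds) is in the past. A unit-testable plain-Java version (the function name is mine):

```java
// A clip that started at startTimeMs and lasts durationSec seconds
// should be restarted once the current time has passed its end.
class LoopTimer {
    static boolean shouldRestart(float startTimeMs, int durationSec, float nowMs) {
        return startTimeMs + durationSec * 1000 < nowMs;
    }

    public static void main(String[] args) {
        // A 3-second clip started at t = 0 ms:
        System.out.println(shouldRestart(0, 3, 2500)); // false, still playing
        System.out.println(shouldRestart(0, 3, 3001)); // true, time to call play() again
    }
}
```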

Then I tried the Minim library, as suggested by Prof. Moon. It failed again, saying: volume is not supported.

Finally I noticed that in the sample code the sound file is mono and in .aiff format, even though Processing.org says the Sound library supports mp3 and wav, which is what my files were. After converting the music clips to mono .aiff, they worked perfectly!

2. Why does my LED blink regardless of the music? How do I make the visualization work?

After double-checking my original code, I found no mistakes other than the long delay. After Prof. Moon helped me check, we found the fault in the Arduino-side serial communication, which I had copied and pasted from the examples: the code should not begin with "if (Serial.available())", it should be a while loop!
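The reason while matters: serial bytes queue up in a buffer between loop passes, and an if-statement consumes only one item per pass, so the reader falls further and further behind the sender. Simulating the buffer with a queue in plain Java (names are mine, and the queue stands in for the real serial buffer):

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Demonstrates why draining the serial buffer needs "while", not "if".
class SerialDrain {
    // Reads at most one queued value per call: the buggy "if" pattern.
    static int readWithIf(Deque<Integer> buffer) {
        int consumed = 0;
        if (!buffer.isEmpty()) { buffer.poll(); consumed++; }
        return consumed;
    }

    // Drains everything that has arrived: the "while" fix.
    static int readWithWhile(Deque<Integer> buffer) {
        int consumed = 0;
        while (!buffer.isEmpty()) { buffer.poll(); consumed++; }
        return consumed;
    }

    public static void main(String[] args) {
        Deque<Integer> buffer = new ArrayDeque<>();
        for (int i = 0; i < 5; i++) buffer.add(i); // five bytes arrived since last pass
        System.out.println(readWithIf(buffer));    // 1 -- four stale bytes remain
        buffer.clear();
        for (int i = 0; i < 5; i++) buffer.add(i);
        System.out.println(readWithWhile(buffer)); // 5 -- buffer fully drained
    }
}
```

With "if", the Arduino keeps reacting to stale packets, which looks exactly like random blinking.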

Once the LED worked consistently, new problems came: the volume derived from the analyze function changes too fast, so the LED seemed to behave erratically. In this case, the lerp() function is introduced for smoothing! Reference page here

Basically, lerp() implements the mathematical idea of linear interpolation: lerp(start, stop, amt) returns a value amt of the way from start to stop. The aim is to make the journey from x0 to x1 slower: instead of letting the amplitude bounce straight from one value to the next, each frame moves it only a fraction of the way toward the target, and this smoothed amplitude is then used.

After smoothing, Prof. Moon introduced the idea of a sin function with an ampOffset. The statement "ampOffset += amp" keeps adding the current amplitude to the offset, so the LED reflects the accumulated amplitude. The ampOffset is then put into the sin function, whose range is -1 to 1, and mapped to 0–255. Finally, those values are assigned to R, G, and B and sent to the Arduino.
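The smoothing-and-color pipeline can be written as three plain functions. lerp() and map() below reproduce Processing's built-ins of the same names; channel() is my own wrapper around the sin-and-map step that the full sketch later in this post performs for each of R, G, and B:

```java
// Smooth the amplitude, accumulate it, then turn the accumulated offset
// into a 0-255 color channel via sin().
class AmpColor {
    // Processing's lerp(): a value amt of the way from start to stop.
    static float lerp(float start, float stop, float amt) {
        return start + (stop - start) * amt;
    }

    // Processing's map(): rescale v from [inLo, inHi] to [outLo, outHi].
    static float map(float v, float inLo, float inHi, float outLo, float outHi) {
        return outLo + (outHi - outLo) * ((v - inLo) / (inHi - inLo));
    }

    // One color channel: sin of the accumulated offset, mapped to 0-255.
    static int channel(float ampOffset, float freq) {
        return (int) map((float) Math.sin(ampOffset * freq), -1, 1, 0, 255);
    }

    public static void main(String[] args) {
        System.out.println(lerp(0, 100, 0.25f)); // 25.0: a quarter of the way there
        System.out.println(channel(0, 0.05f));   // 127: sin(0) lands mid-range
    }
}
```

Using three different frequencies for the three channels (0.05, 0.03, 0.01 in the sketch) is what makes the strip drift through the color wheel rather than pulsing in grey.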

3. How to create an ellipse with a trail but without a dyeing effect on the background?

I used to create colorful ellipse trails by drawing a semi-transparent rectangle centered at (0, 0) over the canvas each frame. However, that only works with a dark background; with a white one, the background gets dyed.

After turning to Processing.org, I found a useful technique called easing (reference page here). With this method, I created an array of 50 ellipses that follow the first one, which is controlled by the Leap Motion, and lowered their opacity so the trailing ones look more transparent.
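The easing step each follower performs is: move a fraction of the remaining distance toward the target every frame. Followers with a smaller easing value lag further behind, which is what forms the trail. A minimal plain-Java sketch (the function name is mine; the full sketch does the same update per ellipse):

```java
// One easing step: close a fixed fraction of the gap to the target.
class EasingTrail {
    static float ease(float current, float target, float easing) {
        return current + (target - current) * easing;
    }

    public static void main(String[] args) {
        float x = 0, target = 100, easing = 0.5f;
        x = ease(x, target, easing); // 50.0
        x = ease(x, target, easing); // 75.0
        x = ease(x, target, easing); // 87.5 -- converging, never overshooting
        System.out.println(x);
    }
}
```

Because the update never overshoots, it needs no dark background or transparent overlay, which is what solves the white-background dyeing problem above.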

Tips & Conclusions

1. Individual work can be more efficient and meaningful! My midterm project was done with Phyllis, so I wanted to try making a project on my own for the final. I actually found individual work better: no chance of free-riding, freedom to arrange my own time, and a clearer view of my weaknesses. I feel I learned more by doing it on my own.

2. Turn to the right person for help! Fellows have their own strengths; make sure you ask those experienced in the skill you need. (Thanks to Jiwon for introducing me to sound-editing software!)

3. Try not to have others write the code for you. A good way to learn: when you are struggling, ask which direction to go and run trials on your own. I used to say "I can't do this," but now I say "Let me try first." If anything goes wrong, check the Processing.org or Arduino reference pages first.

4. Start your project as early as possible! There may be a ton of bugs and errors to deal with. Even though I started almost two weeks before the deadline, I still felt a little rushed.

 

import processing.serial.*;
import processing.sound.*;

int NUM_OF_VALUES = 3;
Serial myPort;
String myString;

int values[] = new int[NUM_OF_VALUES];

int prevArea;
int amp;
float ampTarget;
float volume;
int ampOffset= 0;
Amplitude rms;

SoundFile asweetplace;
SoundFile symphony;
SoundFile violin;

float fingerIndexX, fingerIndexY;
float fingerMiddleX, fingerMiddleY;
float fingerRingX, fingerRingY;

float[] x1 = new float[50];
float[] y1 = new float[50];
float[] x2 = new float[50];
float[] y2 = new float[50];

boolean x;
boolean y;
boolean z;


void setup() {
  fullScreen();
  //size(500, 500);
  background(255);
  setupLeapMotion();

  asweetplace = new SoundFile(this, "a sweet place.aiff");
  symphony = new SoundFile(this, "symphony.aiff");
  violin = new SoundFile(this, "violin.aiff");

  initial();

  rms = new Amplitude(this);
  rms.input(asweetplace);


  for (int i = 0; i < x1.length; i++) {
    x1[i] = 0;
    y1[i] = 0;
    x2[i] = 0;
    y2[i] = 0;
  }

  printArray(Serial.list());
  myPort = new Serial(this, Serial.list()[ 1 ], 9600);
  // check the list of the ports,
  // find the port "/dev/cu.usbmodem----" or "/dev/tty.usbmodem----" 
  // and replace PORT_INDEX above with the index of the port
  myPort.clear();
  // Throw out the first reading,
  // in case we started reading in the middle of a string from the sender.
  myString = myPort.readStringUntil( 10 );  // 10 = '\n', linefeed in ASCII
  myString = null;
}

void initial() {
  x=false;
  y=false;
  z=false;

  asweetplace.loop();
  violin.loop();
  symphony.loop();
}

void draw() {

  background(255);
  noStroke();

  updateLeapMotion();


  int area = 0;
  
  if (fingerMiddleX < width/3) {
    x=true;
    y=false;
    z=false;
    area = 0;
  } else if (fingerMiddleX >= width/3 && fingerMiddleX <= width*2/3) {
    y=true;
    x=false;
    z=false;
    area = 1;
  } else if (fingerMiddleX > width*2/3) {
    z=true;
    x=false;
    y=false;
    area = 2;
  } 

  if (x) {
    asweetplace.amp(map(fingerMiddleY, 0, height, 1.0, 0.0));
  } else {
    asweetplace.amp(0);
  }
  if (y) {
    violin.amp(map(fingerMiddleY, 0, height, 1.0, 0.0));
  } else {
    violin.amp(0);
  }
  if (z) {
    symphony.amp(map(fingerMiddleY, 0, height, 1.0, 0.0));
  } else {
    symphony.amp(0);
  }
  
  
  if (area == 0 && prevArea != area) {
    println("yes");
    rms.input(asweetplace);
  } 
  if (area == 1 && prevArea != area) {
    println("yes1");
    rms.input(violin);
  }
  if (area == 2 && prevArea != area) {
    println("yes2");
    rms.input(symphony);
  }
 prevArea = area;
 
 volume = rms.analyze() * 500;


  ampTarget = map(volume, 0, 500, 0, 255);
  amp = int(lerp(amp, ampTarget, 0.2)); //smooth
  ampOffset += amp;

  int red = int(map(sin(ampOffset*0.05), -1, 1, 0, 255));
  int green = int(map(sin(ampOffset*0.03), -1, 1, 0, 255));
  int blue = int(map(sin(ampOffset*0.01), -1, 1, 0, 255));

  values[0] = red;
  values[1] = green;
  values[2] = blue;

  //printArray(values);

  sendSerialData();
  
  for (int i = 0; i < x1.length; i++) {
    float easing = map(i, 0, 50, 0.01, 0.3);
    float opacity = map(i, 0, 50, 0, 255);
    float targetX1 = fingerIndexX;
    float dx1 = targetX1 - x1[i];
    x1[i] += dx1 * easing;
    float targetX2 = fingerRingX;
    float dx2 = targetX2 - x2[i];
    x2[i] += dx2 * easing;

    float targetY1 = fingerIndexY;
    float dy1 = targetY1 - y1[i];
    y1[i] += dy1 * easing;
    float targetY2 = fingerRingY;
    float dy2 = targetY2 - y2[i];
    y2[i] += dy2 * easing;

    fill(random(255), random(255), random(255), opacity);
    ellipse(x1[i], y1[i], 70, 70);
    ellipse(x2[i], y2[i], 70, 70);
  }
}

void sendSerialData() {
  String data = "";
  for (int i=0; i<values.length; i++) {
    data += values[i];
    //if i is less than the index number of the last element in the values array
    if (i < values.length-1) {
      data += ","; // add splitter character "," between each values element
    } 
    //if it is the last element in the values array
    else {
      data += "\n"; // add the end-of-data character '\n'
    }
  }
  //write to Arduino
  myPort.write(data);
}


void echoSerialData(int frequency) {
  //write character 'e' at the given frequency
  //to request Arduino to send back the values array
  if (frameCount % frequency == 0) myPort.write('e');

  String incomingBytes = "";
  while (myPort.available() > 0) {
    //add on all the characters received from the Arduino to the incomingBytes string
    incomingBytes += char(myPort.read());
  }
  //print what Arduino sent back to Processing
  //print( incomingBytes );
}
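The packet sendSerialData() writes is just the three channel values joined by commas and terminated by a newline; note that the terminator must be the literal escape "\n", since a bare "n" would never match the Arduino's end-of-packet check. The same packet can be rebuilt and verified in plain Java (class name is mine):

```java
// Rebuild the "R,G,B\n" packet that the Processing sketch sends to Arduino.
class SerialPacket {
    static String packet(int[] values) {
        StringBuilder data = new StringBuilder();
        for (int i = 0; i < values.length; i++) {
            data.append(values[i]);
            // comma between elements, newline after the last one
            data.append(i < values.length - 1 ? "," : "\n");
        }
        return data.toString();
    }

    public static void main(String[] args) {
        System.out.print(packet(new int[] {255, 128, 0})); // prints "255,128,0" plus a newline
    }
}
```

On the Arduino side, the sketch splits on the commas and treats the newline as the end of one frame of color values.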

Final Documentation for “Dream Girls”

Title: Dream Girls

Names of Team & Roles:

CAMERA/SOUND:

Alanna Bayarin, Amber Lin, Shiny Wu

PRODUCERS:

Alanna Bayarin, Amber Lin, Shiny Wu

DIRECTORS:

Alanna Bayarin, Amber Lin, Shiny Wu

Names of People in your film: 

Logline:

After relocating from different corners of the world to Shanghai, Kimberly, Mo Meaux, Fantasia and Ennis find themselves taking the stage by storm performing drag.

Synopsis: 

From the dressing room to the stage, this film follows four of Shanghai's prominent drag performers: Kimberly Kumswell, Mo Meaux, Fantasia Valentina, and Ennis FW. We talk to Kimberly, one of the biggest MCs in Shanghai's drag scene, who has performed in the drag capital, NYC, but prefers Shanghai for its unique, close-knit community. Mo Meaux, a professional dancer, moved to Shanghai with a dance company, starting out in the high-end club circuit and making her way to teaching dance classes in a studio. Fantasia, from Zhejiang, China, talks about navigating a Shanghai drag scene largely dominated by expat performers and about how Chinese culture influences it. Shanghai's first drag king, Ennis FW, puts on her final Shanghai performance before returning to the States for grad school, as she has never felt fully incorporated into Shanghai life without speaking Chinese. All four performers have played a large role in the evolution of the Shanghai drag scene.

Tech/Materials:

Link/Photos of Project:

TECH/MATERIALS

Camera:

-Canon EOS D60

Sound:

-TASCAM DR-40

-lavalier microphone

-shotgun microphone

Installation:

-projector

-two lab tables

-two large mirrors

-male dress form

-sewing machine

-various pieces of fabric/clothing/makeup