Final Project Documentation – Jiannan Shi (Sean)



By Ellen Ying and Jiannan Shi.

We believe the emotion of a tree matters.

Developing Ideas:

For this project, Ellen and I decided to work together not only because we are both music fans, but also because we share the same idea of what “interaction” means, as both of our essays for this project mentioned. When brainstorming for the final project, we both came up with the idea of making an instrument, but it was hard to come up with a feasible plan to do it.

Since I have followed the topic of humans and nature throughout this semester (“Dog and Home” for the Stupid Pet Trick, “Dog Finds Master” for the midterm project), I wanted to continue with it. With the concept of “interaction” in mind, I wanted to find a way to actually interact with nature. So Ellen and I decided to make a whole picture of the four seasons as a playboard and let the audience control the motion of its elements, which would change the pitches of the background music at the same time. We then presented this idea in class. However, we found problems with it. First, from a technical perspective, the idea is too broad, and it would demand multiple Arduino boards to achieve the interactivity we expected. Second, conceptually, there is a fallacy in the design: as human beings, we should live with nature in harmony rather than “control” the seasons or nature itself. But our project was designed to let the user actually control nature, which is not quite proper.

To narrow down the scope of our project, Ellen found the final project of Candy Bi from last semester inspiring. Since we were going to interact with nature, why not make a project that communicates with a tree? We understand that in communication, echoing emotions from one side to the other is essential. We also understand that music, as defined by the dictionary, is something that can trigger pleasant emotions. Since we needed an instrument to create music, I came up with the idea: it would be cool if we could build a creative, interactive instrument that creates music.

Since music is created by a human triggering the instrument, the music carries human emotion. This emotion could be conveyed to the tree, which may trigger the tree’s emotion and, eventually, a change in the color of the tree’s leaves. The human playing with the tree would then get a sense of how his or her emotion can actually echo with the emotion of the tree, with the music as the medium of that interaction.

Then, how should we shape this instrument? Since this is a device that interacts with a tree, we decided to call it the “treenstrument”. I intuitively thought of its literal Chinese translation: 树琴 (shùqín). Surprisingly, 竖琴 (shùqín, harp) has exactly the same pronunciation as the treenstrument! So the shape of a harp would be the best choice for it. Moreover, 琴 is pronounced qín in Chinese, which is similar to the pronunciation of 情 (qíng), meaning emotion! The Chinese language exactly explains the interaction between emotion and music. Finally, we named our project “Treenstrument & Treemotion”, with the poetic Chinese name 《树琴与树情》 (“The Tree Harp and the Tree’s Emotion”).

Description of our project:

When the user plucks a string of the treenstrument, it makes a sound, which triggers the tree to grow leaves. Different strings produce different sounds and different leaf colors. Music in a higher pitch makes the tree grow green or yellow leaves, while plucking the lower strings makes the tree grow purple or red leaves. Through the musical pitches and the colors of the leaves, the user’s emotion is displayed on the screen. By playing the treenstrument, users may create their own tree out of their emotions.
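The pitch-to-color idea above can be sketched in plain Java. The string numbering and the exact color families below are illustrative assumptions, not the final values used in our Processing code:

```java
// Illustrative mapping from string number to leaf color family, assuming
// strings are numbered 1 (lowest pitch) to 4 (highest pitch).
public class LeafColors {
    static String leafColorFamily(int stringNumber) {
        switch (stringNumber) {
            case 1: return "red";     // lower strings grow red...
            case 2: return "purple";  // ...or purple leaves
            case 3: return "yellow";  // higher strings grow yellow...
            case 4: return "green";   // ...or green leaves
            default: return "none";   // no string plucked
        }
    }

    public static void main(String[] args) {
        for (int s = 1; s <= 4; s++) {
            System.out.println("String " + s + " -> " + leafColorFamily(s));
        }
    }
}
```

In the real sketch, each family corresponds to a range of `fill()` colors in Processing rather than a single named color.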

I was in charge of the treenstrument part, in which I chose and connected the sensors, built the circuit using Arduino, made the tree-like shape, and arranged and created the music that sounds in Processing. Ellen was in charge of the treemotion part, in which she used Processing to create the shape of the tree and determine how the tree grows in response to the interactive device.

In this documentation, I will elaborate on the treenstrument (interaction) part, and Ellen describes the treemotion (visual) part in her documentation, which can be reached from this link.

Interaction Part

I met challenges in making this interaction. Since I considered the feasibility of the project first, building the circuit was the first step. How to choose the sensor was the first challenge I met. We came up with several plans: a vibration sensor, conductive fiber, and, as the last resort, a 3-axis analog accelerometer. However, when actually building the circuit, only the 3-axis accelerometer met our expectations, even though it is hard to keep the accelerometers still in position.

The vibration sensor only works when it lies on a flat surface with one hand touching it, but we could not guarantee that every user would pluck the strings the same way, so the sensor could not reliably distinguish the vibrations. For the conductive fiber, it is plausible to sense the change in its resistance to get a change in data; however, it cannot capture the lingering sound after a string is plucked: the data changes only once, and while the strings are still vibrating, the sensor does not feel it. With the 3-axis analog accelerometer, I can sense the displacement vector of a string after plucking it, which made it sensible to build the circuit this way. Here’s how it looked originally:

(Special thanks to Nick, who went to the equipment room to give me a list of possible sensors that I could use for this project!)

Initially, I wanted to use the sum “x + y + z” to express the total change in position, but it turned out that there are only a limited number of analog input pins. So I decided to plug each sensor’s wire into one pin, representing the change of position along a single axis. I decided which pin to use by testing each string to see along which axis it moves the most.
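The detection idea can be sketched in plain Java: each string’s chosen accelerometer axis sits in a narrow “rest range” when the string is still, and a reading outside that range counts as a pluck. The ranges below are hypothetical examples; the real ones were measured by testing each string:

```java
// A minimal sketch of the pluck-detection idea. REST holds a hypothetical
// [min, max] rest range for each string's accelerometer axis.
public class PluckDetector {
    static final int[][] REST = {
        {270, 277},  // string 1
        {280, 290},  // string 2
        {200, 207},  // string 3
        {271, 279},  // string 4
    };

    // Returns the 1-based number of the first string found outside its
    // rest range, or 0 when every string is at rest.
    static int pluckedString(int[] readings) {
        for (int i = 0; i < readings.length; i++) {
            if (readings[i] < REST[i][0] || readings[i] > REST[i][1]) {
                return i + 1;
            }
        }
        return 0;
    }

    public static void main(String[] args) {
        System.out.println(pluckedString(new int[]{272, 285, 203, 275}));  // all at rest
        System.out.println(pluckedString(new int[]{272, 310, 203, 275}));  // string 2 moved
    }
}
```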

Then it was time to build the trunk-like look of the treenstrument. I found two relatively light boxes to be the trunks and one heavy box to be the base on which the two trunks stand. This is the first look after I glued them together:

I drilled some holes in the left side of the trunk to let the wires and strings in. Here’s the original version of this treenstrument:

Here came another challenge: how to make sure that the vibration of one string would not move the strings next to it. I assembled and disassembled the device over and over again, and revised my plan over and over again. Here’s my solution to this problem:

Then I started to add aesthetic value to the physical project. I bought fake leaves at a surprisingly low price on Taobao. The leaves would be used not only for decoration but also to cover the messy wires. I also downloaded a picture of tree bark from a Creative Commons database and used software to change its color to make it look real: I would print it and wrap it around the trunk.

Here’s the look of the development of the treenstrument:


I used Logic Pro to make the sounds that play after the strings are plucked. The sounds were sampled from a harp. You can scroll down to watch the video to get a sense of what the music is like!

Visual Part

Ellen was in charge of this part. She used Processing to make all the beautiful effects happen! While coding, I felt like she was a genius; I could see her keep solving problems one after another. Without her, the interaction between the treenstrument and the tree wouldn’t have happened. Here’s how she worked on it: see Ellen’s Documentation.


We initially wanted to use only the strings to control the audio and visual parts directly, but we met a coding problem that prevented us from achieving this. We consulted Louis, who said there might be no choice but to add another sensor to actually control the music and visuals together. Since the pedal is essential when playing a harp, we came up with the idea of using a push button on the ground as the “pedal”. This pedal takes its shape from a red leaf that I found in the cardboard room.
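Under the hood, the Arduino sends the four string readings plus the pedal reading to Processing as one comma-separated line per frame. Parsing one such line looks roughly like this plain-Java sketch (the value count of five and the pedal reading of 1023 when pressed match our Processing code; the helper names are my own):

```java
// Sketch of how one serial line from the Arduino is parsed: four string
// readings followed by the pedal reading, separated by commas.
public class SerialParser {
    static final int NUM_OF_VALUES = 5;  // 4 strings + 1 pedal

    // Returns the parsed values, or null when the line is malformed.
    static int[] parseLine(String line) {
        String[] parts = line.trim().split(",");
        if (parts.length != NUM_OF_VALUES) return null;
        int[] values = new int[NUM_OF_VALUES];
        for (int i = 0; i < NUM_OF_VALUES; i++) {
            values[i] = Integer.parseInt(parts[i].trim());
        }
        return values;
    }

    public static void main(String[] args) {
        int[] v = parseLine("272,285,203,275,1023\n");
        System.out.println("pedal pressed: " + (v[4] == 1023));
    }
}
```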

Here’s how the final product (with a pedal) looks like:


Ellen and I put as much effort into this project as we could. I still remember the talks we had: in the dorm, in the academic building, during class. We would debate a tiny detail for a long time just for the sake of a perfect display of the project. Because of our conflicting schedules, we could only discuss ideas in the evening, and during finals week we both limited our sleep to work on this project together. Although the final result is not the same as our initial design, because we revised it numerous times, it is still interactive, poetic, and artistic, and it represents the topic we originally wanted to present: humans can interact with nature by echoing emotions, and music is the medium that makes the echoing happen.

Project Video:



//-------ARDUINO PART-Jiannan-------------------//
void setup() {
  Serial.begin(9600);
}

void loop() {
  int sensor1 = analogRead(A1);
  int sensor2 = analogRead(A2);
  int sensor3 = analogRead(A3);
  int sensor4 = analogRead(A4);
  int button = analogRead(A0);  // the "pedal" push button

  // send the five values to Processing as one comma-separated line
  Serial.print(sensor1); Serial.print(",");
  Serial.print(sensor2); Serial.print(",");
  Serial.print(sensor3); Serial.print(",");
  Serial.print(sensor4); Serial.print(",");
  Serial.println(button);
  delay(20);
}



//---------------PROCESSING PART-Ellen---------//
import processing.sound.*;
import processing.serial.*;

String myString = null;
Serial myPort;

int NUM_OF_VALUES = 5;   
int[] sensorValues;  

float n = 0; // noise input
PGraphics bg; // background
PGraphics tree;
ArrayList <PVector> leafs;
float minHue, maxHue;
int leafCounter = 0;
LEAF[] lf;
long framestuff = 300;
int valuefromArduino = 0;
PImage bgTree, bgTreeAlpha;
int sen1, sen2, sen3, sen4, leafgrow;

SoundFile sound1;
SoundFile sound2;
SoundFile sound3;
SoundFile sound4;


void setup() {
  size(900, 700); 
  fill(0, 4);
  sound1 = new SoundFile(this, "Sensor1.wav");
  sound2 = new SoundFile(this, "Sensor2.wav");
  sound3 = new SoundFile(this, "Sensor3.wav");
  sound4 = new SoundFile(this, "Sensor4.wav");

  tree = createGraphics(width, height); 
  leafs = new ArrayList<PVector>(); 

  createBackground();
  createTree();
  image(bg, 0, 0);   // display background  
  image(tree, 0, 0); // display tree

  lf = new LEAF[leafs.size()];
  bgTree = loadImage("tree.jpg");
  setupSerial();
}

void draw() {
  updateSerial();

  sen1 = sensorValues[0];
  sen2 = sensorValues[1];
  sen3 = sensorValues[2];
  sen4 = sensorValues[3];
  leafgrow = sensorValues[4];  // the pedal button

  if (leafgrow == 1023) {
    // each string rests within a narrow range of readings;
    // a value outside that range means the string was plucked
    if (sen4 < 271 || sen4 > 279) {
      valuefromArduino = 1;
      if (!sound4.isPlaying()) sound4.play();
    } else if (sen3 < 200 || sen3 > 207) {
      valuefromArduino = 2;
      if (!sound3.isPlaying()) sound3.play();
    } else if (sen2 < 280 || sen2 > 290) {
      valuefromArduino = 3;
      if (!sound2.isPlaying()) sound2.play();
    } else if (sen1 < 270 || sen1 > 277) {
      valuefromArduino = 4;
      if (!sound1.isPlaying()) sound1.play();
    }

    // pick the leaf color according to the last plucked string
    if (valuefromArduino == 4) {
      fill(random(50, 255), 0, random(0, 200), random(40, 100));
    } else if (valuefromArduino == 3) {
      fill(random(50, 255), random(150, 200), 0, random(40, 100));
    } else if (valuefromArduino == 2) {
      fill(0, random(100, 200), random(200, 255), random(40, 100));
    } else if (valuefromArduino == 1) {
      fill(random(50, 140), random(50, 80), random(100, 255), random(40, 100));
    }

    for (int i = 0; i < lf.length; i++) {
      lf[i] = new LEAF(leafs.get(i).x, leafs.get(i).y, i, sensorValues[4]);
      lf[i].display();
    }
  } else if (leafgrow < 1023) {
    image(bgTree, 0, 0);
  }
}

void createTree() {
  tree.beginDraw();
  tree.background(0, 0);  // clear PGraphics (the tree drawn last time)
  tree.noStroke();
  for (int i = 0; i < 3; i++) {
    tree.fill(map(i, 0, 2, 60, 20));
    branch(width/2, height, 70, -HALF_PI, 150, 0); // draw the branches
  }
  tree.endDraw();
}

void branch(float x, float y, float bSize, float theta, float bLength, float pos) {
  // (start pos X, start pos Y, branch size, angle, branch length, position along branch)
  n += 0.01;  // increment noise input
  float diam = lerp(bSize, 0.7*bSize, pos/bLength);  // gradually reduce the diameter
  diam *= map(noise(n), 0, 1, 0.4, 1.6);  // multiply by noise to add variation

  tree.ellipse(x, y, diam, diam);
  if (bSize > 0.6) {
    if (pos < bLength) {
      x += cos(theta + random(-PI/10, PI/10));
      y += sin(theta + random(-PI/10, PI/10));
      branch(x, y, bSize, theta, bLength, pos+1);
    } else {
      leafs.add(new PVector(x, y));  // add a leaf at the intersection
      boolean drawleftBranch = random(1) > 0.1;
      boolean drawrightBranch = random(1) > 0.1;
      if (drawleftBranch) branch(x, y, random(0.5, 0.7)*bSize, theta - random(PI/15, PI/5), random(0.6, 0.8)*bLength, 0);
      if (drawrightBranch) branch(x, y, random(0.5, 0.7)*bSize, theta + random(PI/15, PI/5), random(0.6, 0.8)*bLength, 0);

      if (!drawleftBranch && !drawrightBranch) {  // if neither branch is drawn, draw a tip
        tree.pushMatrix();
        tree.translate(x, y);
        tree.quad(0, -diam/2, 2*diam, -diam/6, 2*diam, diam/6, 0, diam/2);
        tree.popMatrix();
      }
    }
  }
}

void createBackground() {
  bg = createGraphics(width, height);
  bg.beginDraw();
  bg.noStroke();
  for (float diam = 1.5*width; diam > 0.5*width; diam -= 20) {
    bg.fill(map(diam, 0.5*width, 1.5*width, 255, 210));
    bg.ellipse(width/2, height/2, diam, diam);
  }
  bg.endDraw();
}

void setupSerial() {
  myPort = new Serial(this, Serial.list()[ 3 ], 9600);

  // Throw out the first reading,
  // in case we started reading in the middle of a string from the sender.
  myString = myPort.readStringUntil( 10 );  // 10 = '\n', linefeed in ASCII
  myString = null;

  sensorValues = new int[NUM_OF_VALUES];
}

void updateSerial() {
  while (myPort.available() > 0) {
    myString = myPort.readStringUntil( 10 ); // 10 = '\n', linefeed in ASCII
    if (myString != null) {
      String[] serialInArray = split(trim(myString), ",");
      if (serialInArray.length == NUM_OF_VALUES) {
        for (int i=0; i<serialInArray.length; i++) {
          sensorValues[i] = int(serialInArray[i]);
        }
      }
    }
  }
}

class LEAF {
  float x, y;
  float size;
  color c;
  float jitterX = random(-40, 40);
  float jitterY = random(-40, 40);
  float alpha = 255; //random(10, 40);
  float h;
  float minHue, maxHue;
  float leafgrow;

  LEAF(float posx, float posy, float col, int button) {
    x = posx;
    y = posy;
    size = random(0, 20);
    leafgrow = button;
    float rdn0 = random(255);
    float rdn1 = random(255);
    minHue = min(rdn0, rdn1);
    maxHue = max(rdn0, rdn1);
    //c = color(0, random(50, 255), random(0, 200));
    h = map(col, 0, leafs.size(), minHue, maxHue);
  }

  void display() {
    //fill(h, 255, 255, alpha);
    if (leafgrow == 1023) {
      ellipse(x + jitterX, y + jitterY, size, size);
    }
  }
}