IxLab: Final Project!

Partner: Jessica Chon

Project Title: Divine Intervention

Inspiration & Ideation

This project, in its earliest form, was a continuation of our midterm, the Kill Me Not Flower. The user feedback we received during the midterm motivated us to continue developing that project. One of the main pieces of feedback we focused on was that many users tended to engage with only one aspect of our project (either the light or the water) because the components were spread out too much and not well integrated. Another lesson we kept in mind from the midterm was how we ran into problems the night before the in-class presentation because of my servo motor.


Concept Development

Moving forward from the midterm, Jessica and I talked a lot about possible ideas for the final. We both agreed that we wanted to incorporate some new elements of Processing we had learned, namely sound, to add more interactivity to our project. At first, we toyed with the idea of having a physical butterfly that contained a microphone, so that if the user blew on it, the butterfly would actually fly away. We realized from talking to one of the fellows that butterflies are actually very pretty, and if the user saw one, they probably wouldn't think to blow it away from the flower. So, we decided to make it a fly instead. Jessica did some research and found out that the whitefly is a common pest of indoor plants. Further considering the potential applications of the microphone as an input, Jessica came up with the idea to have flies appear on the sky/window display so that they gradually grew in number until the user blew them away.

As for my own development of the sky/window display, I was focused on two specific pieces of feedback from the midterm. One was for there to be some kind of base state that would always be there. The second was for the sky display to change not just from the user shining a flashlight on the flower, but also from the user covering it up. To be clear, this also occurred in the midterm, but it was done with lerping and simply changed the brightness of the screen.


Coding & Building


When I first started working on the Processing sketch for the sky display, I thought I should make the clouds more realistic looking, so I started swapping out different cloud PNGs and seeing how they looked. However, during the concept presentation in class, people said they had liked the animated, cartoon-ish look of the original sketch. Because of this, I went back to the original cloud image that I used for the midterm. As for the sun, I decided to change to a different image of a sun that I found online. In the original midterm code, the sun would spin as a form of feedback for the user so they knew that shining the light on the plant was having some effect. However, some people gave feedback that this spinning didn't really make sense in terms of the effect that light would have on a plant. As such, I decided to eliminate the spinning animation.

One piece of feedback I got from Rudi during a recitation was that the various components of the project were still too disjointed. He pointed out to me that when a plant gets too much light, it's a bad thing, and that I should reflect that somehow on the screen. He also said that it would be better if, when the plant was watered, something happened on the sky screen as well. As a result, I ended up moving away from the clouds I had previously been focusing on and found images on Google of a bright sunny sky, a nighttime sky, and a desert. The idea was to set light thresholds so that when there was around the normal amount of light usually in the room, the sky would be bright and sunny. If the user shines too much light on the plant's built-in LDR sensor, the sky turns to a desert. To integrate the water, I decided that once the plant was watered, it would cause rain on the screen. I got help from this YouTube video with the rain animation code: https://www.youtube.com/watch?v=Yg3HWVqskTQ&t=693s. I looked at several videos and code websites, but ultimately this video was the most helpful to me.
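The threshold logic can be sketched in plain Java. The 70/200 cutoffs match the ones in my final sketch (tuned for the light in our room); the class and method names here are made up for illustration:

```java
public class SkyState {
    // Map the LDR reading (0-255 after the Arduino divides by 4) to a background.
    static String skyFor(int light) {
        if (light > 200) {
            return "desert";  // too much light is bad for the plant
        } else if (light > 70) {
            return "sunny";   // normal room light: the always-there base state
        } else {
            return "night";   // flashlight blocked / flower covered up
        }
    }

    public static void main(String[] args) {
        System.out.println(skyFor(150)); // sunny
        System.out.println(skyFor(230)); // desert
        System.out.println(skyFor(40));  // night
    }
}
```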

Once I integrated Jessica's code for the flies into the sky, I needed to figure out a way to add a delay so that time could pass before the flies would start to reappear. Our worry was that if the flies immediately reappeared, the user would just keep blowing them away and never explore other parts of the interaction. My idea was to do this with a boolean, and to start counting frames from the moment the microphone input volume exceeded the threshold we set. Luis worked with me to figure out how to set the conditions so this would result in the delay I hoped for.
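That boolean-plus-frame-count mechanic can be sketched in plain Java; the 200-frame delay and the volume threshold of 30 match my final sketch, while the class and method names are made up for illustration:

```java
public class FlyDelay {
    static final int DELAY_FRAMES = 200;  // frames to wait before flies may return
    static final float BLOW_THRESHOLD = 30;

    boolean flygone = false;  // true while we are inside the post-blow delay
    int gonetime = 0;         // earliest frame at which flies may reappear

    // Called once per frame; returns true when a new fly may be added.
    boolean update(int frameCount, float volume) {
        if (volume >= BLOW_THRESHOLD && !flygone) {
            flygone = true;                        // the user just blew the flies away
            gonetime = frameCount + DELAY_FRAMES;  // start counting from this frame
        }
        if (flygone && frameCount >= gonetime) {
            flygone = false;  // delay has elapsed
            return true;      // ok to start re-adding flies
        }
        return false;
    }

    public static void main(String[] args) {
        FlyDelay fd = new FlyDelay();
        System.out.println(fd.update(10, 50)); // blow at frame 10 -> false (delay starts)
        System.out.println(fd.update(100, 0)); // still inside the delay -> false
        System.out.println(fd.update(210, 0)); // delay over -> true
    }
}
```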


Coding for the LDR sensor was easy because I already had experience working with an LDR for my midterm project. As for the water sensor, Jessica helped me with the code, since she had worked with a water sensor for the midterm. I had to adjust the code a little in terms of variable names and sending the water value to my Processing sketch over serial communication.

At first, I thought I was going to work with an ultrasonic range sensor for my project, so I spent some time figuring out how the coding for it worked. I was pretty confused at first, because I based my code on the example from the Arduino website and didn't really understand how each line was functioning. The purpose of the distance sensor was to know whether or not a user was standing in front of the project, so that light changes alone wouldn't set off the animation I had originally planned to create. However, during class, when I asked Antonius a question about my coding, he pointed out to me that I was overcomplicating the process quite a bit. In fact, as it turned out, I could accomplish what I wanted without using the distance sensor at all! This made my life a lot easier.

Serial Communication

The serial communication was pretty straightforward for this project. I needed some help getting the Arduino code working to send multiple values to my Processing sketch. Nicholas showed me that I was writing println() where I should have just been writing print(), and this affected how Processing was receiving the values. Once I fixed this, my serial communication worked fine.
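To see why this matters: Processing reads one linefeed-terminated string at a time and splits it on commas, so the Arduino side should print() the two values and the comma, and println() only once at the end of each pair. The parsing side can be sketched in plain Java (the helper name and the sample values are made up):

```java
public class SerialParse {
    // Parse one line like "123,45" into sensor values, the way
    // split(trim(myString), ",") does in my Processing sketch.
    static int[] parseLine(String line, int numValues) {
        String[] parts = line.trim().split(",");
        if (parts.length != numValues) {
            return null;  // incomplete line: skip it and keep the old values
        }
        int[] values = new int[numValues];
        for (int i = 0; i < numValues; i++) {
            values[i] = Integer.parseInt(parts[i].trim());
        }
        return values;
    }

    public static void main(String[] args) {
        int[] v = parseLine("123,45\n", 2);
        System.out.println(v[0] + " " + v[1]);     // 123 45
        System.out.println(parseLine("123", 2));   // null (only one value on the line)
    }
}
```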

Building & Setup

For the flower petals, we kept the flower from our midterm project. I bought the flower pot off of Taobao, and we cut a hole in the bottom of it to run our USB cables through to our computer. The stem of the flower is new for this iteration and is a bubble tea straw covered in polymer clay. The dirt for the flower is made of brown polymer clay. Two boxes were used to create the height difference we needed so that both screens could be visible.


User Testing

We conducted the User Testing on our own for this project, as was required. The first test was a bit of a fail: when I detached and reattached my computer's display, the Arduino stopped communicating with my computer and none of the sky animations were happening. Luckily, our user tester was very friendly and patient. He was confused about which part of the project he should interact with first, as were our second and third user testers. This made us realize that we needed to write instructions for our users. Due to time constraints, we didn't do this before the final presentation. As a result, many of our peers wrote on the comments sheet that they thought our project definitely needed instructions.

In addition to this, one of our user testers remarked that even though he knew to blow on the microphone, it probably wouldn’t be intuitive to other users to blow toward the microphone instead of directly at the screen where the flies were appearing. For the IMA show, we made a small sign pointing toward where the user should blow. In the future, it would be great to have this sensor on the flower so that the user’s most instinctual behavior resulted in the flies going away.

The last piece of feedback I want to note is that users did not recognize our moisture sensors as roots, and they remarked that they wished the water could go directly into the "dirt". If we were working with a plant that had much more time between uses, I think we could have tried real dirt, or a more realistic simulation of dirt, that the user could pour water onto. In that kind of situation, the dirt (real or fake) would have ample time to dry, letting the sensor reading return to zero. However, as this was a version that needed to be tested over and over again (and used over and over within a short time span at the IMA show), it didn't make sense to put the moisture sensors somewhere we couldn't easily access them and dry them off. For the record, Jessica used her knowledge of dyes to try to dye our sponges brown so that, wrapped around the moisture sensors, they would look more like roots. We ended up with an orange color that we simply did not have time to fix because of other things that needed tending to.


IMA Show

Setting up for the IMA show 🙂

Even though it came after the presentation of our project, the IMA show was where we got the most user feedback. It was nice to see the looks of amazement on some little kids' faces when they blew the flies away or made the sky turn to nighttime. One challenge we did not anticipate was the sizable number of people coming to the show who only spoke Chinese. I was able to communicate with them, and through trial and error I figured out how best to articulate the purpose of our project in Chinese. This experience really tested my ability to explain clearly and concisely what our project was about: not only was I talking to total strangers, but I was forced to communicate in a simpler way, because I do not know any Chinese vocabulary about circuits or interactive technology. Also, blowing on the flies got a little tricky during the show. Because the input value comes from the microphone in my computer and the room got quite noisy, I think that's why it would sometimes randomly start raining. This could have been fixed by taking some time to adjust the volume threshold that makes the flies disappear. However, I was reluctant to make changes to my code in the middle of the show, since it was working for the most part.


Final Thoughts

After having the chance to present at the IMA show, I am still left with many ways I would like to improve our project if we were to make more iterations. I would definitely try to integrate the microphone into the flower so that the user could blow toward the flower to get the flies to go away. An interesting thing I observed during the IMA show was that even though we had instructions, not many users read them. So, I would figure out a more obvious place to put the instructions, or I would build them into the Processing sketch so that they were embedded within what the user was interacting with. Also, I would like to create a Chinese version of the instructions so that even more people could enjoy the interaction.

Overall, the looks of surprise and happiness on the users' faces at the IMA show made all the efforts from the semester feel even more worth it, and I am left with an extra component to my personal definition of interaction. Now, more than before, it is clear to me how important the user's feeling is in the moment when they first interact with something. Of course this varies depending on the interaction. With something like a computer keyboard, we (or at least I) don't have any consistent conscious feelings about the feedback from the computer: when I press "A" and the "A" shows up on the screen, I am not amazed, excited, and awestruck. If that didn't happen, though, I would get frustrated and angry. However, when moving beyond keyboard and mouse for interacting with digital media, I can tell it is more likely the user will be surprised to discover some new behavior they can use to get a response from the media, even if it is a simple response, such as the screen going dark or some flies disappearing. In the end, I am excited to see what kinds of human behaviors could eventually become as second-nature for interacting with media as a button press is now. Thanks very much to Antonius and all of the fellows (especially Nicholas and Luis, whom I bothered most frequently) for guiding me through the learning process this semester and helping me gain this kind of appreciation for interactive media.



int sensor1 = 0; // light sensor (LDR) connected to A0
int sensor2 = 1; // moisture sensor connected to A1
int val1;
int val2;
int moisturePower = 7; // digital pin that powers the moisture sensor

void setup() {
  // put your setup code here, to run once:
  Serial.begin(9600);
  pinMode(moisturePower, OUTPUT); // OUTPUT, since we write HIGH/LOW to it
  digitalWrite(moisturePower, LOW);
}

void loop() {
  // put your main code here, to run repeatedly:
  val1 = analogRead(sensor1) / 4; // scale 0-1023 down to 0-255
  val2 = analogRead(sensor2) / 4;
  // send both values as one comma-separated line for Processing
  Serial.print(val1);
  Serial.print(",");
  Serial.println(val2);
  delay(100);
}

// Alternative: power the moisture sensor only while reading it
//int readMoisture() {
//  digitalWrite(moisturePower, HIGH);
//  delay(10);
//  val2 = analogRead(sensor2);
//  digitalWrite(moisturePower, LOW);
//  return val2;
//}

import processing.serial.*;
import processing.sound.*;

ArrayList<Fly> manyFlies;

AudioIn input;
Amplitude amp;

PImage photo, sunny, night, desert;

Serial myPort;
String myString = null;
PImage Sunimg;
PImage Cloudimg;

int tooBright, tooDark;

Drops d[];

boolean flygone = false;
int gonetime;

int SUNSIZE = 679;
boolean growing = false;
int growtime;

boolean using = false;
int usetime;

int NUM_OF_VALUES = 2;
int[] sensorValues;

boolean cloudparting = false;
int parttime;
int timer = 0;

float c;
float g;
void setup() {
  size(displayWidth, displayHeight);
  setupSerial(); // allocates sensorValues before we use it

  d = new Drops[6000];
  for (int k = 0; k < 6000; k++) {
    d[k] = new Drops();
  }
  sensorValues[1] = 1;
  c = 0;

  // loading fly image
  photo = loadImage("fly.png");
  photo.resize(0, 100);
  sunny = loadImage("sunny.jpg");
  night = loadImage("night.jpg");
  desert = loadImage("desert.jpg");

  manyFlies = new ArrayList<Fly>();
  for (int i = 0; i < 6; i++) {
    manyFlies.add(new Fly());
  }

  // audio setup
  input = new AudioIn(this, 0);
  input.start();
  amp = new Amplitude(this);
  amp.input(input);
}
void draw() {
  updateSerial();

  // pick the sky background from the light reading
  // (images smaller than 3000 x 2000)
  if (sensorValues[0] <= 200 && sensorValues[0] > 70) {
    image(sunny, 0, 0);
  } else if (sensorValues[0] > 200) {
    image(desert, 0, 0);
  } else if (sensorValues[0] <= 70 && sensorValues[0] > 0) {
    image(night, 0, 0);
  }

  // rain while the moisture sensor reads wet
  if (sensorValues[1] >= 180) {
    fill(220, 220);
    rect(0, 0, displayWidth, displayHeight);
    for (int i = 0; i < 6000; i++) {
      d[i].update();
      d[i].display();
      if (d[i].ydrop > height) {
        d[i] = new Drops(); // recycle drops that fall off the screen
      }
    }
  }

  windowPane();

  float volume = amp.analyze() * 100;

  // loops through every fly of manyFlies
  for (int i = 0; i < manyFlies.size(); i++) {
    // gets one of the many flies and draws and moves it,
    // which is called from the other tab "Fly"
    Fly oneOfManyFlies = manyFlies.get(i);
    oneOfManyFlies.showFly();
    oneOfManyFlies.moveFly();
  }

  // once the delay after blowing has passed, depending on random,
  // let flies gradually return
  if (gonetime < frameCount && timer == 1) {
    flygone = false;
    if (random(1) < 0.2) {
      manyFlies.add(new Fly());
    }
  }

  // if you blow, flies will go away
  if (volume >= 30) {
    if (!flygone) {
      flygone = true;
      gonetime = frameCount + 200; // start counting the delay from now
      timer = 1;
    }
    // loop backwards so removing doesn't skip elements
    for (int i = manyFlies.size()-1; i >= 0; i--) {
      manyFlies.remove(i);
    }
  }
}

void setupSerial() {
  myPort = new Serial(this, Serial.list()[0], 9600);

  // Throw out the first reading,
  // in case we started reading in the middle of a string from the sender.
  myString = myPort.readStringUntil( 10 );  // 10 = '\n', linefeed in ASCII
  myString = null;

  sensorValues = new int[NUM_OF_VALUES];
}

void updateSerial() {
  while (myPort.available() > 0) {
    myString = myPort.readStringUntil( 10 ); // 10 = '\n', linefeed in ASCII
    if (myString != null) {
      String[] serialInArray = split(trim(myString), ",");
      if (serialInArray.length == NUM_OF_VALUES) {
        for (int i=0; i<serialInArray.length; i++) {
          sensorValues[i] = int(serialInArray[i]);
        }
      }
    }
  }
}

class Drops {

  float xdrop, ydrop, speed;
  color q;

  Drops() {
    xdrop = random(width);
    ydrop = random(-1000, 0); // start above the top of the screen
    speed = random(5, 10);
    q = color(255, 255, 255);
  }

  void update() {
    ydrop += speed;
  }

  void display() {
    fill(q);
    rect(xdrop, ydrop, 2, 15);
  }
}

//window pane
void windowPane() {
  rect((width/2)-30, 0, 60, displayHeight); //vertical pane
  rect(0, (height/2)-30, displayWidth, 60); //horizontal pane
  rect(0, 0, 90, displayHeight); //left-most pane
  rect(displayWidth-90, 0, 90, displayHeight); //right-most pane
  rect(0, 0, displayWidth, 70); //top pane
  rect(0, displayHeight-70, displayWidth, 70); //bottom pane
}


class Fly {
  //initial variables for fly
  int x;
  int y;
  float r;

  //constructor makes my fly
  Fly() {
    x = width/2;
    y = height/2;
    //used to rotate the fly to a random angle
    r = random(0, 2*PI);
  }

  //showing fly
  void showFly() {
    pushMatrix(); // isolate the transform so flies don't stack translations
    translate(x, y);
    rotate(r);
    image(photo, 0, 0);
    popMatrix();
  }

  void moveFly() {
    //jitter the fly around randomly
    x = x + floor(random(-40, 40));
    y = y + floor(random(-40, 40));
  }
}