Color Landscape with Perlin Noise

I used ASCII-based serial communication from Arduino to Processing to change the colors and speed of a Perlin noise animation.

Circuit:

IMG_3540

  • red cable connects power to 5V
  • green cables connect tilt sensors to power (green switches are side-to-side, white switches are up-and-down)
  • black cable connects ground to ground
  • 10K resistors connect the switches to ground
  • blue cables connect breadboard pins to Arduino pins:
    • breadboard pin 8 to digital pin 8 = Left (1,0)
    • breadboard pin 18 to digital pin 7 = Right (0,1)
    • breadboard pin 23 to digital pin 13 = Up (0, 1)
    • breadboard pin 27 to digital pin 12 = Down (1,0)

*testing the side-to-side values by commenting out the up and down values

3D Perlin noise:

Code Source: Processing 3D Noise example

*below, the increment is changed to -.01 and a mousePressed() function is added to change from greyscale to color

Link: https://processing.org/examples/noise3d.html

int x = 0;
int y = 0;
float increment = -.01;
//noise function argument #3 (a global variable that increments once per cycle)
float zoff = 0.0; //increment zoff differently from xoff and yoff
float zincrement = 0.02;

void setup() {
  size(640, 640);
  background(0);
  frameRate(30);
}

void draw() {
  //adjust noise detail
  noiseDetail(8, 0.65f);
  loadPixels();
  float xoff = 0.0; //start xoff at 0
  //for every x, y coordinate in a 2D space, calculate a noise value
  //and display a brightness value
  for (int x = 0; x < width; x++) {
    xoff += increment; //increment xoff
    float yoff = 0.0;  //for every xoff, start yoff at 0
    for (int y = 0; y < height; y++) {
      yoff += increment; // increment yoff
      // calculate noise and scale by 255
      float bright = noise(xoff, yoff, zoff)*255;
      // set each pixel onscreen to a grayscale value
      pixels[x+y*width] = color(bright, bright, bright);
    }
  }
  updatePixels();
  zoff += zincrement; // increment zoff
}

void mousePressed() {
  // set the pixel at the global (x, y), here (0, 0), to white and use it as the fill color
  fill(pixels[x+y*width] = color(255));
}

A few times people have described tinnitus to me as the sound that noise looks like on a TV. I found code for 3D Perlin noise that generates an index of pixels on the screen that look like noise but can be manipulated to simulate textural gradients. Below is a diagram of 2D noise, showing the pixel coordinates in the algorithm moving in the direction of the arrows. The arrows represent the xoff and yoff variables in the code, which control the direction in which the pixels appear to move randomly during the for() loop. The math explains the calculation for this randomness: the gradient is made up of vectors, so the code uses vector coordinates from within each cube (3D pixel) and a point on its edges.[1] The code generates a pseudorandom vector gradient, so the noise pixels seem to move randomly but actually move the same way for the same input values;[2] the small sketch after the references below illustrates this.

Screen Shot 2017-05-20 at 10.48.20 AM

image source: https://gamedev.stackexchange.com/questions/23625/how-do-you-generate-tileable-perlin-noise

[1] http://flafla2.github.io/2014/08/09/perlinnoise.html

[2] http://flafla2.github.io/2014/08/09/perlinnoise.html
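
A minimal sketch of my own (not part of the example) showing that determinism: with a fixed noiseSeed(), the same coordinates always return the same noise value, so the animation only looks random.

void setup() {
  noiseSeed(42);                        // fix the pseudorandom gradient table
  float a = noise(0.3, 0.7, 1.2);
  float b = noise(0.3, 0.7, 1.2);
  println(a == b);                      // true: same inputs, same noise value
  noiseSeed(42);                        // resetting the seed reproduces the sequence
  println(noise(0.3, 0.7, 1.2) == a);   // true again
}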

The noiseDetail() function changes the appearance by increasing the number of pixels and decreasing their size, so that the grid of pixels is denser. Because the increment variables, which control the rate at which the pixels move, dropped the frameRate from 30 to somewhere between 5 and 10, I tried to change them as little as possible. Instead, I focused on color and sound, because I couldn't marble the texture without decreasing the frameRate. I had planned to make a pixel that marbles the noise texture in the direction of the maze's motion, signaled by the tilt switches on the bottom of the maze, but the cables kept coming out. So I decided to keep the cloudy texture and make a changing noise landscape.

*above: this is the only video with sound. When I wanted to represent the player's marble as a pixel, I used line(pmouseX, pmouseY, mouseX, mouseY) to draw a line tracing the movement, which changed the sound from the PinkNoise library example, and the pixel index statement color(1*mouseX/1.5, bright, (mouseY/1.5)*255) so that the colors change with the sound's amplitude.

*below: I drew a rectangle with the mouse because I didn't like the idea of representing the player. I considered drawing the player's perspective in the maze by connecting lines from the corners of each angle to the corners of the screen, but decided to work on the animation as a separate project; my ideas for the maze and the noise seemed better off separate.

I tested the code with boolean statements, then replaced them with if () {} / else if () {} blocks for the serial communication. With the if/else block for the side-to-side tilt, the left switch changed the screen from colorful noise, indicated by println("flat") in Processing and (0,0) from the Arduino, to saturated hues; separately, the if/else block for the right tilt switched the screen from the same colorful noise – showing that the board was not tilted – to a more striated, and consequently more slowly moving, yellow screen. I multiplied the yoff increment by 5 to stretch the bands of color sideways, across the x-axis (see the sketch after the screenshot below).

Screen Shot 2017-05-20 at 2.10.20 AM

*noise with the xoff increment increased
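
A minimal sketch of my own distilling the two adjustments described above: noiseDetail(octaves, falloff) sets how many layers of noise are summed and how quickly they fade, and incrementing yoff five times faster than xoff stretches the features into bands across the x-axis.

float zoff = 0.0;
float increment = 0.01;

void setup() {
  size(640, 640);
  noiseDetail(8, 0.65f);   // 8 octaves with 0.65 falloff: a denser, grainier texture
}

void draw() {
  loadPixels();
  float xoff = 0.0;
  for (int x = 0; x < width; x++) {
    xoff += increment;                 // slow change along x
    float yoff = 0.0;
    for (int y = 0; y < height; y++) {
      yoff += 5 * increment;           // 5x faster change along y stretches bands across x
      float bright = noise(xoff, yoff, zoff) * 255;
      pixels[x + y*width] = color(bright);
    }
  }
  updatePixels();
  zoff += 0.02;
}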

I worked on making layers with different variables and on using the map() function, but both decreased the frameRate. In the version that I presented, the color mode changes to HSB for the left tilt and to RGB for the right, so that the screen shifts from saturated color while tilted left, to grainy static while flat, to RGB color while tilted right. I also added a Perlin noise wave to cover the screen where the up-and-down tilt meets the side-to-side tilt, so that it appears like a wavy landscape whenever the tilt switch points down. Processing crashed before I saved my most recent edit, which changed several of the vertices so that the wave was wider at the edges of the screen.
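
The mode switch works because colorMode() changes how the same three color() arguments are interpreted, not the noise calculation itself. A minimal illustration of my own:

void setup() {
  size(200, 100);
  noStroke();
  noLoop();
}

void draw() {
  colorMode(HSB);        // arguments read as hue, saturation, brightness
  fill(180, 180, 255);   // a fairly saturated, fully bright blue-violet
  rect(0, 0, 100, 100);

  colorMode(RGB);        // the same numbers now read as red, green, blue
  fill(180, 180, 255);   // a pale lavender instead
  rect(100, 0, 100, 100);
}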

Source for 2D Noise Wave: https://processing.org/examples/noisewave.html

*original code in Processing examples

float yoff = 0.0;       // 2nd dimension of perlin noise

void setup() {
  size(640, 360);
}

void draw() {
  background(51);

  fill(255);   // We are going to draw a polygon out of the wave points
  beginShape();

  float xoff = 0;       // Option #1: 2D Noise
  // float xoff = yoff; // Option #2: 1D Noise

  // Iterate over horizontal pixels
  for (float x = 0; x <= width; x += 10) {
    // Calculate a y value according to noise, map to
    float y = map(noise(xoff, yoff), 0, 1, 200, 300);    // Option #1: 2D Noise
    // float y = map(noise(xoff), 0, 1, 200, 300);       // Option #2: 1D Noise

    // Set the vertex
    vertex(x, y);
    // Increment x dimension for noise
    xoff += 0.05;
  }
  // increment y dimension for noise
  yoff += 0.01;

  vertex(width, height);
  vertex(0, height);
  endShape(CLOSE);
}

To improve this project, I would use different sensors to shift the colors more smoothly. I think analog sensors would work better here than digital ones, so that the colors could change over a range of values up to 255 instead of flipping between two states.
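
A minimal sketch of what that could look like on the Arduino side, assuming an analog sensor (a potentiometer or an accelerometer axis, for example) on pin A0; this is an idea for the improvement, not code I ran:

int sensorPin = A0;   // hypothetical analog sensor replacing one tilt switch

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(sensorPin);          // 0-1023
  int level = map(raw, 0, 1023, 0, 255);    // scale to a 0-255 color range
  Serial.println(level);                    // Processing could read this to shift hues gradually
  delay(10);
}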

ARDUINO tilt sensor:

int buttonPinL = 7;    //button pin left
int buttonPinR = 8;   //button pin right

int buttonPinU = 13;  //button pin up
int buttonPinD = 12;  //button pin down


void setup() {
  // put your setup code here, to run once:
  pinMode(buttonPinL, INPUT);
  pinMode(buttonPinR, INPUT);
  pinMode(buttonPinU, INPUT);
  pinMode(buttonPinD, INPUT);

  Serial.begin(9600); //baud rate

}

void loop() {

  //LEFT RIGHT
  int valueL = digitalRead(buttonPinL);
  Serial.print(valueL);                    //read serial value L (1/0)
  Serial.print(",");                       //separate the values

  //update value
  int valueR = digitalRead(buttonPinR);    //read serial value R (0/1)
  Serial.print(valueR);
  Serial.print(",");                       //separate the values

  //UP DOWN
  int valueU = digitalRead(buttonPinU);
  Serial.print(valueU);                    //read serial value U (0/1)
  Serial.print(",");                       //separate the values

  //update value
  int valueD = digitalRead(buttonPinD);    //read serial value D (1/0)
  Serial.print(valueD);

  Serial.println();
  delay(10); //delay 10 milliseconds

}

PROCESSING:

import processing.serial.*;
String myString;    //0/1 values in Arduino
Serial myPort;


//SERIAL COMMUNICATION FROM ARDUINO
int tiltL;     //declare variables for left and right
int tiltR;
int tiltU;    //declare variables for up and down
int tiltD;

//PERLIN NOISE
int x = 0;
int y = 0;
int Wx = 0;     //noise wave coordinates
int Wy = 0;

float xoff;     //for manipulation later
float yoff;
float increment = .01;
float xincrement = .02;    //variable for x increment
//noise function argument #3 (a global variable that increments once per cycle)
float zoff = 0.0;          //increment zoff differently from xoff and yoff
float zincrement = 0.02;
float bright;
float Wyoff = 0.0;         //y offset for the noise wave

void setup() {
  background(0);
  size(640, 640);
  frameRate(30);     //30 frames per second

  printArray(Serial.list()); //list serial devices
  myPort = new Serial(this, Serial.list()[1], 9600); //new port, common data rate
  //clear out the buffer of the port
  myPort.clear();
}

void draw() {
  while (myPort.available() > 0) { //available function (not a variable)
    //put what is in my port into the string
    myString = myPort.readStringUntil(10); //ASCII 10 = the linefeed sent by Serial.println
    //    println(myString);
    //condition to test whether my string is null
    if (myString != null) {
      //      println(myString);
      //split and trim data from Arduino
      String[] data = split(trim(myString), ","); //data from Arduino, split by commas
      //use a loop to print data
      //      print(data);
      for (int i = 0; i < data.length; i++) { //initial value, length of data array
        print(data[i]);
        print(",");
      }
      tiltL = int(data[0]); //turn the data into integers: L, R, U, D
      tiltR = int(data[1]);
      tiltU = int(data[2]);
      tiltD = int(data[3]);
      println();
    }
  }
//LEFT TILT saturated color
  if (tiltL == 1) { 
   colorMode(HSB);
    noiseDetail(8, 0.65f);
    loadPixels();
    float xoff = 0.0; //start xoff at 0
    for (int x = 0; x < width; x++) {
      xoff += increment;               //increment xoff
      float yoff = 0.0;                 //for every xoff, start yoff at 0
      for (int y = 0; y < height; y++) {
        yoff += increment;               // increment yoff
        float bright = noise(xoff, yoff, zoff)*255;
        // set each pixel's hue and saturation from the noise value, at full brightness (HSB)
        pixels[x+y*width] = color(bright, bright, 255);
      }
    }
    updatePixels();
    zoff += zincrement; // increment zoff
     println("left");
 //FADE SATURATION INTO STATIC
  } else if (tiltL == 0) { 
       colorMode(RGB);
    noiseDetail(8, 0.65f);
    loadPixels();
    float xoff = 0.0; //start xoff at 0
    for (int x = 0; x < width; x++) {
      xoff += increment;               //increment xoff
      float yoff = 0.0;                 //for every xoff, start yoff at 0
      for (int y = 0; y < height; y++) {
        yoff += increment;               // increment yoff
        float bright = noise(xoff, yoff, zoff)*255;
        float hue = noise(xoff, yoff, bright)*255;
        // set each pixel's red and green from the noise values, with full blue (RGB)
        pixels[x+y*width] = color(bright, hue, 255);
      }
    }
    updatePixels();
    zoff += zincrement; // increment zoff
      println("flat");
  }
//FADE STATIC INTO COLOR
    if (tiltR == 0) { 
    noiseDetail(8, 0.65f);
    loadPixels();
    float xoff = 0.0; //start xoff at 0
    for (int x = 0; x < width; x++) {
      xoff += increment;               //increment xoff
      float yoff = 0.0;                 //for every xoff, start yoff at 0
      for (int y = 0; y < height; y++) {
        yoff += increment;               // increment yoff
        float bright = noise(xoff, yoff, zoff)*255;
        float hue = noise(xoff, yoff, bright)*255;
        pixels[x+y*width] = color(bright, hue, 255);
      }
    }
    updatePixels();
    zoff += zincrement; // increment zoff
      println("flat");
  //RIGHT TILT RGB color
  } else if (tiltR == 1) {
    noiseDetail(8, 0.65f);
    loadPixels();
    float xoff = 0.0; //start xoff at 0
    for (int x = 0; x < width; x++) {
      xoff += increment;               //increment xoff
      float yoff = 0.0;                //for every xoff, start yoff at 0
      for (int y = 0; y < height; y++) {
        yoff += increment;             // increment yoff
        float bright = noise(xoff, yoff, zoff)*255;
        pixels[x+y*width] = color(bright, 255, 139);
      }
    }
    updatePixels();
    zoff += zincrement; // increment zoff
    println("right");
  }

  //draw sky
  if (tiltU == 1) {
    noiseDetail(8, 0.65f);
    loadPixels();
    float xoff = 0.0; //start xoff at 0
    for (int x = 1; x < width; x++) {
      xoff += increment;               //increment xoff
      float yoff = 0.0;                //for every xoff, start yoff at 0
      for (int y = 1; y < height/3; y++) {
        yoff += 2.5*increment;         // increment yoff
        float bright = noise(xoff, yoff, zoff)*255;
        pixels[x+y*width] = color(135, 150, bright);
      }
    }
    updatePixels();
    zoff += zincrement; // increment zoff
    println("up");
  } else if (tiltU == 0) {
    noiseDetail(8, 0.65f);
    loadPixels();
    float xoff = 0.0; //start xoff at 0
    for (int x = 0; x < width; x++) {
      xoff += increment;               //increment xoff
      float yoff = 0.0;                //for every xoff, start yoff at 0
      for (int y = 0; y < height/3; y++) {
        yoff += 5*increment;           // increment yoff
        float bright = noise(xoff, yoff, zoff)*255;
        pixels[x+y*width] = color(255, 255, bright);
      }
    }
    updatePixels();
    zoff += zincrement; // increment zoff
    println("level");
  }
  //draw noise wave as landscape
  if (tiltD == 1) {
    fill(255);
    beginShape();
    float Wxoff = 0;
    for (float Wx = 0; Wx <= width; Wx += 10) {
      float Wy = map(noise(Wxoff, Wyoff), 0, 1, 200, 300);
      vertex(Wx, Wy);
      Wxoff += 0.05;
    }
    Wyoff += 0.01;
    vertex(width, height/2);
    vertex(0, height/2);
    endShape(CLOSE);
    println("down");
  } else if (tiltD == 0) {
    println("level");
  }
}

Lab 12: Media Controller

Making the circuit for the controller was simple, but it took us a long time to figure out the problem with our Arduino-to-Processing serial communication. All of the images were found on Google; the one that we ended up using is a photo of the Chillis, the band whose song we played. For the sound and image, one potentiometer controls the volume and the image tint. Originally, we tried to use a switch with an array of three red-tinted images, and a different potentiometer to control the speed of the sound. Using the switch and both potentiometers was too complicated for one code, so we cut the Processing array down from three images to one. We ran into another problem with the potentiometers: we could only read the values of one potentiometer on the serial monitor. Four hours after the end of our lab block, we decided to use the same potentiometer for both the volume and the image tint, and the other potentiometer to control the speed of the sound.

IMG_2630 (circuit without the switch)

For the serial communication from Arduino to Processing, I used the pattern Serial.print(value1); / Serial.print(","); / Serial.print(value2); from my final project code, which I should have told my partner about. In the Arduino code, sensorValue1 was the variable for input pin A0 (the volume and tint) and sensorValue2 was the variable for input pin A1 (the speed). She spent a long time debugging the Arduino code, which used Serial.println(sensorValue1, sensorValue2);, so the two potentiometer values were not being read separately the way the Processing code expected. I assumed that the issue was in the String function, since I had originally used it for the array of different images. Looking back, I see that in my original code I didn't create two separate variables for the different serial values; I just used the pin in the parameters of value = digitalRead() to update the values. Only when Marina changed the parameters to Serial.println(sensorValue1 + sensorValue2) did the monitor read more values, between 0 and 255, so she realized that the issue was there. She fixed it by writing Serial.print(sensorValue1); / Serial.print(","); / Serial.print(sensorValue2); in the Arduino code. Below is the link to her video of it working.
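
For reference, a minimal sketch of the corrected Arduino output as I understand it (reconstructed from memory, not Marina's actual file): each potentiometer is read separately and the two values are printed on one line, separated by a comma, so that Processing can split() them.

int sensorValue1;   // potentiometer on A0: volume and image tint
int sensorValue2;   // potentiometer on A1: playback speed

void setup() {
  Serial.begin(9600);
}

void loop() {
  sensorValue1 = analogRead(A0);
  sensorValue2 = analogRead(A1);
  Serial.print(sensorValue1);
  Serial.print(",");            // comma so Processing can split the two values
  Serial.println(sensorValue2); // println ends the line for readStringUntil(10)
  delay(10);
}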

Lab 12 – Media Controller

Lab 11: Motors

 IMG_2004     IMG_1999

My partner Frank had already built the arm of the drawing system, so I started building the H-bridge circuit, which was relatively easy since we had done it the day before. I used the slides from our class presentation to check the circuit. At first, we forgot to connect the battery to the circuit, so the motor only moved gently. Once it was connected to the 5V power source, the motor worked perfectly. Below is a video of the drawing machine without the potentiometer, which is shown in the picture above.

Here it is with speed control:

 

 

H-Bridge Motor, from examples:

int motor1 = 7;
int motor2 = 8;

void setup() {
 pinMode(motor1, OUTPUT);
 pinMode(motor2, OUTPUT);

}

//the loop routine runs continuously
void loop() {
  digitalWrite(motor1, HIGH); //spin the motor in one direction
  digitalWrite(motor2, LOW);
  delay(2000);                //wait for two seconds
  //reverse the direction that the motor spins
  digitalWrite(motor1, LOW);
  digitalWrite(motor2, HIGH);
  delay(2000);                //wait for two seconds
}

Stepper Speed Control, from examples:

#include <Stepper.h>

const int stepsPerRevolution = 200;  // change this to fit the number of steps per revolution for your motor


// initialize the stepper library on pins 8 through 11:
Stepper myStepper(stepsPerRevolution, 8, 9, 10, 11);

int stepCount = 0;  // number of steps the motor has taken

void setup() {
  // nothing to do inside the setup
}

void loop() {
  // read the sensor value:
  int sensorReading = analogRead(A0);
  // map it to a range from 0 to 100:
  int motorSpeed = map(sensorReading, 0, 1023, 0, 100);
  // set the motor speed:
  if (motorSpeed > 0) {
    myStepper.setSpeed(motorSpeed);
    // step 1/100 of a revolution:
    myStepper.step(stepsPerRevolution / 100);
  }
}

Final Proposal

Sound Maze

As I mentioned in my midterm documentation about the remote-controlled maze, other IMA students sharing their ideas about how to improve the system encouraged me to keep working on it. Here, you can see how I interpret interaction. When we pay attention to the experiences and ideas that we share, we gain something from those exchanges. IMA requires us to pay attention to the subtleties of those connections; it is important for us to understand the environments that we are a part of if we want to realize our potential impact on the things we encounter in daily life. In Interaction Lab, I have learned about the channels through which we communicate these ideas and create certain experiences with technology.

In a labyrinth, we pay attention to the walls that distance us from our destination. We try to navigate our way to the center through walls that look the same, finding paths between them that lead to dead ends and to other paths. I think about time as something like a labyrinth: killing time in a maze reinforces my idea of time as fluid, not linear. I thought about making an animation that simulates this idea of time, but don't want to create something with any more limits than the ones implied by the maze. This is how I decided to make an animation about noise, by distorting an example of Perlin noise. The idea of time made me think about the ways that we experience the spatial elements of different objects. I realized that since the player will be looking down at the maze, they can only pay so much attention to the animation. Instead of using a purely visual animation, I want the relationship between Arduino and Processing to be more cohesive in this project by using sound. The noise I make walking through a maze (made of man-made materials) is not the first thing that comes to mind when I think of a maze, but the sound of my feet on the ground bouncing off the walls is heightened in one. Hearing footsteps get louder and louder as someone approaches the center also compels us to focus more on hearing than sight. Not being able to see a route to the destination, for many, means more listening. This example of the way that our senses are allocated differently during a sensory experience helped me refine my idea for Processing: I want to create a similar feeling. The animation shows movement through space with a textural noise simulation. As the tilt switches track the player's motion in the physical maze, a simulation of 2D noise in Processing serves as the backdrop to signify the movement of sound.

Lab 10: 3D Modeling

Necklace Powerpack: Design for Logitech Webcam

Schariar had the idea to make a necklace for the Logitech Webcam, so we designed a case with an attachment to string a necklace through, like a camera strap. Because the Webcam has to be connected to a USB port, we planned the design around mobility and decided to make a power pack so that the user would not have to be tethered to a USB port for the camera to work. Marcela said that we didn't have to worry about making the neck strap, so Schariar designed the screen attachment while I worked on the power pack.

 

Screen Attachment

We started by taking measurements of the surface that the casing would cover, excluding the speakers and the screen. The case covers the top and bottom of the frame (measured in the picture on the left) and clips onto the back (measured in the picture on the right), in the gap between the speakers and the stand on either side.

Link to Shariar’s Tinkercad Model: https://www.tinkercad.com/things/l4wrAYQUIOZ-logitech-webcam-case/editv2?sharecode=WLiNAgcBsJW4-Mh9-_mknkL0MfJ5vg868Ni25V5jTNU=

 IMG_1865   IMG_1864

 

Power Pack Attachment

For the power pack, Antonius told us to imagine that we were working with an engineer to make a Bluetooth device to get rid of the issue that the USB posed; the camera requires 5V to work, so we could make a design that accommodates four AA batteries (~4 x 1.5V).

Screen Shot 2017-04-21 at 12.27.06 AMScreen Shot 2017-04-21 at 6.29.37 AMScreen Shot 2017-04-21 at 6.26.55 AM

The casing is much thicker than the screen attachment and 4mm wider (2mm on either side) than the width of the stand, so making the attachment was simple. Instead of a clip on attachment (shown in the picture on the left), I made one that slides on.

Screen Shot 2017-04-24 at 1.27.24 AMScreen Shot 2017-04-24 at 5.32.53 AMScreen Shot 2017-04-24 at 5.33.04 AMScreen Shot 2017-04-24 at 5.31.24 AM Screen Shot 2017-04-24 at 5.33.56 AM

 

Stamp Tutorial and Autodesk Trip

 

IMG_1633

 

 

 

I’m glad that IMA organized the trip to Autodesk – I had never heard of it before, and found a branch in NYC that I want to see if I can work at.

       IMG_1634

IMG_1655

This dinosaur skull made me think about using holes, instead of grouping separate objects, to make something I've been modeling in Tinkercad.

IMG_1646     IMG_1640

Illustrator:

I didn’t save the original stamp that I made using the NYUSH IMA stamp image in layers, so I did the tutorial again without the photo and combined it with a different illustration.

Screen Shot 2017-04-17 at 11.10.30 PM         IMG_1767

Screen Shot 2017-04-18 at 3.21.19 AM     IMG_1774

 

Screen Shot 2017-04-18 at 3.42.43 AM         Screen Shot 2017-04-20 at 10.59.16 PM

I pasted the figure, made mostly of curves and some lines, into a new layer. After I resized the figure to fit into the circles, I realized that the arm and the hand were disproportionately large, as you can see in the photo on the left; in the image on the right, you can see that I adjusted the proportions. There is a lot to improve on here, so I want to keep working on it and become more familiar with Illustrator.

Remote Controlled Maze

Materials:

Cardboard

Poster paint (on the maze)

Gouache paint (on the box)

Acrylic cross texture structure gel

2 wooden panels

Hot glue, tape

Styrofoam

Marble

2 tilt switch sensors

Distance sensor

Resistors

Jumper cables

Peter Maze Design     

Our plan for the maze initially was to have an animation activated by a distance sensor in the center of the maze, using the arrow keys on a keyboard and a motor pumping air to move the ball. We had to simplify these ideas to focus on making the system function on a basic level, which I took for granted; the process of simplifying this project gave me a greater appreciation for what is involved in the process of making something that seems simple actually work.

unnamed-2

Antonius gave us the idea to use a tilt switch instead of the keyboard, and Peter was committed to using servos. The resulting combination, which I credit to Peter, brought both ideas together (and made my idea of a handheld maze seem flat). We used two tilt switch sensors (one to move the upper and lower servos and one to move the servos on either side) to make a handheld remote control out of the breadboard, which was connected to the Arduino. To make the maze that Peter designed by himself, I drew it on a piece of cardboard in marker, cut the lines out, and glued one-inch-wide strips of cardboard into them. I didn't want any visible brushstrokes to appear on the maze as an unnecessary directional effect, so I used an acrylic cross-structure texture gel to prepare the cardboard before I painted it. Peter wrote the Arduino code, which gave me a lot of time to waste on an animation that was too complicated to use for the project. We returned to the original idea of incorporating arrow keys instead, by translating them into Processing code. Peter made the code for the arrow keys, which darkened in correspondence with the direction of the remote, and I found code in Learning Processing (Chapter 17) for the text animation to appear when the distance sensor detected the ball at the endpoint.
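
As an illustration of the arrow-key part, here is a minimal sketch of my own (not Peter's actual code): Processing reports the arrow keys through keyCode when key == CODED, and a shape can darken while an arrow key is held.

void setup() {
  size(400, 400);
}

void draw() {
  background(255);
  int shade = 220;   // lighter when idle
  // darken while any arrow key is held, as a stand-in for one arrow per direction
  if (keyPressed && key == CODED &&
      (keyCode == UP || keyCode == DOWN || keyCode == LEFT || keyCode == RIGHT)) {
    shade = 80;
  }
  fill(shade);
  rect(150, 150, 100, 100);
}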

Version 1

The servos were placed on the box supporting the maze, at the midpoints of the maze's length and width. Peter originally wanted the maze to be inside of a box, not on top of it, but the wooden panels I had used as the edges of the maze were too wide, so we flipped the box upside down and put the maze on top of it. We also lost our Styrofoam ball before presenting to the class, which was probably to our benefit, because the live demo wouldn't have worked.

unnamed-1

 

Version 2

Since the servo wings were not long enough to tilt the maze board so that the ball would move, I added Styrofoam to the wings to make them much bigger. This made the board tilt and increased the ball's movement, but it wasn't a real solution. Peter was skeptical about the strength of the Styrofoam, but with enough strips of tape I convinced him that if we put the maze inside the box (as in his original plan), the longer wings would hold. Marcela used the wood cutter to cut the wood on the edges of the maze so that we were able to put the maze inside of the box. I cut a smaller box and put it into the original one so that the maze could move without falling to the sides. In this version, I tried changing the side-to-side and up-and-down movement of each servo in different directions, and placed the servos on the corners of the box instead of at the midpoints.

IMG_1175

Version 3

I put the servos on the bottom of the maze itself, as Professor Cossovich put it, "like a walking maze." This solved the issue of the maze moving off the edges of the board, but still didn't quite work. Tony told me about a stronger servo model, but all of them were checked out. By this point, I had been testing several arrangements without paying attention to the effect that my jerking hand movements had on the remote: the jumper cables kept sliding out, which was a problem during the show.

IMG_1178

Final Version(s)

I took the up and down servos off the bottom of the maze, and put the side-to-side servos onto the same wooden panel side with the starting point, taking Professor Cossovich’s advice to make the maze move in two directions instead of four. The side-to-side servos made the legs of the maze move up or down, so that players could move the ball on one side or the other. Throughout the presentation, players offered advice and suggestions, which made me cheer up about the maze’s dysfunction – at least it encouraged people to share their ideas.

IMG_1181

 

IMG_1189

This video shows Marina and Fernando playing: Fernando moved the servos into the middle of the maze, stacking them on top of each other with the wings of the servos facing opposite sides. This created the most movement, especially when the servos were placed onto the box below.

Stupid Pet Trick

IMG_0643

IMG_0644

Inside a box covered in dark fabric, I placed an ambient light sensor that controls the LEDs: the LEDs are supposed to turn off when the box is closed and turn on when someone opens it. The ambient light sensor was on a separate breadboard so that I could put it on the bottom of the box, the area that would be exposed to light first whenever the box was opened; the idea was that once someone opens the box, light shines on the sensor and the LEDs in the upper corner turn on. My design had many flaws. I didn't use long enough cables, so the ambient light sensor couldn't even reach the bottom of the box. On the side, the cutout that I made for the jumper cables was too high up, so I had to make stands for the breadboard inside and for the Arduino outside. The jumper cables kept the curtain open, so the light was not fully blocked out. I hoped that this light source was the reason why the ambient light sensor did not turn off the LEDs when the box was closed. I should have made a cover for the sensor itself, since it did turn off the LEDs when I covered it with my hands. Besides the fact that the LEDs did not turn off, I had to hold the box shut with my hands. I tried to emphasize this flaw as part of the design by drawing cutout hands on the inside of the box, so that the hands holding the box open became a frame.
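
A minimal sketch of the intended behavior, assuming the sensor is read on analog pin A0, one LED pin stands in for the LED groups, and the reading rises with brightness; the threshold is a guess to be tuned, and the actual wiring followed the DFRobot diagram linked below.

int ledPin = 13;      // assumed single LED pin standing in for the LED groups
int sensorPin = A0;   // assumed ambient light sensor pin
int threshold = 500;  // assumed cutoff; tune by reading values with the box open and closed

void setup() {
  pinMode(ledPin, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  int ambient = analogRead(sensorPin);
  Serial.println(ambient);         // watch the readings to pick a threshold
  if (ambient > threshold) {       // box open: light reaches the sensor
    digitalWrite(ledPin, HIGH);
  } else {                         // box closed: dark inside
    digitalWrite(ledPin, LOW);
  }
  delay(50);
}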

IMG_0646

 

Materials:

Cardboard, hot glue, tape, satin, canvas, leather strings, lego pieces, sketch paper, paint

Arduino Uno, USB cable, 2 breadboards

ambient light sensor, 1K resistor

3 yellow LEDs, 3 220Ω resistors, 6 blue LEDs, 6 220Ω resistors

 

code: link to the lab diagram for an ambient light sensor https://www.dfrobot.com/wiki/index.php/DFRduino_Beginner_Kit_For_Arduino_V3_SKU:DFR0100#Ambient_Light_controlled_LED

Lab 6: Serial Communication

Part I: Fan with Processing

IMG_0785

IMG_0786

IMG_0788

Zayna had code for the keyPressed() function saved, which we used to turn on the fans from our kits. Hers worked immediately; mine did not. I changed the serial port index in the Processing code, from Zayna's [2] "/dev/tty.Bluetooth-Incoming-Port" to [1] "/dev/cu.usbmodem1411", and checked the baud setting (9600). The debugging process came to a simple conclusion: I realized that the fan was connected to the wrong pin holes in the Arduino.

Part 2: Drawing with Arduino

IMG_0793 IMG_0792

For this part, debugging was more complicated. Materials included a blue LED, a 220 ohm resistor, and a switch; missing from that list is the 10K resistor we needed for the button. Zayna found code for the button on the Arduino tutorial site, which we adapted to use with the blue LED (link: https://www.arduino.cc/en/Tutorial/Button). Seeing us struggle with the string functions that Google searches had led us to (link: https://learn.sparkfun.com/tutorials/connecting-arduino-to-processing), Antonius intervened. In the Arduino code, he had us check the values on the serial monitor by changing Serial.write to Serial.println; there, we saw that only the high value appeared whether or not the switch was pressed. Since the low value was not showing up at all, we figured that the communication gap must be related, so we included the 'L' value (ASCII 76) in the code.

IMG_0795

Processing Code:

import processing.serial.*;
Serial myPort;
int val;
void setup() {
 printArray(Serial.list());
 // this prints out the list of all
 // available serial ports on your computer.
 myPort = new Serial(this, Serial.list()[ 1 ], 9600);
 // Change the Serial.list()[0] to your port
}
void draw() {
 // to send a value to the Arduino
 if (keyPressed) {
 myPort.write('H');
 } else {
 myPort.write('L');
 }
}

Arduino Code:


int val;

void setup() {
 Serial.begin(9600);
 pinMode(13, OUTPUT);
}

void loop() {
 // receive a value from processing
 while (Serial.available()) {
 val = Serial.read();
 }
 // test the value
 if (val == 'H') {
 digitalWrite(13, HIGH);
 } else if (val == 'L') {
 digitalWrite(13, LOW);
 }
 delay(10); // wait for Processing
} 

Lab 4: Ultrasonic Sensor

Materials:

  • 8 jumper cables M/M
  • 4 pin cable
  • USB cable
  • LM35 Temperature Sensor
  • Grove Ultrasonic Range Sensor
  • Arduino Uno

For the Grove Ultrasonic Range Sensor, I downloaded the sample code and added it to my Arduino library. Using jumper cables, I connected the 4-pin cable that came with the sensor to the Arduino (signal to digital pin 7, power to 5V, and ground to ground). For the output, a temperature sensor was connected by a blue cable on the breadboard to the Arduino, in analog pin A4.

IMG_0559           IMG_0561

Even though the TX and RX LEDs flickered when the code was uploaded, I wasn't sure whether it was working or not, probably because I didn't check the serial terminal after I ran the code. Thankfully, my friend David intervened before I completely rewired the breadboard, since I had assumed that the output must have been the problem. He pulled up the terminal, which had been printing the distance. In the pictures above, we tested it by moving our hands toward and away from the ultrasonic sensor; below are screenshots of the distances measured.

Screen Shot 2017-03-03 at 1.51.18 AMScreen Shot 2017-03-03 at 1.51.33 AM

/*
 * UltrasonicDisplayOnTerm.ino
 * Example sketch for ultrasonic ranger
 *
 * Copyright (c) 2012 seeed technology inc.
 * Website    : www.seeed.cc
 * Author     : LG, FrankieChu
 * Create Time: Jan 17,2013
 * Change Log :
 *
 * The MIT License (MIT)
 *
 * Permission is hereby granted, free of charge, to any person obtaining a copy
 * of this software and associated documentation files (the "Software"), to deal
 * in the Software without restriction, including without limitation the rights
 * to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
 * copies of the Software, and to permit persons to whom the Software is
 * furnished to do so, subject to the following conditions:
 *
 * The above copyright notice and this permission notice shall be included in
 * all copies or substantial portions of the Software.
 *
 * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
 * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
 * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
 * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
 * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
 * OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
 * THE SOFTWARE.
 */


/***************************************************************************/
//	Function: Measure the distance to obstacles in front and print the distance
//			  value to the serial terminal.The measured distance is from
//			  the range 0 to 400cm(157 inches).
//	Hardware: Grove - Ultrasonic Ranger
//	Arduino IDE: Arduino-1.0
/*****************************************************************************/

#include "Ultrasonic.h"

Ultrasonic ultrasonic(7);
void setup()
{
	Serial.begin(9600);
}
void loop()
{
	long RangeInInches;
	long RangeInCentimeters;
	
	Serial.println("The distance to obstacles in front is: ");
	RangeInInches = ultrasonic.MeasureInInches();
	Serial.print(RangeInInches);//0~157 inches
	Serial.println(" inch");
	delay(250);
	
	RangeInCentimeters = ultrasonic.MeasureInCentimeters(); // two measurements should keep an interval
	Serial.print(RangeInCentimeters);//0~400cm
	Serial.println(" cm");
	delay(250);
}