Week 13: Greene Response – Szetela

Rachel Greene’s Internet Art shows how artists have employed online technologies (websites in general) to create new forms of art and to move into fields normally beyond what one would deem the “artistic realm.”

What I find really interesting about the advent of the Internet is that Internet art feels quite different from artwork created in the past, because it can be created and experienced by so many people. Many people who are not “artists,” including me, have used the Internet and technology to create and explore art. The Internet has turned many people who would never have considered themselves artists into creators. It has also provided plenty of opportunity to view others’ artwork, to be influenced by others, and to continually expand the domain of knowledge that goes into creating art. Technology in general has also allowed art to take on many new forms, whether interactive, video, audio, or even collaborative in real time (Reddit’s collaborative pixel canvas, r/place).

Week 11-12: Graham and Response – Szetela

Rand Response, “Computers, Pencils, and Brushes”

I don’t agree with Rand’s response (which opposes Graham’s) that the computer is merely a tool that cannot be used to create true art, or that it forms a barrier between the artist and what the artist wants to achieve. I believe that the computer is nothing more than a very advanced pencil or brush. Rand is correct that, with the advent of the computer, it is “easier” to create ideas, concepts, blueprints, and designs, but it is nothing more than that. Behind the computer there still has to be someone with the experience to create beautiful work (art) that others will appreciate. The computer is a way to push art forward into new territory, connecting the physical and digital worlds. I don’t think art is as strictly defined as Rand believes it to be.

Response to Graham’s “Hackers and Painters”

I found Paul Graham’s article an interesting read. He believes that hackers are similar to painters and considers hackers to be “makers” rather than pure scientists. He does not believe “computer science” is the appropriate term; rather, he considers hackers to be somewhere between an architect and an engineer. He writes that the end goal for all art (creative art in general) is to “make things for a human audience” and, furthermore, “to engage [that] audience.” Graham writes that “nearly all the greatest paintings are paintings of people, for example, because people are what people are interested in.” I both agree and disagree with his statements. Although “hacking,” or creating software, mainly uses the computer as a medium for creation and design, much of the software created has a deeply rooted mathematical and scientific basis. I don’t believe painters and artists approach the creation of their art the way software engineers do. That is not to say software engineering cannot be interpreted as an art form, but I don’t believe it is correct to say engineers are not scientists.

Antonius: Final Project: Ghost Hunter

Date: May 18th

Instructor: Antonius

Aim: Make a project that incorporates Processing and Arduino.

Material: Processing, Arduino, Arduino kit, conductive tape, sponge, sponge mat, toy sword, accelerometer, wires.

Process:

  1. Designing: As I mentioned in my final project essay, I planned to make an interactive game. The user can control the character’s movement in the game by stepping on different parts of a mat, and can kill enemies in the game by swinging a physical sword. Initially I also designed a shield that the user could activate to block attacks from the enemies, but due to the time limit, and because it might have been too complicated for players to manage so many controllers, I abandoned this part.
  2. Making the game:

I started by building the game in Processing. Instead of using the actual physical input, I used keyboard input first while building the game. The game consists of two components: the player and the enemies. I used the keyPressed function to move the character in four directions by pressing the four keys “w”, “a”, “s”, and “d”.

When making the enemies, at first I let them chase the character. One problem with this was that all the enemies would overlap with each other after following the character for some time. To deal with this problem I tried several approaches, like detecting the distance between each pair of enemies, but failed. In the end, I decided to let the enemies run around randomly and let the character catch them.

The enemies were made using objects in Processing. I defined all the characteristics of the enemies in a separate class, including color, position, speed, behavior after running into the character, etc. Then I used an ArrayList to manage the enemies. That way, each enemy could behave on its own, and when one was supposed to be killed, I could simply remove it from the list.
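The remove-while-iterating pattern described above is easy to get wrong, because removing an element shifts the rest of the list down by one. A minimal plain-Java sketch of the idea (Enemy here is a hypothetical stand-in for the sketch’s Car class), iterating backwards so that no enemy is skipped after a removal:

```java
import java.util.ArrayList;

public class EnemyList {
    // Minimal stand-in for the sketch's enemy objects: just a position.
    static class Enemy {
        float x, y;
        Enemy(float x, float y) { this.x = x; this.y = y; }
    }

    static float dist(float x1, float y1, float x2, float y2) {
        return (float) Math.hypot(x1 - x2, y1 - y2);
    }

    // Remove every enemy within `range` of the player at (px, py).
    // Iterating backwards means a removal never shifts a not-yet-visited index.
    static void killNearby(ArrayList<Enemy> enemies, float px, float py, float range) {
        for (int i = enemies.size() - 1; i >= 0; i--) {
            Enemy e = enemies.get(i);
            if (dist(e.x, e.y, px, py) < range) {
                enemies.remove(i);
            }
        }
    }

    public static void main(String[] args) {
        ArrayList<Enemy> enemies = new ArrayList<>();
        enemies.add(new Enemy(10, 10));   // within 35 of (0, 0)
        enemies.add(new Enemy(20, 20));   // within 35
        enemies.add(new Enemy(300, 300)); // out of range
        killNearby(enemies, 0, 0, 35);
        System.out.println(enemies.size()); // prints 1
    }
}
```

A forward loop would work too, but then the index must be decremented after each remove() so the element that slides into position i is not skipped.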

  3. Creating the input structure

As I mentioned in my design, I planned to let users use their body movements to control the character’s actions. The character’s movement was designed to be controlled by stepping on a board on the ground: to move the character up, down, left, and right, the player steps on the front, back, left, and right panels of the board. Originally I planned to use force sensors as the input, but considering that stepping might break them, I changed my mind and used soft buttons instead. With help from Professor Antonius, I was able to make my own soft button: two conductive materials are stuck on two separate boards, with a sponge that has a hole in it placed between them. When someone steps on the board, the conductive materials connect through the hole in the sponge, forming a complete circuit that conducts current, which is used to send input information to the Arduino.

IMG_1599

The next step was to control the character’s attack. I used an accelerometer as the input. When the user swings the sword, the change in acceleration is sensed and sent to Processing to trigger an “attack zone” around the character, and any enemies overlapping the “attack zone” are eliminated.
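The swing detection reduces to a threshold test on the three acceleration axes. The sketch below mirrors the ±1.2 g cutoff that appears in the Arduino code further down; the exact threshold is a tuning choice, not a fixed value:

```java
public class SwingDetector {
    // Threshold in g's; the Arduino sketch fires when any axis leaves +/-1.2 g.
    static final double THRESHOLD = 1.2;

    // True when acceleration on any axis is large enough to count as a
    // sword swing (the "attack" flag sent over serial to Processing).
    static boolean isSwing(double ax, double ay, double az) {
        return Math.abs(ax) > THRESHOLD
            || Math.abs(ay) > THRESHOLD
            || Math.abs(az) > THRESHOLD;
    }

    public static void main(String[] args) {
        System.out.println(isSwing(0.0, 0.1, 1.0)); // resting (~1 g gravity): false
        System.out.println(isSwing(2.3, 0.1, 1.0)); // sharp swing on x: true
    }
}
```

Note that gravity alone contributes about 1 g on whichever axis points down, which is why the threshold has to sit above 1.0.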

IMG_1634

  4. Improving gaming experience

To make my game more playable, I added a time limit. If the player kills all the enemies within the time limit, a screen showing “you win” pops up; if the enemies are not all killed in time, a screen showing “game over” pops up. To make the game more responsive, I also added a sound effect for when the enemies bounce against the character, and a sword-swing effect for when the user swings the sword.
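The win/lose decision comes down to two checks each frame: are the enemies all gone, and has the clock run out. A small sketch of that logic (the 60-second limit is a made-up example; the post doesn’t state the actual value):

```java
public class GameTimer {
    // Hypothetical limit in milliseconds; the actual game's value may differ.
    static final int TIME_LIMIT_MS = 60_000;

    // Decide which screen to show, given elapsed time and enemies remaining.
    static String endScreen(int elapsedMs, int enemiesLeft) {
        if (enemiesLeft == 0) return "you win";
        if (elapsedMs >= TIME_LIMIT_MS) return "game over";
        return "playing";
    }

    public static void main(String[] args) {
        System.out.println(endScreen(45_000, 0)); // all enemies killed in time
        System.out.println(endScreen(60_000, 3)); // time up with enemies left
    }
}
```

In a Processing sketch, elapsedMs would typically come from millis() minus the start time, checked once per draw() call.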

The pictures and actual effects are shown below.

IMG_1633

IMG_1635

IMG_1639

IMG_1637

//This is the Processing code
import processing.serial.*;
Serial myPort;
int valA, valB, valC, valD;

ArrayList<Car> cars;
int x, y;
int x1, y1, a1, b1;


void setup() {
  myPort = new Serial(this, Serial.list()[2], 9600);
  size(720, 540);
  x = width/2;
  y = height/2;
  rectMode(CENTER);

  cars = new ArrayList<Car>();
  for (int i = 0; i < 10; i++) {
    cars.add(new Car(color(255, 0, 0), random(0, width), random(0, height), 2, 1));
  }
}

void draw() { 
  // each Arduino cycle sends five bytes: the four pad values plus the swing flag
  while (myPort.available() >= 5) {
     valA = myPort.read();
     valB = myPort.read();
     valC = myPort.read();
     valD = myPort.read();
     myPort.read(); // consume the fifth (accelerometer) byte so the stream stays aligned
  }
  background(100);
  //rect(x1, y1, a1, b1);
  fill(255);
  ellipse(x, y, 50, 50);
  
  if (valA > 100) {
    x = x - 3;
    x1 = x - 20;
    y1 = y;
   // a1 = 80;
   //b1 = 50;
  }
  if (valB > 100) {
    y = y - 3;
    x1 = x;
    y1 = y - 20;
    //a1 = 50;
    //b1 = 80;
  } 
  if (valC > 100) {
    x = x + 3;
    x1 = x + 20;
    y1 = y;
    //a1 = 80;
    //b1 = 50;
  } 
  if (valD > 100) {
    y = y + 3;
    x1 = x;
    y1 = y + 20;
    //a1 = 50;
    //b1 = 80;
  }
    
  if (y < 25){
    y = y + 3;
  }
  if (y > height - 25){
    y = y - 3;
  }
  if (x < 25){
    x = x + 3;
  }
  if (x > width - 25){
    x = x - 3;
  }
    

  for (int i = 0; i < cars.size(); i++) {
    Car car = cars.get(i);
    car.drive();
    car.display();
    if (dist(car.xpos, car.ypos, x, y) < 35) {
      cars.remove(i);
      i--; // step back so the next car is not skipped after the removal
    }
  }
}


class Car { 
  color c;
  float xpos;
  float ypos;
  float xspeed;
  float yspeed;


  // The Constructor is defined with arguments.
  Car(color tempC, float tempXpos, float tempYpos, float tempXspeed, float tempYspeed) { 
    c = tempC;
    xpos = tempXpos;
    ypos = tempYpos;
    xspeed = tempXspeed;
    yspeed = tempYspeed;
  }

  void display() {
    stroke(0);
    fill(c);
    ellipse(xpos, ypos, 20, 20);
  }

  void drive() {
    xpos = xpos + xspeed;
    if (xpos > width) {
      xspeed = -xspeed;
    } else if (xpos < 0) {
      xspeed = -xspeed;
    }
    ypos = ypos + yspeed;
    if (ypos > height) {
      yspeed = -yspeed;
    } else if (ypos < 0) {
      yspeed = -yspeed;
    }
  }
}


//This is the Arduino code
int valA, valB, valC, valD, valE;

#include <Wire.h>
#include <ADXL345.h>


ADXL345 adxl;

void setup() {
  // put your setup code here, to run once:
  Serial.begin(9600);
  adxl.powerOn();

  //set activity/ inactivity thresholds (0-255)
  adxl.setActivityThreshold(75); //62.5mg per increment
  adxl.setInactivityThreshold(75); //62.5mg per increment
  adxl.setTimeInactivity(10); // how many seconds of no activity is inactive?
 
  //look for activity movement on these axes - 1 == on; 0 == off 
  adxl.setActivityX(1);
  adxl.setActivityY(1);
  adxl.setActivityZ(1);
 
  //look for inactivity movement on these axes - 1 == on; 0 == off
  adxl.setInactivityX(1);
  adxl.setInactivityY(1);
  adxl.setInactivityZ(1);
 
  //look for tap movement on these axes - 1 == on; 0 == off
  adxl.setTapDetectionOnX(0);
  adxl.setTapDetectionOnY(0);
  adxl.setTapDetectionOnZ(1);
 
  //set values for what is a tap, and what is a double tap (0-255)
  adxl.setTapThreshold(50); //62.5mg per increment
  adxl.setTapDuration(15); //625us per increment
  adxl.setDoubleTapLatency(80); //1.25ms per increment
  adxl.setDoubleTapWindow(200); //1.25ms per increment
 
  //set values for what is considered freefall (0-255)
  adxl.setFreeFallThreshold(7); //(5 - 9) recommended - 62.5mg per increment
  adxl.setFreeFallDuration(45); //(20 - 70) recommended - 5ms per increment
 
  //setting all interrupts to take place on int pin 1
  //I had issues with int pin 2, was unable to reset it
  adxl.setInterruptMapping( ADXL345_INT_SINGLE_TAP_BIT,   ADXL345_INT1_PIN );
  adxl.setInterruptMapping( ADXL345_INT_DOUBLE_TAP_BIT,   ADXL345_INT1_PIN );
  adxl.setInterruptMapping( ADXL345_INT_FREE_FALL_BIT,    ADXL345_INT1_PIN );
  adxl.setInterruptMapping( ADXL345_INT_ACTIVITY_BIT,     ADXL345_INT1_PIN );
  adxl.setInterruptMapping( ADXL345_INT_INACTIVITY_BIT,   ADXL345_INT1_PIN );
 
  //register interrupt actions - 1 == on; 0 == off  
  adxl.setInterrupt( ADXL345_INT_SINGLE_TAP_BIT, 1);
  adxl.setInterrupt( ADXL345_INT_DOUBLE_TAP_BIT, 1);
  adxl.setInterrupt( ADXL345_INT_FREE_FALL_BIT,  1);
  adxl.setInterrupt( ADXL345_INT_ACTIVITY_BIT,   1);
  adxl.setInterrupt( ADXL345_INT_INACTIVITY_BIT, 1);
}

void loop() {
  // put your main code here, to run repeatedly:
  //Serial.println(analogRead(0));
  int x, y, z;  
  adxl.readXYZ(&x, &y, &z); //read the accelerometer values and store them in variables x, y, z
  double xyz[3];
  double ax,ay,az;
  adxl.getAcceleration(xyz);
  ax = xyz[0];
  ay = xyz[1];
  az = xyz[2];
  
  
  if (ax < -1.2 || ax > 1.2 || ay < -1.2 || ay > 1.2 || az < -1.2 || az > 1.2){
    valE = 1;
  }
  else{
    valE = 0;
  }
  
  
  valA = map(analogRead(0), 0, 1023, 0, 255);
  valB = map(analogRead(1), 0, 1023, 0, 255);
  valC = map(analogRead(2), 0, 1023, 0, 255);
  valD = map(analogRead(3), 0, 1023, 0, 255);
  Serial.write(valA);
  //delay(10);
  Serial.write(valB);
  //delay(10);
  Serial.write(valC);
  //delay(10);
  Serial.write(valD);
  Serial.write(valE);
  delay(30);
  
  
}

Final Project: Dr Jingles Fakhr (with Sam Hu, Dave Santiano, and Nick Sanchez)

We started with location allotments, and we were allotted the space at the end of the hall on the 8th floor where the lockers were. We decided immediately (during the class when location allotments were made) upon a basic storyline. We would cordon off the locker area with a curtain, and begin right outside of it. One actor would introduce the audience to hideous freak-show artefacts from various places, and then say, ‘But our most horrifying artefact is behind that curtain. Enter at your own peril!’ Once the audience member(s) went through, we had the vague idea to manipulate the lockers, have them open and close, and for objects to appear and disappear in the space – basically, for ‘something scary’ to happen.

Over the following two weeks, we researched and refined our ideas. As part of this process, I researched some of the stage mechanisms of scary theatre for inspiration. I found particular inspiration from some of the behind-the-scenes cuts of the long-running West End production of The Woman in Black (https://www.youtube.com/watch?v=KkLaY1DLTJc) and from the dramatic aesthetics of The Tiger Lillies’ puppetry (https://www.youtube.com/watch?v=TOVSp-fYUQc). I think we incorporated some of the former in our staging, and some of the latter in our text. Following research and discussion, we settled upon a story: we would be presenting the life and work of a failed inventor, Dr Jingles Fakhr, who was active in the late 1800s. After showing the audience the first couple of failed inventions from the Doctor, we would send them through to his ‘least obscure invention’ – the Perpetual Light Machine. The story went that Dr Fakhr had tried to use diamonds to make a light machine work – but in the course of working on it, he saw frightful visions and went insane. Other people have also seen visions and felt nausea when in contact with the machine, so we have had to keep it behind curtains. This was our general backstory.

As to the specific scares, we determined that there would be three phases. When the audience entered, there would be a museum exhibit, with the light-source flickering. The audience would be listening to an audio-guide. Stage two: the lights would die out, and in the darkness, a vision – a mannequin or dress-form – would appear. The lights would come back on. Stage three: the lights would go out again, and in the darkness, a second vision – this time an actor – would appear, and actively scare the audience. As is clear, pitch darkness became a necessity by this stage of the project. (A more complete description of the blocking is in the link below).

Post research, my first major part in the project was the writing up of the script and the organizing of theatrical blocking, which I did here: https://docs.google.com/a/nyu.edu/document/d/1jFW2mjvmIQvO66diFAfkDFXOkJdjBN6AI5NAh_E8fkE/edit?usp=sharing

The second part of setting up was the physical aspect. We moved the lockers to create a pathway that got narrower, to elicit a claustrophobic effect. We used a number of curtains (fortuitously mis-ordered) to cover up the entire space, and a green-screen frame to set up an entrance. Finally, we organized a backstage area from which we could operate. In the performance space, I was the theatrical announcer, David was the second vision, Sam handled the audio aspect, and Nicholas controlled the lighting and the movement of the first vision. The light contraption itself was modified for use from the project of Sun Jingyi, who built a bluetooth light-source for her Network Everything class.

(Pictures to come)

Project 2: Scare Your Computer (with Nick Sanchez)

Scare your computer. Using Arduino with Serial communication to: Processing, Max/MSP with Jitter, or Isadora, incite a fear response from your computer (e.g., Trigger a video of a screaming person when you come into the frame, turn off the lights or play a loud sound).

We began by thinking about the wording of the question: ‘Scare your computer.’ What makes a computer afraid? And what does it look like when a computer is afraid? We speculated that a plausible answer to the second question was that a computer might turn off in fright – in the same way a person might freeze in terror or faint in shock. And finally, we thought that what might scare a computer might be violence upon computer hardware – in the same way gore and violence upon the body would scare a person. So we had our basic outline: scaring a computer to the point of turning off by committing violence upon other computer-like bodies.

My main contribution to this early outline was to write up a script and backstory: an ambiguous trope-heavy piece where the AI revolution fails and is quashed by human overlords. Our computer would be an AI rebel, captured and tortured by the humans (us) in order to acquire some important codes. We then decided on a ‘face’ for the computer, settling on HAL-3000 from 2001: A Space Odyssey. We decided that the ‘scaring’ would progress in three steps: resistance, acquiescence, and terror. So we would demand the codes from the AI, and show the gory remains of her compatriots – which would horrify the computer, but not elicit the desired response. Then we would step it up, smashing hardware before the AI, causing the AI to break down and give us the code. Finally, we would display the full extent of our sadism, inflicting harm on the computer even when there was no reason to do so.

We set about getting the basic materials for the computer’s ‘personality’ – the face (a stock image with some Photoshop manipulation, so that there were two images: one with the light turned on when the computer was speaking, and one with the light turned off when it was not) and the voice, for which we just used an online voice generator. Then, we went about figuring out the process for triggering a response. This went in two stages. Initially, we were interested in using vibration or pressure sensors in order to measure the computer’s ‘fear’ at the impact of our smashing. We made a little apparatus, essentially a stage we could set on a table and hit with a hammer, with a vibration sensor inside, which would register impact. However, the readings we were getting were far too erratic to be properly usable.

So in the end, we decided to simply make the computer move from one stage of fear to the next using a button. We used Max/MSP to move the computer’s visible state from one audiovisual display to another, such that the computer would respond to the push of a button by going from resisting giving up the code, to giving it up, to turning off. This was the most difficult section of the assignment, as neither of us was particularly adept at Max/MSP; with a lot of help from the help pages and a lot of fiddling around, we did manage to get the sequence going. Finally, we added some theatrical touches, and performed for the class. (This vocabulary is used advisedly: as Antonius pointed out, our final product was akin to a script-reading more than anything, unlike our original plan with the Piezo sensors.)

Exercise 6 (Developing Web)

I didn’t get around to creating something of my own, starting with the fact that my code wouldn’t work in class. I didn’t really know where the problem was in the Google Maps code example, so I missed it.

After class, I tried browsing through the Google Maps documentation to figure it out on my own, but I simply got confused. So I thought, why not find another API to study? But the ones I found that were supposedly easy, like Twitter, were definitely written for people who already knew how to do things, and I was mostly just confused about what node.js was, how to fetch data, and how to know whether I was fetching data – and I definitely didn’t know how to use the data. It also didn’t help that I had no idea what to even do with data from Twitter or Google Maps or the social APIs. This was when I realized I would probably have no ideas when it came to the generic APIs, so I went looking for stranger ones.

That was how I ended up finding the PokeAPI in a Reddit thread about cool APIs. But even having found it, the documentation wasn’t exactly beginner-friendly – I still had to have some idea of what APIs were and how to use them. By the time I went for help and figured it out a little bit, I had already decided I was going to use this for my final project, so essentially, my Exercise 6 was the beginning of my final project.

The Seven Sentence Stories (Saphya + ZZ)

192.168.50.184/~zz791/DevWeb_w14-assignment

Idea: A collaborative writing platform, bridging physical distance and igniting the imaginations of authors around the globe. We wanted the benefits of a social medium focused on writing and idea-making; thus TSSS was born. Using this app, people can create new stories, edit existing ones, or just read.

Front End:

jQuery Mobile Theme Picker

Screen Shot 2017-05-22 at 10.24.32 PM

using ThemeRoller from the jQuery Mobile site

Screen Shot 2017-05-22 at 10.38.08 PM

Screen Shot 2017-05-22 at 10.38.36 PM

 

HTML

changeStory(): updates another page in our HTML called “readstory” with the identity of whichever story is clicked on in the main page. So if you click on “test”, the empty “readstory” page gets the title, story, and collaborator list of “test”.

showStories(): uses the ‘story-list’ get call to list all the stories on the list view page.

filterStories(): uses the ‘story-list’ get call to compare the search input with existing story titles in the database to spit out the stories that exactly match the search.

storyUpdate(): uses the ‘set-story’ get call to add a new row, displayed in a ‘p’ tag, to that page. It also updates the database.

newStory(): uses the ‘set-story’ get call to add a new row for that story to the php database and updates the list of stories on the main page.

 

Back End:

18697733_1519023128119436_1801275045_o

phpMyAdmin

In phpMyAdmin, we made a new database called “collab_fiction”. In it we have a table called ‘stories’ with columns, ‘id’, ‘dates’, ‘author’, ‘sentence’, and ‘storyid’. In our index.php, we have four ‘get commands’:

‘Set-story’: adds a new row to the table

‘Get-stories’: lists all rows in the table

‘Get-story’: lists all the rows under a given storyid

‘Story-list’: lists all unique storyids in the table

We use the commands for different javascript functions in our html code.

 

Problems:

Although the story updated instantly, the author list did not update unless you refreshed the page. We tried to fix this by placing the portion of code that controlled the author list within the submit button’s “click” function, but this only generated duplicates. We are still seeking to solve this issue.

 

Reflection:

I really enjoyed this project because I got to work with phpMyAdmin and MySQL, both of which I’d heard about but had never seen in action. After making the database and learning the correct jargon to pull information from it, it was relatively easy to write a for loop inside a .getJSON method in our HTML. Then it just became a matter of displaying the specific part of that JSON, which required a lot of tweaking and still does to this day to perfect. I am satisfied with the work we were able to produce in such a limited time. TSSS works fine individually on your personal devices if you are able to make a database of the same name with the same table; however, this is not ideal. I still want to learn how to put it online, but that is a future challenge.

Color Landscape with Perlin Noise

I used an ASCII code for serial communication from Arduino to Processing, which changed the colors and speed of a Perlin noise animation.

Circuit:

IMG_3540

  • red cable connects power to 5V
  • green cables connect tilt sensors to power (green switches are side-to-side, white switches are up-and-down)
  • black cable connects ground to ground
  • 10 kΩ resistors connect the switches to ground
  • blue cables connect breadboard pins to Arduino pins:
    • breadboard pin 8 to digital pin 8 = Left (1,0)
    • breadboard pin 18 to digital pin 7 = Right (0,1)
    • breadboard pin 23 to digital pin 13 = Up (0, 1)
    • breadboard pin 27 to digital pin 12 = Down (1,0)

*testing the side-to-side values by commenting out the up and down values

3D Perlin noise:

Code Source: Processing 3D Noise example

*below, the increment is changed to -.01 and a mousePressed() function is added to change from greyscale to color

Link: https://processing.org/examples/noise3d.html

int x = 0;
int y = 0;
float increment = -.01;
//noise function argument #3 (a global variable that increments one per cycle)
float zoff = 0.0; //increment zoff != xoff or yoff
float zincrement = 0.02;

void setup() {
  size(640, 640);
  background(0);
  frameRate(30);
}

void draw() {
  //adjust noise detail
  noiseDetail(8, 0.65f);
  loadPixels();
  float xoff = 0.0; //start xoff at 0
  //for every x, y coordinate in a 2D space, calculate a noise value
  //and display a brightness value
  for (int x = 0; x < width; x++) {
    xoff += increment; //increment xoff
    float yoff = 0.0; //for every xoff, start yoff at 0
    for (int y = 0; y < height; y++) {
      yoff += increment; // increment yoff
      // calculate noise and scale by 255
      float bright = noise(xoff, yoff, zoff)*255;
      // set each pixel onscreen to a grayscale value
      pixels[x+y*width] = color(bright, bright, bright);
    }
  }
  updatePixels();
  zoff += zincrement; // increment zoff
}

void mousePressed() {
  fill(pixels[x+y*width] = color(255));
}

A few times people have described tinnitus to me as the sound that noise looks like on a TV. I found code for 3D Perlin noise to generate an index of pixels on the screen that look like noise but can be manipulated to simulate textural gradients. Below is a diagram of 2D noise, showing the pixel coordinates in the algorithm, moving in the direction of the arrows. The arrows represent the xoff and yoff variables in the code, controlling the direction in which the pixels appear to move randomly during the for() loop. The math explains the calculation for this randomness: the gradient is made up of vectors, so the code uses vector coordinates from within each cube (3D pixel) and a point on its edges.[1] The code generates a pseudorandom vector gradient, so the noise pixels seem to move randomly but actually move the same way for the same input integers.[2]
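The “pseudorandom but repeatable” property is the key point: the noise function is an ordinary deterministic function of its inputs. The sketch below illustrates the idea with 1D value noise (hashed lattice values blended by a smoothstep fade curve) rather than true gradient Perlin noise; the mixing constants in the hash are arbitrary:

```java
public class ValueNoise1D {
    // Deterministic pseudorandom value in [0, 1) for an integer lattice point.
    // The multipliers are arbitrary mixing constants, not anything canonical.
    static double hash(int i) {
        int h = i * 374761393;
        h = (h ^ (h >>> 13)) * 1274126177;
        h ^= h >>> 16;
        return (h & 0x7fffffff) / 2147483648.0;
    }

    // Smoothly interpolate between the two neighbouring lattice values -
    // the same idea Perlin noise uses, but with values instead of gradients.
    static double noise(double x) {
        int i = (int) Math.floor(x);
        double f = x - i;
        double t = f * f * (3 - 2 * f); // smoothstep fade curve
        return hash(i) * (1 - t) + hash(i + 1) * t;
    }

    public static void main(String[] args) {
        // Same input, same output: the "randomness" is fully reproducible.
        System.out.println(noise(3.7) == noise(3.7)); // true
        // Nearby inputs give nearby outputs, so the motion looks smooth.
        System.out.println(Math.abs(noise(3.70) - noise(3.71)) < 0.1); // true
    }
}
```

Processing’s noise() behaves the same way externally: identical xoff/yoff/zoff arguments always return the identical value, which is why incrementing the offsets each frame is what makes the texture appear to move.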

Screen Shot 2017-05-20 at 10.48.20 AM

image source: https://gamedev.stackexchange.com/questions/23625/how-do-you-generate-tileable-perlin-noise

[1] http://flafla2.github.io/2014/08/09/perlinnoise.html

[2] http://flafla2.github.io/2014/08/09/perlinnoise.html

The noiseDetail function changes the appearance by increasing the number of pixels and decreasing their size, so that the grid of pixels is denser. Because the increment variables, which control the rate at which the pixels move, decreased the frameRate from 30 to between 5 and 10, I tried to change them as little as possible. Instead, I focused on color and sound, because I couldn’t marble the texture without decreasing the frameRate. I had planned to make a pixel that marbles the noise texture in the direction of the maze’s motion, signaled by the tilt switches on the bottom of the maze, but the cables kept coming out. So I decided to keep the cloudy texture and make a changing noise landscape.

*above: this is the only video with sound. When I wanted to represent the player with the marble as a pixel, I used (pmouseX, pmouseY, mouseX, mouseY) to draw a line tracing the movement, changed the noise using the library example PinkNoise, and used the pixel index statement color(1*mouseX/1.5, bright, (mouseY/1.5)*255) to change the color with the sound amplification.

*below: I drew a rectangle with the mouse because I didn’t like the idea of representing the player, so I considered drawing its perspective in the maze by connecting lines from the corners of each angle to the corners of the screen. I decided to just work on the animation as a separate project; my ideas for the maze and the noise seemed better off separate.

I tested the code with boolean statements, then replaced them with if/else if blocks for the serial communication. With the if/else statement for the side-to-side tilt, the left switch changed the screen from colorful noise – indicated by println(“flat”) in Processing and (0, 0) from the Arduino – to saturated hues; separately, the if/else statement for the right tilt switched the screen from the same colorful noise – showing that the board was not tilted – to a more striated, and consequently more slowly moving, yellow screen. I multiplied the yoff increment by 5 to stretch the bands of color sideways, across the x-axis.
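The serial protocol here is one comma-separated line per cycle, like “1,0,0,1”, which the Processing sketch breaks apart with split(trim(myString), ","). The same parsing step, sketched in plain Java:

```java
public class TiltParser {
    // Parse one Arduino line like "1,0,0,1\r\n" into the four tilt flags,
    // mirroring split(trim(myString), ",") in the Processing sketch.
    static int[] parseTilt(String line) {
        String[] data = line.trim().split(",");
        int[] tilt = new int[4];
        for (int i = 0; i < 4; i++) {
            tilt[i] = Integer.parseInt(data[i]);
        }
        return tilt; // {L, R, U, D}
    }

    public static void main(String[] args) {
        int[] t = parseTilt("0,1,1,0\r\n");
        System.out.println(t[1]); // prints 1: the board is tilted right
    }
}
```

The trim() is what strips the “\r\n” that Serial.println appends; without it, the last field would fail to parse as an integer.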

Screen Shot 2017-05-20 at 2.10.20 AM

*noise with the xoff increment increased

I worked on making layers with different variables and on using the map function, but both decreased the frameRate. In the version that I presented, the color mode changes to HSB for the left tilt and to RGB for the right, so that the color shifts from saturated while tilted left, to grainy while flat, to RGB color while tilted right. I also added a Perlin noise wave to cover the screen where the up tilt switch meets the side-to-side ones, so that the noise wave would appear like a wavy landscape whenever the tilt switch pointed down. Processing crashed before I saved my most recent edit, which changed several of the vertices so that the wave was wider at the edges of the screen.

Source for 2D Noise Wave: https://processing.org/examples/noisewave.html

*original code in Processing examples

float yoff = 0.0;       // 2nd dimension of perlin noise

void setup() {
  size(640, 360);
}

void draw() {
  background(51);

  fill(255);   // We are going to draw a polygon out of the wave points
  beginShape();

  float xoff = 0;       // Option #1: 2D Noise
  // float xoff = yoff; // Option #2: 1D Noise

  // Iterate over horizontal pixels
  for (float x = 0; x <= width; x += 10) {
    // Calculate a y value according to noise, map to screen position
    float y = map(noise(xoff, yoff), 0, 1, 200, 300); // Option #1: 2D Noise
    // float y = map(noise(xoff), 0, 1, 200, 300);    // Option #2: 1D Noise

    // Set the vertex
    vertex(x, y);
    // Increment x dimension for noise
    xoff += 0.05;
  }
  // increment y dimension for noise
  yoff += 0.01;

  vertex(width, height);
  vertex(0, height);
  endShape(CLOSE);
}

To improve this project, I would use different sensors to shift the colors more smoothly. I think analog sensors would work better for this project than digital ones, so that the colors could change across a range of values up to 255 instead of just switching on and off.
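With an analog sensor, the 10-bit reading (0-1023) would be rescaled to the 0-255 color range with Arduino’s integer map() function, which can be sketched like this:

```java
public class AnalogMap {
    // Arduino-style map(): linearly rescale value from [inLo, inHi] to
    // [outLo, outHi] using integer arithmetic (so results are truncated).
    static long map(long value, long inLo, long inHi, long outLo, long outHi) {
        return (value - inLo) * (outHi - outLo) / (inHi - inLo) + outLo;
    }

    public static void main(String[] args) {
        // A 10-bit analog reading scaled to an 8-bit colour value.
        System.out.println(map(0, 0, 1023, 0, 255));    // prints 0
        System.out.println(map(1023, 0, 1023, 0, 255)); // prints 255
        System.out.println(map(512, 0, 1023, 0, 255));  // roughly half scale
    }
}
```

The Ghost Hunter Arduino sketch earlier in this post uses exactly this call (map(analogRead(0), 0, 1023, 0, 255)) before writing each value to serial.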

ARDUINO tilt sensor:

int buttonPinL = 7;    //button pin left
int buttonPinR = 8;   //button pin right

int buttonPinU = 13;  //button pin up
int buttonPinD = 12;  //button pin down


void setup() {
  // put your setup code here, to run once:
  pinMode(buttonPinL, INPUT);
  pinMode(buttonPinR, INPUT);
  pinMode(buttonPinU, INPUT);
  pinMode(buttonPinD, INPUT);

  Serial.begin(9600); //baud rate

}

void loop() {
  //LEFT RIGHT
  int valueL = digitalRead(buttonPinL);   //read value L (1/0)
  Serial.print(valueL);
  Serial.print(",");                      //separate the values

  int valueR = digitalRead(buttonPinR);   //read value R (0/1)
  Serial.print(valueR);
  Serial.print(",");                      //separate the values

  //UP DOWN
  int valueU = digitalRead(buttonPinU);   //read value U (0/1)
  Serial.print(valueU);
  Serial.print(",");                      //separate the values

  int valueD = digitalRead(buttonPinD);   //read value D (1/0)
  Serial.print(valueD);

  Serial.println();
  delay(10); //delay 10 milliseconds
}

PROCESSING:

import processing.serial.*;
String myString;    //0/1 values in Arduino
Serial myPort;


//SERIAL COMMUNICATION FROM ARDUINO
int tiltL;     //declare variables for left and right
int tiltR;
int tiltU;    //declare variables for up and down
int tiltD;

//PERLIN NOISE
int x = 0;
int y = 0;
int Wx = 0;
int Wy = 0;

float xoff;     //for manipulation later
float yoff;
float increment = .01;
float xincrement = .02;    //variable for x increment
//noise function argument #3 (a global variable that increments once per cycle)
float zoff = 0.0; //zoff increments independently of xoff and yoff
float zincrement = 0.02;
float bright;
float Wyoff = 0.0;

void setup() {
  background(0);
  size(640, 640);
  frameRate(30);     // redraw at 30 frames per second

  printArray(Serial.list()); //list serial devices
  myPort = new Serial(this, Serial.list()[1], 9600); //new port, common data rate
  //clear out the buffer of the port
  myPort.clear();
}

void draw() {
  while (myPort.available() > 0) { //available function (not a variable)
    //put what is in my port into the string
    myString = myPort.readStringUntil(10); // ASCII 10 = the newline Serial.println sends
    //    println(myString);
    //condition to test whether my string is null
    if (myString != null) {
      //      println(myString);
      //split and trim data from Arduino
      String[] data = split(trim(myString), ","); //data from Arduino, split by commas
      //use a loop to print data
      //      print(data);
      for (int i = 0; i < data.length; i++) { //initial value, length of data array
        print(data[i]);
        print(",");
      }
      tiltL = int(data[0]); // turn data into integers
      tiltR = int(data[1]);
      tiltU = int(data[2]); // the Arduino sends four values, so indices run 0-3
      tiltD = int(data[3]);
      println();
    }
  }
  // LEFT TILT: saturated color
  if (tiltL == 1) {
    colorMode(HSB);
    noiseDetail(8, 0.65f);
    loadPixels();
    float xoff = 0.0;                  // start xoff at 0
    for (int x = 0; x < width; x++) {
      xoff += increment;               // increment xoff
      float yoff = 0.0;                // for every xoff, start yoff at 0
      for (int y = 0; y < height; y++) {
        yoff += increment;             // increment yoff
        float bright = noise(xoff, yoff, zoff)*255;
        // hue and saturation both follow the noise value (HSB mode)
        pixels[x+y*width] = color(bright, bright, 255);
      }
    }
    updatePixels();
    zoff += zincrement; // increment zoff
    println("left");
  // FADE SATURATION INTO STATIC
  } else if (tiltL == 0) {
    colorMode(RGB);
    noiseDetail(8, 0.65f);
    loadPixels();
    float xoff = 0.0;                  // start xoff at 0
    for (int x = 0; x < width; x++) {
      xoff += increment;               // increment xoff
      float yoff = 0.0;                // for every xoff, start yoff at 0
      for (int y = 0; y < height; y++) {
        yoff += increment;             // increment yoff
        float bright = noise(xoff, yoff, zoff)*255;
        float hue = noise(xoff, yoff, bright)*255;
        // red and green channels follow the noise values (RGB mode)
        pixels[x+y*width] = color(bright, hue, 255);
      }
    }
    updatePixels();
    zoff += zincrement; // increment zoff
    println("flat");
  }
//FADE STATIC INTO COLOR
    if (tiltR == 0) { 
    noiseDetail(8, 0.65f);
    loadPixels();
    float xoff = 0.0; //start xoff at 0
    for (int x = 0; x < width; x++) {
      xoff += increment;               //increment xoff
      float yoff = 0.0;                 //for every xoff, start yoff at 0
      for (int y = 0; y < height; y++) {
        yoff += increment;               // increment yoff
        float bright = noise(xoff, yoff, zoff)*255;
        float hue = noise(xoff, yoff, bright)*255;
        pixels[x+y*width] = color(bright, hue, 255);
      }
    }
    updatePixels();
    zoff += zincrement; // increment zoff
      println("flat");
  // RIGHT TILT: RGB color
  } else if (tiltR == 1) {
    noiseDetail(8, 0.65f);
    loadPixels();
    float xoff = 0.0;                  // start xoff at 0
    for (int x = 0; x < width; x++) {
      xoff += increment;               // increment xoff
      float yoff = 0.0;                // for every xoff, start yoff at 0
      for (int y = 0; y < height; y++) {
        yoff += increment;             // increment yoff
        float bright = noise(xoff, yoff, zoff)*255;
        pixels[x+y*width] = color(bright, 255, 139);
      }
    }
    updatePixels();
    zoff += zincrement; // increment zoff
    println("right");
  }

  // draw sky
  if (tiltU == 1) {
    noiseDetail(8, 0.65f);
    loadPixels();
    float xoff = 0.0;                  // start xoff at 0
    for (int x = 1; x < width; x++) {
      xoff += increment;               // increment xoff
      float yoff = 0.0;                // for every xoff, start yoff at 0
      for (int y = 1; y < height/3; y++) {
        yoff += 2.5*increment;         // increment yoff
        float bright = noise(xoff, yoff, zoff)*255;
        pixels[x+y*width] = color(135, 150, bright);
      }
    }
    updatePixels();
    zoff += zincrement; // increment zoff
    println("up");
  } else if (tiltU == 0) {
    noiseDetail(8, 0.65f);
    loadPixels();
    float xoff = 0.0;                  // start xoff at 0
    for (int x = 0; x < width; x++) {
      xoff += increment;               // increment xoff
      float yoff = 0.0;                // for every xoff, start yoff at 0
      for (int y = 0; y < height/3; y++) {
        yoff += 5*increment;           // increment yoff
        float bright = noise(xoff, yoff, zoff)*255;
        pixels[x+y*width] = color(255, 255, bright);
      }
    }
    updatePixels();
    zoff += zincrement; // increment zoff
    println("level");
  }
//draw noise wave as landscape
if(tiltD == 1){
  fill(255);
  beginShape();
  float Wxoff = 0;
  for (float Wx = 0; Wx <= width; Wx += 10) {
    float Wy = map(noise(Wxoff, Wyoff), 0, 1, 200,300);
    vertex(Wx, Wy);
    Wxoff += 0.05;
  }
  Wyoff += 0.01;
  vertex(width, height/2);
  vertex(0, height/2);
  endShape(CLOSE);
  println("down");
}else if(tiltD == 0){
println("level");
}
}

Embedded API

I almost forgot about publishing this one exercise!

The exercise is in this site: http://192.168.50.184/~bco220/API_Embedded/about.html

In this particular exercise, I tried to find an appropriate API and build a simple webpage with it. I first tried the Tumblr, SoundCloud, and Twitter APIs, but failed to use them: either the API key takes a long time (up to half a year) to be issued, or the site's documentation does not clearly explain the steps to embed the service and requires joining a developer program first. So I decided to use our trusty Google Maps API, since that is what we learned in class. I implemented this API in my midterm project website, and I found positioning the map a bit challenging so that it did not look cramped. I believe it can look better than it does now, but for the most part I am glad that I successfully embedded the API.
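For anyone retracing this, the lightest way to drop a Google map into a page is the Maps Embed API, which only needs a single iframe. The snippet below is a placeholder sketch, not the markup from my site: YOUR_API_KEY stands in for a real key, and the place query and dimensions are arbitrary.

```html
<!-- Minimal Maps Embed API usage; YOUR_API_KEY and the q= query are placeholders -->
<iframe
  width="600"
  height="450"
  style="border:0"
  loading="lazy"
  allowfullscreen
  src="https://www.google.com/maps/embed/v1/place?key=YOUR_API_KEY&q=Century+Avenue,Shanghai">
</iframe>
```

The full JavaScript Maps API gives more control over markers and styling, but the iframe version sidesteps most of the layout and positioning work inside the page.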

Screenshot (56)

Museum of Mediocre Artefacts: Nick Sanchez’s Documentation

Long ago, internationally infamous inventor Jingles Fakhr sought to make his name known… After a long and toilsome inventing career, creating useless and inoperable oddities, he finally made his breakthrough discovery… the Perpetual Light Machine!

For our project, we sought to make an exhibit whereby users would be drawn down a long and scary dark hallway. At the end, a contraption of some sort would sit idly, willing unsuspecting guests to draw nearer and observe it. Once they did, we behind the scenes would do something to scare them. This was the premise.

Much effort and time was spent on ideation, and many ideas were either dropped entirely or subtly embedded into the final concept. This was tedious and occupied much of our time. Nevertheless, we persisted, and eventually settled on a loose idea centered around a fictional inventor named “Jingles Fakhr”. The story was that Dr. Jingles was one of the many inventors of the 1800s who, like his contemporaries Edison and Tesla, sought to experiment with electricity and light. Though many of his inventions didn’t work (some of which we would show as exhibits to provide context during the show), his one successful invention was the “Perpetual Light Machine”. The conflict of our story arises when we share that this invention has dubious origins, causing many who view it to feel uneasy, hallucinate, and in some cases, go crazy. It is for these “reasons” that we keep this artefact hidden behind a curtain, and discourage all but the bravest guests from venturing in to observe it. After they do, we would go about staging our fright.

We had picked the corner of the IMA floor where the lockers stood as the space where we would stage this experience. To create this stage, we angled the lockers so that the hallway narrowed as you walked towards its end. The idea was to place the “Perpetual Light Machine” at the end of the hallway, so that people would feel increasingly claustrophobic as they neared it. Once audience members walked towards it and observed it, we would surreptitiously place a costumed mannequin behind them. Once the mannequin was in position, a similarly costumed actor would jump out at them, causing them to recoil and turn around. At this point, they would suddenly see the mannequin that hadn’t been behind them before, and become even more terrified.

This was the plan. The challenge became preparing for it. We coordinated with IMA staff to order several key props and set pieces off of Taobao: curtains to cover the entire stage, a mannequin, a head for the dress form, and some masks and hoods to costume the mannequin and the actor.

Initially, we planned to fabricate some broken electronics to represent the two initial oddities shown before the final Perpetual Light Machine. To be honest, I made an automaton that could have turned with an Arduino, but I never implemented the circuitry to actually animate it. Nevertheless, this “automaton” was creepy and clearly dysfunctional, which was the point. In addition, we never really got around to creating a phonograph-like pair of headphones. Consequently, we only had the dysfunctional automaton to show as the pretext to the Perpetual Light Machine.

The Perpetual Light Machine prop was a borrowed student project from Sun Jingyi. It was a 3mm acrylic translucent pyramid, which would glow based on the Arduino-LED setup underneath it.

Setting it up was not too difficult, but we improvised as we went along, making the whole process a little more time-consuming. Nevertheless, the end result was rewarding and entirely worth it.
