Final Project

CONCEPTION AND DESIGN:

The intended users of our project are people who just want to take a moment to relax and have fun, anywhere and anytime. So we wanted to keep the procedure simple and convenient for the users while focusing more on the design of the animation and frames. Additionally, in order to make the storyline behind the game more coherent and understandable, we tried different ways to make it hold together logically. Inspired by the Switch gamepad, we first used several buttons to recreate a controller and also tried using a potentiometer to control the image. However, it did not work as well as we had imagined during the user testing session. We also considered more complicated interactions, like using a pressure sensor or a motion sensor to control the image, but in the end we felt that would drift away from our original intention of making a game that is simple to play for a little break. By using a joystick instead, we could mimic a game controller that combines direction control and button pressing while staying small and easy to carry.
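
As a rough sketch of that joystick-to-screen idea, the Processing side could rescale whatever range the joystick reports onto the canvas before drawing the rabbit. This is only an illustration with assumed values: a 10-bit (0 to 1023) reading and a plain ellipse standing in for the rabbit and for the serial input we actually use.

// Hypothetical sketch of the joystick mapping described above.
// rawX/rawY stand in for sensorValues[0] and sensorValues[1],
// which the real project fills in from the Arduino via updateSerial().
int rawX = 512;   // assumed 10-bit joystick reading, 0-1023
int rawY = 512;

void setup() {
  size(500, 500);
}

void draw() {
  background(#010C1C);
  // Map the assumed 0-1023 joystick range onto the 500x500 canvas
  float x = map(constrain(rawX, 0, 1023), 0, 1023, 0, width);
  float y = map(constrain(rawY, 0, 1023), 0, 1023, 0, height);
  ellipse(x, y, 20, 20);   // stand-in for the rabbit image
}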

 

FABRICATION AND PRODUCTION:

The most significant steps in the production process, in terms of failures, were trying to solve the "So what?" problems. Since we wanted to stand in the users' perspective, we wanted to make sure the procedure made sense to them. During the user testing session we showed users the first part of the game, in which the player steers the rabbit to the moon, but we received many "So what?" questions, for example, what happens after you succeed, and why a rabbit and the moon. Based on this feedback, I thought we should add an introduction to the background story and create an ending animation or poster. We also decided to expand it into a two-step game and add more dimensions. However, even though we successfully created the second step, it still seemed confusing to people who do not have background knowledge of the traditional story. In terms of successes, the most significant step was figuring out how to keep the rabbit inside the moon once it has entered, so that it does not interfere with the next game. The scoring setup, in which the player loses points when touched by the rocks or even their broken-off pieces, was also difficult to achieve, and I'm glad we managed it in the end.
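
A minimal sketch of those two mechanics, with made-up positions, sizes, and thresholds rather than the exact values from our code: once the rabbit gets close enough to the moon it is locked there, and any rock that comes within the combined radii costs a ship.

// Hypothetical sketch of the two mechanics described above:
// locking the rabbit onto the moon, and losing a ship on rock contact.
int ships = 3;
boolean onMoon = false;
float rockX = 400, rockY = 100;   // one stand-in rock

void setup() {
  size(500, 500);
}

void draw() {
  background(#010C1C);
  // The mouse stands in for the joystick; once on the moon, the rabbit stays there
  float rabbitX = onMoon ? 250 : mouseX;
  float rabbitY = onMoon ? 250 : mouseY;

  if (dist(rabbitX, rabbitY, 250, 250) < 25) {
    onMoon = true;
  }

  // Collision with a rock: assumed radii of 25 (rabbit) + 10 (rock) pixels
  if (!onMoon && dist(rabbitX, rabbitY, rockX, rockY) < 35) {
    ships--;
    rockY = -50;   // reset the rock so one hit costs only one ship
  }

  ellipse(250, 250, 100, 100);         // moon
  ellipse(rabbitX, rabbitY, 50, 50);   // rabbit
  ellipse(rockX, rockY, 20, 20);       // rock
  rockY += 2;                          // the rock drifts downward
  text("Ships: " + ships, 15, 15);
}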

CONCLUSIONS:

The goal of our project is to create an easy game that people can play in a convenient way to relax and have fun. Additionally, we want to bring a traditional Chinese myth into the context and build the game around it.

In terms of my definition of interaction, I first considered it as one part's reaction or response to another part, which should be physical, satisfying one part's need and having an effect upon the other. However, our project does not fully align with that physical-reaction idea, since it focuses more on responses on the screen. During the research process I started to realize that interaction should not be restricted to physical reactions; it works in many dimensions. It does not necessarily need to be communication between people: feedback from operating a computer can also be considered interaction. As for the two-way-effect aspect, we think we followed my definition.

For our project, we want users to join a visual game by controlling a joystick to move their position and pressing a button to shoot bullets to win the challenge. Our plan for future improvement is to add more background introduction and create an ending animation showing that the rabbit and the fairy finally get back together on the moon. We could also add a motion-sensing option, which might make the game more entertaining and more distinctive than a regular phone game, and add more sound effects to make the game feel more finished.
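
For the sound effects, a minimal sketch using Minim (the library we already use for the background track) could load a short clip and trigger it every time a bullet is fired. The file name "shoot.wav" is a placeholder, not an asset we actually have.

// Hypothetical sketch: trigger a short sound effect on every shot with Minim.
import ddf.minim.*;

Minim minim;
AudioSample shootSound;

void setup() {
  size(500, 500);
  minim = new Minim(this);
  // Placeholder clip; it would live in the sketch's data folder
  shootSound = minim.loadSample("shoot.wav", 512);
}

void draw() {
  background(0);
}

void mousePressed() {
  // In the real game this would sit next to shots.add(new shot(...))
  shootSound.trigger();
}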

So although we wanted to primarily consider the users' perspective while designing, the result turned out to be confusing to some of the users. Even though they could still play the game, it felt no different from any other phone game and lost its specialty. My takeaway is that you cannot simply assume how users will respond; you need to actually let users test the project and listen to their feedback.

import ddf.minim.*;
AudioPlayer player;
Minim minim;

float secondsRadius;
boolean gameStarted = false;

int elapsedTime1;
int elapsedTime2;
int time;
PImage rabbit;
PImage moon;
PImage astroid;
float b;

int x1;
int y1;
int dis;

import processing.serial.*;

String myString = null;
Serial myPort;
int NUM_OF_VALUES = 2;   /** YOU MUST CHANGE THIS ACCORDING TO YOUR PROJECT **/
int[] sensorValues;   


ArrayList<shot> shots = new ArrayList<shot>();
ArrayList<astroid> astroids = new ArrayList<astroid>();

// Settings - frames between each new astroid (2 seconds = 2 * 60 frames)
int astroid_rate = 2 * 60;
int astroid_count = 0;
// Size in pixel of nominal astroid
float ast_size = 10;
int ast_id = 1;
int score = 0;
float hitRate = 0;
int numShots = 0;
int ships = 3;

int pause = 0;

// Run once
void setup () {
  frameRate(60);
  size(500,500);
  stroke(255);
  fill(255);
  
  rabbit = loadImage("rabbit.png");
  moon = loadImage("moon.png");
  astroid = loadImage("star.png");
  setupSerial();

  minim = new Minim(this);
  player = minim.loadFile("Thomas Greenberg - The Right Path.mp3", 1024);
}

// Called 60 times per second
void draw()
{
  updateSerial();
  printArray(sensorValues);
  
  colorMode(HSB, 360, 100, 100, 255);

  player.play();

  // Joystick position from Arduino drives the rabbit
  x1 = sensorValues[0];
  y1 = sensorValues[1];
  print(x1);

  // The moon brightens as the rabbit gets closer
  float dis1 = dist(250, 150, x1, y1);
  b = map(dis1, 0, 430, 100, 0);

  background(#010C1C);
  imageMode(CENTER);
  tint(0, 0, b);
  image(moon, 250, 250);
  moon.resize(100, 100);

  noTint();
  image(rabbit, x1, y1);
  rabbit.resize(50, 50);

  // Elapsed time in seconds since the sketch started
  elapsedTime1 = millis() - time;
  elapsedTime2 = elapsedTime1 / 1000;
  text(elapsedTime2, 20, 70);
  println(millis());
  
   

  
  int i;
  // Find the angle from x=250, y=250 to the mouse
  float angle = atan2(mouseY - 250, mouseX - 250);

  if (pause==0) {

    // Spawn a new astroid every astroid_rate frames (2 seconds at 60 fps)
    if (astroid_count--==0) {
      astroids.add(new astroid(random(0, TWO_PI), random(0.1, 2.5), random(0.5, 4), random(-0.1, 0.1), 
        random(-150, 150), random(-150, 150), ast_id++));
      // Shorten the interval slightly so astroids appear more and more often
      astroid_count = astroid_rate--;
    }

    // Clear screen, black
    //background(0);
    background(#010C1C);
    imageMode(CENTER);
    tint(0, 0, b);
    image(moon, 250, 250);
    moon.resize(100, 100);

    noTint();
    image(rabbit, x1, y1);
    rabbit.resize(50, 50);

    // Go through all astroids (if any) and update their position
    for (i = 0; i<astroids.size(); i++) {
      astroid a = astroids.get(i);
      if (a.update()) {
        // Remove astroid if it left the screen; step back so the next one isn't skipped
        astroids.remove(i);
        i--;
      }
      // Detect collisions with Astroids by approximating ship with 4 circles
      // fill(160, 33, 100);  
      // ellipse(250, 250, 11, 11);
      // ellipse(13*cos(angle-PI)+250, 13*sin(angle-PI)+250, 17, 17);
      // ellipse(10*cos(angle)+250, 10*sin(angle)+250, 7, 7);
      // ellipse(18*cos(angle)+250, 18*sin(angle)+250, 2, 2);
      if (a.coll(250, 250, 6, -1) ||
        a.coll(13*cos(angle-PI)+250, 13*sin(angle-PI)+250, 9, -1) ||
        a.coll(10*cos(angle)+250, 10*sin(angle)+250, 4, -1) ||
        a.coll(18*cos(angle)+250, 18*sin(angle)+250, 1, -1)) {
        ships--;
        pause=3*60;
      }
    }

    // "pushMatrix" saves current viewpoint
    pushMatrix();
    // Set 250,250 as the new 0,0 
    translate(250, 250);
    // Rotate screen "angle" 
    rotate(angle);
    fill(255);
    // Draw a triangle (the ship)
    triangle(20, 0, -20,-10, -20,+10);
    // Bring back normal perspective
    popMatrix();
  } else {
    // Pause is larger than 0
    // Clear screen, black
    //background(0, 10);
    background(#010C1C);
    imageMode(CENTER);
    tint(0, 0, b);
    image(moon, 250, 250);
    moon.resize(100, 100);

    noTint();
    image(rabbit, x1, y1);
    rabbit.resize(50, 50);

    // Go through all astroids (if any) and update their position
    for (i = 0; i<astroids.size(); i++) {
      astroid a = astroids.get(i);
      a.incSpeed();
      if (a.update()) {
        // Remove astroid if it left the screen; step back so the next one isn't skipped
        astroids.remove(i);
        i--;
      }
    }
    if (ships == 0) {
      // Clear screen, black
      textAlign(CENTER);
      text("Game Over", width/2, height/2);
      text("Press key R to restart", width/2, 2*height/3);
      // 1 new astroid every 0.5 seconds (60 fps * 0.5 sec)
      // To make something happen while waiting 
      if (astroid_count--==0) {
        astroids.add(new astroid(random(0, TWO_PI), random(0.1, 2.0), random(0.5, 4), random(-0.1, 0.1), 
          random(-150, 150), random(-150, 150), ast_id++));
        // Increase rate just a little
        astroid_count = 30;
      }
      if (keyPressed && (key == 'r' || key == 'R')) {
        score = 0;
        numShots = 0;
        ships = 3;
        astroid_rate = 3 * 60;
        astroid_count = 0;
        ast_id = 1;
        astroids = new ArrayList<astroid>();
      }
    } else {
      // Wait until astroids are gone
      if (astroids.size()==0) {
        pause=0;
      }
    }
  }
  for (i = 0; i<shots.size(); i++) {
    shot s = shots.get(i);
    if (s.update()) {
      // Remove bullet, if outside screen or if hits astroid
      shots.remove(i);
    }
  }
  textAlign(LEFT);
  text("Score   : " + score, 15, 15);
  text("Ships   : " + ships, 15, 30);
  text("Hit rate: " + int(100*score/float(numShots)) + "%", 15, 45);
  text("Let's go back to the moon in 5 seconds!",15,60);
  text(elapsedTime2, 15, 75);
  
  // Reaching the moon ends this round
  if (dis1 < 25) {
    text("YOU SUCCEEDED!", width/2, 3*height/4);
    ships = 0;
  }
}

// When left mouse button is pressed, create a new shot
void mousePressed() {
  if (pause==0) {
    // Only add shots when in action
    if (mouseButton == LEFT) {
      float angle = atan2(mouseY - 250, mouseX - 250);
      shots.add(new shot(angle, 4));
      numShots++;
    }
    if (mouseButton == RIGHT) {
      astroids.add(new astroid(random(0, TWO_PI), random(0.1, 2.0), random(0.5, 4), random(-0.1, 0.1), 
        random(-80, 80), random(-80, 80), ast_id++));
    }
  }
}
class shot {
  // A shot has x,y, and speed in x,y. All float for smooth movement
  float angle, speed;
  float x, y, x_speed, y_speed;

  // Constructor  
  shot(float _angle, float _speed) {
    angle = _angle;
    speed = _speed;
    x_speed = speed*cos(angle);
    y_speed = speed*sin(angle);
    x = width/2+20*cos(angle);
    y = height/2+20*sin(angle);
  }

  // Update position, return true when out of screen
  boolean update() {
    int i;
    x = x + x_speed;
    y = y + y_speed;

    // Draw bullet
    ellipse (x, y, 3, 3);

    // Check for collisions
    // Go through all astroids (if any)
    for (i = 0; i<astroids.size(); i++) {
      astroid a = astroids.get(i);
      if (a.coll(x, y, 3, -1)) {
        score++;
        ast_id++;
        astroids.remove(i);
        //Remove bullet
        return true;
      }
    }
    // End, check if outside screen
    if (x<0 || x>width || y<0 || y>height) {
      return true;
    } else {
      return false;
    }
  }
}
class astroid {
  // An astroid angle, speed, size, rotation
  float angle, speed, size, rotSpeed;
  float position;
  float rotation;
  float xoff, yoff;
  float x, y;
  PShape s;  // The PShape object - Keeps the astroid shape
  float i;
  int id;


  // Constructor  
  astroid(float _angle, float _speed, float _size, float _rotSpeed, float _xoff, float _yoff, int _id) {
    angle = _angle;
    speed = _speed;
    size = _size;
    rotSpeed = _rotSpeed;
    xoff = _xoff;
    yoff = _yoff;
    id = _id;
    if (xoff<1000) {
      x = 250+500*cos(angle)+xoff;
      y = 250+500*sin(angle)+yoff;
    } else {
      x = _xoff-2000;
      y = _yoff-2000;
    }
    rotation = 0; 
    // Generate the shape of the astroid - Some variations for all
    s = createShape();
    s.beginShape();
    s.fill(255, 255, 100);
    s.noStroke();
    for (i=0; i<TWO_PI; i=i+PI/(random(4, 11))) {
      s.vertex(random(ast_size*0.8, ast_size*1.2)*cos(i), random(ast_size*0.8, ast_size*1.2)*sin(i));
    }
    s.endShape(CLOSE);
  }
void incSpeed() {
    speed = speed * 1.02;
  }

  // Update position, return true when out of screen
  boolean update() {
    int i;
    x = x - cos(angle)*speed;
    y = y - sin(angle)*speed;
    rotation = rotation + rotSpeed; 

    // Check for astroid vs astroid collision
    for (i = 0; i<astroids.size(); i++) {
      astroid a = astroids.get(i);
      if ((a != this) && (a.coll(x, y, ast_size*size, id))) {
        if (size > 1) {
          astroids.add(new astroid(angle-random(PI/5, PI/7), speed+random(0, speed/2), size/2, rotSpeed, 2000+x, 2000+y, id));
          astroids.add(new astroid(angle+random(PI/5, PI/7), speed+random(0, speed/2), size/2, rotSpeed, 2000+x, 2000+y, id));    
          ast_id++;
        }
        astroids.remove(i);
      }
    }

    pushMatrix();
    // Set position as the new 0,0 
    translate(x, y);
    // Rotate by the astroid's current rotation
    rotate(rotation);
    // Draw astroid
    scale(size);
    shape(s, 0, 0);
    // Bring back normal perspective
    popMatrix();

    if (x<-300 || x>800 || y<-300 || y>800) {
      return true;
    } else {
      return false;
    }
  }
boolean coll(float _x, float _y, float _size, int _id) {
    float dist;

    dist = sqrt ((x-_x)*(x-_x) + (y-_y)*(y-_y));

    // Check if distance is shorter than astroid size and other objects size
    if ((dist<(_size+ast_size*size)) && (id!=_id)) {
      // Collision, 
      if (_id>0) id = _id;
      if (size > 1) {
        // If the astroid was "large" generate two new fragments
        astroids.add(new astroid(angle-random(PI/5, PI/7), speed+random(0, speed/2), size/2, rotSpeed, 2000+x, 2000+y, id));
        astroids.add(new astroid(angle+random(PI/5, PI/7), speed+random(0, speed/2), size/2, rotSpeed, 2000+x, 2000+y, id));
      }
      return true;
    } else { 
      return false;
    }
  }
}

void setupSerial() {
  printArray(Serial.list());
  myPort = new Serial(this, Serial.list()[2], 9600);
  // WARNING!
  // You may get an error here.
  // Change the index above (currently 2) to 0 and try running it again.
  // Then check the printed list of ports,
  // find the port "/dev/cu.usbmodem----" or "/dev/tty.usbmodem----"
  // and replace the index with that port's position in the list.

  myPort.clear();
  // Throw out the first reading,
  // in case we started reading in the middle of a string from the sender.
  myString = myPort.readStringUntil( 10 );  // 10 = '\n', linefeed in ASCII
  myString = null;

  sensorValues = new int[NUM_OF_VALUES];
}
void updateSerial() {
  while (myPort.available() > 0) {
    myString = myPort.readStringUntil( 10 ); // 10 = '\n', linefeed in ASCII
    if (myString != null) {
      String[] serialInArray = split(trim(myString), ",");
      if (serialInArray.length == NUM_OF_VALUES) {
        for (int i=0; i<serialInArray.length; i++) {
          sensorValues[i] = int(serialInArray[i]);
        }
      }
    }
  }
}

Recitation 10

My project uses a potentiometer to control the degree of pixelation and thereby the clarity of an image. By sending the data from the Arduino, which reads the potentiometer, into Processing as input, Processing can change the clarity of the image accordingly.
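
A minimal sketch of that idea, with assumptions: mouseX stands in for the potentiometer value that the real project receives over serial, and "photo.jpg" is a placeholder image in the sketch's data folder. The incoming value sets the pixel block size, so a higher value makes the image blockier and less clear.

// Hypothetical sketch: one incoming value controls the pixelation of an image.
PImage img;

void setup() {
  size(500, 500);
  img = loadImage("photo.jpg");   // placeholder image name
  img.resize(width, height);
  noStroke();
}

void draw() {
  // In the real project this value comes from the potentiometer via serial
  int blockSize = int(map(mouseX, 0, width, 2, 40));
  for (int y = 0; y < height; y += blockSize) {
    for (int x = 0; x < width; x += blockSize) {
      fill(img.get(x, y));                // sample one pixel per block
      rect(x, y, blockSize, blockSize);   // draw the block in that colour
    }
  }
}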

As stated in the reading Computer Vision for Artists and Designers, with examples such as Myron Krueger's legendary Videoplace, the computer plays an important role in the process of making an interactive design. Seeing all the examples provided, my teammate and I gained some new thoughts on how to use computers and coding to make the interaction more convenient and entertaining.

Recitation 9 Serial Communication

Exercise 1:

Components: two potentiometers

Code:

  1. Processing
  2. Arduino

Video:

 

Exercise 2:

Component: Buzzer

Code:

Processing:

import processing.serial.*;

int NUM_OF_VALUES = 2; /** YOU MUST CHANGE THIS ACCORDING TO YOUR PROJECT **/

Serial myPort;
int valuesfromArduino;

void setup() {
  size(500, 500);
  background(0);

  printArray(Serial.list());
  myPort = new Serial(this, Serial.list()[47], 9600);
}

void draw() {
  myPort.write(mouseX);
  myPort.write(mouseY);
}

Arduino:

int valuefromProcessing;
int buzzerPin = 11;

void setup() {
  Serial.begin(9600);
  //pinMode(buzzerPin, OUTPUT);
}

void loop() {
  // to receive a value from Processing
  while (Serial.available()) {
    valuefromProcessing = Serial.read();
  }

  tone(buzzerPin, map(valuefromProcessing, 0, 600, 100, 1000));

  // too fast communication might cause some latency in Processing
  // this delay resolves the issue.
  delay(0);
}

 

Final Project Proposal Essay


A. Project Title

 Melody Painter 

B. Project Statement of Purpose *

The projects we found interesting during our research share the same characteristic: their response takes the form of music. That inspired me to make something that makes playing and composing music easier. So we want to make a project that enables everyone to play and compose music and enjoy the fun without worrying about a lack of instruments or technique.

C. Project Plan *

Our plan can be divided into three steps. First, we plan to use Processing to create a sketchbook. Then we will connect the Arduino with Processing so that changes in the Processing inputs lead to changes on the Arduino. Finally, the Arduino passes the command on to the speaker/buzzer circuit to make it produce sound. So when the mouse touches the different ellipses on the sketchbook, it will play the corresponding melody, letting the user compose their own song (a small sketch of this step follows).
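
As a first sketch of step one, under these assumptions: three ellipses stand in for the sketchbook, and instead of writing to the serial port we just print the note index. In the full plan that index would be sent to the Arduino (for example with myPort.write(note)) so the buzzer can play the matching tone.

// Hypothetical sketch: detect which ellipse the mouse is touching.
float[] xs = {125, 250, 375};   // assumed ellipse centres
float r = 40;                   // assumed ellipse radius

void setup() {
  size(500, 500);
}

void draw() {
  background(0);
  int note = -1;
  for (int i = 0; i < xs.length; i++) {
    boolean over = dist(mouseX, mouseY, xs[i], height/2) < r;
    if (over) note = i;
    fill(over ? 255 : 120);   // highlight the ellipse under the mouse
    ellipse(xs[i], height/2, 2*r, 2*r);
  }
  if (note >= 0) {
    println("play note " + note);   // stand-in for sending the note to the Arduino
  }
}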

D. Context and Significance *

After sharing our new, expanded definitions of interaction in the last recitation, Emily and I both found that instead of restricting communication to physical activity, a response to a change in one part can also be considered interaction. Two projects inspired us during the research process. The first one, Weather Thingy, fits our new understanding of interaction in that it plays different music on a keyboard in response to changes in the weather. So we also want to create something that combines non-physical activity with music playing. The second is a Processing project that uses changing images and music to respond to mouse clicks, and it gave us an idea of how to combine Processing and Arduino. Our intended users are generally people who would like to compose or play with music but do not have much professional knowledge, skill, or access to instruments. They can simply move their mouse to compose while playing with the moving images on the screen.

If this is completed successfully, we think a follow-up project could be designed for babies, since music and changing images and colours can be good exercise for their brain development. So instead of using a computer, it could be a toy with many sensors inside designed for them.

Recitation 8: Final Research

A. Go back to your definition of interaction as you defined it during the group project. How has your work in this course involved that definition?

In my understanding, interaction means that one part can sense the motion of another part and respond to it. But unlike a reaction, which is just a simple response, interaction is an act upon one another that produces a new effect. Our group project followed this idea: we created a locker that can show the user the weather, their class schedule, and upcoming events. But we thought we should make it more interactive by adding some physical responses, like handing out the umbrella and class materials.

B. Based on the initial definition and your experience developing and executing your midterm project, how has your definition of interaction evolved? Be as specific as possible.

In the process of creating the midterm project, instead of focusing only on the idea, we paid more attention to what we could do in a practical way and to the process of making it happen using what we have learned so far. What's more, we began by thinking about the target users and their needs first, and then completed our project based on those needs. I also found that interaction does not necessarily involve physical motion from both parts; sometimes, reacting to a change in the other part can also be considered interaction.

C. Research two other interactive projects, one that aligns with your definition of interaction and one that differs from it.

1. Created by Adrien Kaeser, Weather Thingy is a custom-built sound controller that uses real-time climate-related events to control and modify the settings of musical instruments. This is something I think should also be included in my original concept of interaction: interaction does not necessarily have to be between machines, or between people and a machine.

2.


House Party – Scavenged, mechanical and synthesised orchestra

Created by Neil Mendoza, House Party is a musical installation that explores prized possessions in their native habitat. All the materials used to create this artwork were scavenged from discarded trash. The music is a mix of mechanical and synthesized sounds.

Even though it also involves musical reactions, this project diverges from my definition of interaction in that it is more of a reaction or response than an interaction, because it only involves one part.

D. Write a “new” definition of interaction in your own words.

Interaction means that one part can sense the motion or change of another part and respond to it. But unlike a simple response, interaction is an act upon one another that produces a new effect. The two parts are not restricted to machines or people but can come from a broader range.

Recitation 6 Processing Basics

The image I chose:

The reasons why I chose this image:

First of all, this is a really beautiful image that reminds me of the universe and the planets, which I'm obsessed with. Looking at it closely, I noticed that it is not constructed from identically coloured circles; all the circles differ in size, stroke, layering, intersection, and transparency. So I thought this would be a good way to practice the basic functions we've learned in class. I also wanted to figure out how to blur the stroke line the way the biggest circle of this image does.

Preconceptions:

My initial thought was to follow the interplanetary idea and try as many ways of constructing circles as possible. Since we had already learned how to define a circle's size, fill colour, and position, how to add an outline stroke, and how to change the stroke's colour and thickness, my plan began with placing circles of different sizes in different places and letting them overlap. Then, by trying different stroke widths and colours, I planned to create the layering (a small sketch of this idea follows).
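
A small sketch of that plan, with invented colours and positions rather than the ones from my final code: overlapping ellipses with different sizes, stroke weights, and fill transparencies to build up the layering.

// Hypothetical sketch: overlapping circles with different sizes, strokes,
// and transparencies to suggest a layered, halo-like composition.
void setup() {
  size(500, 500);
  background(10, 10, 40);
  noLoop();
}

void draw() {
  // Largest, faintest circle at the back
  noStroke();
  fill(255, 255, 255, 30);
  ellipse(250, 250, 400, 400);

  // Mid-size circle with a thick coloured stroke
  stroke(180, 120, 255);
  strokeWeight(6);
  fill(120, 80, 200, 80);
  ellipse(200, 230, 220, 220);

  // Small, bright circle overlapping the others
  stroke(255);
  strokeWeight(2);
  fill(255, 220, 150, 160);
  ellipse(310, 280, 120, 120);
}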

Similarities and Differences:

Similarly, I used different types of circles to construct the main part of my image. However, while creating it, I noticed that my image looked more like a halo than a galaxy, so I added a rhombus to suggest the light source. I also failed to reproduce the blurring effect from the original. I think Processing is a powerful design tool, but since we have only learned the basic functions so far, it was hard to create the layering and three-dimensional effect.

My final code and sketch:

Week 8: Midterm Project Blog Post

1.   PROJECT NAME:

The Doggie (by Jacqueline Wang and Emily Deng)

2. PROJECT STATEMENT OF PURPOSE (150-250 words):  

The issue our project intends to address is that keeping a pet dog now takes a lot of effort, care, and money, which is hard to manage with people's busy lifestyles. Yet the satisfaction and sense of companionship a dog brings is irreplaceable. So we want to create a "dog" that can keep people company and play with them without the burden of responsibilities like providing a safe yard or walking the dog every day. It is also safer for families with little kids. We also consider this a great idea that could be extended for people with visual impairments, since it could be designed as a guide dog that not only keeps them company but also senses obstacles on the way and alerts its owner.

3.     LITERATURE AND ART, PERSPECTIVES AND CONTEXTS:       

  • Art + Science Now, Stephen Wilson

We found a project called Feral Robot Dog in this reading, which is similar to our project. However, that dog is designed to detect environmental problems with a pollution sensor. It can also perform dog-like actions to inform and interact with its users, which is exactly what we want to achieve in our project. Reading this description is how we formed our idea of interacting with the user.

  • The Art of Interactive Design, Crawford

This is the reading that inspired us before we formed the concept of what we were going to make. We wanted our project to focus on interaction and the process of input and output, but we did not have a clear plan for achieving that. This reading gave us a concept of interaction centred on "listen, think, and speak." Since what we have learned cannot yet reach the level of thinking, we tried to emphasize listening and speaking. That made us decide to make a dog that can bark and wag its tail to interact with people.

4.      PROJECT DESCRIPTION:    

Problem: how to find the sensor and motor that best perform the functions we expected.

In order to achieve the interactions we expect our project to be capable of, we wanted to find the set of motor and sensor that could support such functions without going beyond our knowledge. The two interactions we designed for the Doggie are actually the most fundamental pet-dog behaviours: barking when patted and wagging its tail when it senses people nearby. To work this out, we started by comparing the different sensors and motors we had learned about during lectures and recitations, and we decided to use the IR sensor and the servo for the tail, and the pressure sensor and buzzer for the barking.

At first, Emily and I both found it difficult to decide which sensor or motor to use. Since we had just learned how to use the DC motor to control an "arm" to write and draw, my first choice for the motor controlling the tail was the DC motor. However, during user testing we noticed that the DC motor is too big and heavy to hide inside the tail and make it wag, and a faculty member suggested we try the servo instead. So we went back and changed it to the servo. For the barking part, since we had not learned anything about the pressure sensor, we went back to the class slides, looked through all the sensors provided, and decided to use that one.

5.    PROJECT SIGNIFICANCE:        

From the purpose of this project, I consider our project's significance to be amusement and comfort. The target audience is families with little children, people who live alone, and the elderly. For them, having a pet dog at home to play with and keep them company, without the responsibility of caring for it, saves a lot of trouble. As for long-term value, although this is designed for entertainment, I think it would be great if we could add more features and make it qualify as a guide dog to help people with visual impairments.

6.      PROJECT DESIGN & PRODUCTION:

We used an IR sensor, a pressure sensor, a buzzer, and a servo. We made this decision after trying different options. For example, we tried using the DC motor to make the tail wag, but changed it because the motor was too heavy and needed an extra "arm" to move the tail. Even after switching to the servo it was still hard to make the tail wag well, so instead of the plastic propeller I attached a pen cap, which fit the tail better, and we added cotton in different places to hold the servo and make the wagging more obvious, until the tail finally wagged as we expected. During the User Testing Session, we found that the pressure sensor was not very sensitive and the buzzer was too quiet when inserted in the dog's body, so we swapped in another pressure sensor and left the buzzer outside so the sound could be heard more clearly. What's more, after looking at other projects during user testing, we found that the IR sensor can still sense distance even when hidden inside the dog, so we ended up putting the sensor in the dog's nose, which made the project look a lot better.

 7.     CONCLUSIONS

In my opinion, although it is not complicated, this is an overall satisfying project. During the process we combined what we learned in class and in recitations to make this project work, and after several rounds of testing and revising, the result meets our initial expectations.

This project is designed to meet the needs of modern people who live busy, stressful lives but still want a pet dog. I hope it can bring them a moment of relaxation and a sense of companionship, and keep children's dream of having a dog alive.

However, there is still a lot of room for improvement. I found that my lack of coding knowledge is really a limitation when creating a project, since I had to ask my partner to check the code for me, but I also found that I'm better at the handcraft and design part. If we had more time, we would definitely try 3D printing and replace the buzzer with a speaker that can imitate a real dog's bark. What's more, after seeing other groups' projects, I think we could also add motors in its legs so it can move and run.

 

Demo Video:

References:

Crawford, Chris. The Art of Interactive Design: An Euphonious and Illuminating Guide to Building Successful Software.

Wilson, Stephen. Art + Science Now.

 

Recitation 4

Documentation:

There are three steps we were asked to complete. The first two steps were to be done individually and the third one in pairs.

Materials:

For Steps 1 and 2

  • 1 * 42STH33-0404AC stepper motor
  • 1 * SN754410NE ic chip
  • 1 * power jack
  • 1 * 12 VDC power supply
  • 1 * Arduino kit and its contents

 

For Step 3

  • 2 * Laser-cut short arms
  • 2 * Laser-cut long arms
  • 1* Laser-cut motor holder
  • 2 * 3D printed motor coupling
  • 5 * Paper Fasteners
  • 1 * Pen that fits the laser-cut mechanisms
  • Paper

Question 1:

What kind of machines would you be interested in building? Add a reflection about the use of actuators, the digital manipulation of art, and the creative process to your blog post.

 

Question 2:

Choose an art installation mentioned in the reading ART + Science NOW, Stephen Wilson (Kinetics chapter). Post your thoughts about it and make a comparison with the work you did during this recitation. How do you think that the artist selected those specific actuators for his project?

Group Project

Part 1:

In my understanding, interaction means that one part can sense the motion of another part and respond to it. But different from the reaction, which is just a simple response, interaction shows an act upon one another to produce a new effect.

Part 2:

A. Face Trade, created by Matthias Dörfelt

Face Trade – Art vending machine that trades mugshots for “free” portraits

The project I chose creates a playful tension around what personal, biometric information people are willing to give up to permanent, public storage in order to receive a "free" item. It looks like a photo booth, but instead of taking a picture it captures an expressive face that visually lives somewhere between a mask and an actual face, with intentional redundancy in some of the facial features.

B. The chAIr Project

This is a project I'm not fond of. Its goal, as described, is "exploring the reversal of human and machine roles in the design process and industrial production. It explores co-creativity between humans and AI, taking the chair — the archetype of a designed object — as an example." But from my perspective, I cannot see a strong connection between the two throughout the process or in the result, and I don't think the interactivity is very strong either.

 

Part 3:

Because our group wanted to create something we use a lot in our daily life, we first shared the ideas we had when we were asked in class to design something for our partner. After discussion, we chose to create a locker with a bunch of additional functions, for example Face ID or fingerprint unlocking, and an internal screen that can show the class schedule, weather forecast, upcoming events, and news, and even play music. What's more, we wanted it to interact with the user even more, so we planned to add some voice sensors, inspired by Apple's Siri, to recognize commands and respond to them. We also wanted it to have a conveyor belt that can easily put things in front of you when needed; for example, when the forecast shows it's rainy outside, it can ask whether you need your umbrella and hand it to you. So we found a box in the room and cut a door into it to stand in as a locker, then used paper to represent screens with different functions. And to make the paper screens slide through the box more smoothly, we laminated them with tape.