Week 4: Field Recording - Tianzifang

For this weekend, we needed to go back to the original place to do another field recording, so I went back to Tianzifang again. Unluckily, there was no microphone I could borrow from the IMA lab, so I had to use my phone instead. From my testing, the iPhone 7 can also get a clear, clean recording thanks to its noise-reduction technology in both hardware and software, although compared to professional equipment it is still not that good.

So I went back to the same little bar next to the restaurants at night. People were kind of noisy at that time, but I still grabbed some sound details. This time the restaurant next to the bar had invited some people to play violins (or some instruments I don't know?). The players were not standing still; they moved around, so the sound sources kept changing, and I don't know if you can tell from this file. Also, I noticed that a waiter was using a long-spouted teapot to serve the tea, so I ran over and recorded some sounds of it, which is pretty interesting because it sounds like someone peeing lol.

Screen Shot 2017-03-01 at 12.54.16 PM Screen Shot 2017-03-01 at 12.53.47 PM

Week 3: Complete Action Recording (CHEN)

I went to the AGIF office to record my action. My friend Danny was setting up the lobby area for a meeting. At the very beginning of the recording you hear us walking into the lobby where the meeting would take place. He first brought in more chairs and scooted other chairs around. I had my input levels set at 80, but I didn't realize how loud the chair moving would be, so I had to adjust the input levels and try to move farther away from the sounds. I then followed him out the door into the kitchen to pick up some apples and bananas. I was able to pick up some nice sounds of Danny punching a code into a keypad. We walked through the halls and I picked up some nice conversational sounds. Back in the lobby, Danny was setting up his computer to have the correct slides up and ready. Then the recording ends.

Most of the pitches were low (besides the beeping of the keypad and some high-pitched women's voices). Besides the relatively long stretches of walking sounds, most sounds ended quickly and were either repeated (moving chairs) or didn't happen again (keypad). The sound of the chairs gave me a feeling of brown and raw, while the keypad gave me a sense of tininess and maybe a yellow color. The keypad was definitely staccato sounding. The chairs moving felt like half notes: they came in without a sharp attack, lasted longer than the keypad but not super long, then stopped before the next sound. Most of the sounds were coming from in front of me as I was walking and standing. I tried to stay out of Danny's way and not stand too close to the loud noises.

Week 4: Record Action

So I went back to the coffee shop where I was recording last week. First I recorded a person buying hot chocolate, but when I realized it was less than two minutes, I decided to do something else. I recorded a person working on their laptop and drinking coffee. This is something really typical for this coffee shop, and I guess for most coffee shops in cities. I wonder why these people don't go home and write their emails there. Is it possible in our urban environment to stop working? Thus, this audio piece represents the new never-stopping, never-quiet Shanghai to me, where people never stop, not even for a coffee. I also don't quite understand how people can work in an environment where the music is loud. Are they filtering out the noises of the city? I sometimes realize that if I really want to, I can lock out all the noises of my environment from my brain, and through this assignment I realized how much I don't listen: I don't even hear the clicking noise of my keyboard anymore, or the sound of drinking coffee.

 

I had some problems with the NAS, so I uploaded the audio to SoundCloud.

 

Assignment 4b: The Mind of a Robot

The central theme of The Mind of a Robot is intelligence. In the article, Brady and Hu discuss the various abilities that characterize theory of mind in the context of robots and artificial intelligences. According to their framework for theory of mind, any intelligent being must be able to react and plan, continually interpret and process information, adapt to uncertainty, have purposive behavior, and display emergent properties. In detailing these abilities, the authors bring up the state of the art at the time of writing, 1994.

Overall this was a very interesting read. The authors make compelling arguments for each of their points by substantiating their claims and illustrating them with examples from robotics. With their five enumerated abilities, the authors seem to be headed in the direction of artificial general intelligence and generalized learning. Furthermore, the points that Brady and Hu make throughout the article are in line with current ideas on how to solve this ongoing problem.

The article itself shows a high regard for robots. While the article reads as a technical report, the philosophy of intelligence is a core theme. While robots are not yet fully intelligent, the authors' treatment of them suggests that they already conceptualize robots as intelligent beings.

Robotics | Assignment 4b | Response to “The Mind of a Robot [and Discussion]” | Gao Yang

In The Mind of a Robot [and Discussion], Michael Brady and Huosheng Hu talk about the use of AI in robotics. AI stands for artificial intelligence, which adds more functionality to robots so that they can act more like humans or better fit human needs.

Nowadays robots are becoming more and more intelligent, and AI technology is becoming more and more popular. As the authors say in the article, “work in robotics offers the study of intelligence insights gleaned from building autonomous agents that operate purposefully in the real world”. AI technology will become strong enough in the future to let robots “have minds”, and then they can serve humans better and better.

When I read the article, several questions came to my mind: what if a robot is as smart as a human, or even smarter? If so, how can people control robots? How can we make sure robots will not attack people? Can humans apply the AI technology used on robots to ourselves? If so, are we still humans, or robotic humans? What would the differences between people be? Will we all be the same in the end? I think there should be rules in place to control the potential problems that may harm human beings, so that people can treat the technology correctly.

 

Assignment 4b

In all honesty, most of the terms just flew right over my head, but as for the basic gist, I see the evolution of robots as something quite similar to how my idea of robots has been evolving over the course of the class. To begin with, we started with something basic, like our car, but then there is actually more that we can do, especially as we learn and develop more. And with each iteration and discovery, the robots just get more and more refined.

One of the things that really stood out to me from the reading was the Kalman filter, because of the math formula that I just completely glossed over, since I didn't understand it at all. But of course, someone had to come up with that formula, and someone had to come up with the programs and the robots that we've been using. People are dreaming so big about robots these days, and I can't help but feel a little bit of awe for the people trying to make something that doesn't even exist.
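To get at least a rough feel for what that formula is doing, here is a stripped-down, one-dimensional version of the Kalman filter update written as a small Processing-style sketch. This is only an illustration of the general idea with variable names I made up, not the formulation from the article: the filter keeps an estimate and an uncertainty, and each new noisy measurement nudges the estimate by an amount decided by the Kalman gain.

// Rough 1D Kalman filter sketch (illustrative only; all names are made up).
float estimate = 0;          // current best guess of the value being tracked
float errorCov = 1;          // how uncertain we are about that guess
float processNoise = 0.01;   // how much the true value can drift each step
float measureNoise = 0.5;    // how noisy the sensor readings are

float kalmanUpdate(float measurement) {
  // predict: uncertainty grows while waiting for the next measurement
  errorCov += processNoise;
  // update: the Kalman gain decides how much to trust the new measurement
  float gain = errorCov / (errorCov + measureNoise);
  estimate = estimate + gain * (measurement - estimate);
  errorCov = (1 - gain) * errorCov;
  return estimate;
}

void setup() {
  // feed in noisy readings of a "true" value of 10 and watch the estimate settle
  for (int i = 0; i < 20; i++) {
    println(kalmanUpdate(10 + random(-1, 1)));
  }
}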

Second Recording: Tianzifang Street food

This sound is tasty!

For this recording, I went back to Tianzifang in the afternoon. It was a Sunday afternoon, and there were many people hanging out there. I set up my mic and Tascam at the table I used last time, then headed off into the streets to find sounds to record.

This time I was more familiar with the mic and Tascam, so I could hold them firmly and gently at the same time to keep them from rubbing against my jacket. My goal was also clearer in terms of what sound to record: I thought it would be better to find a handicraft store and record the process of handcrafting. In Tianzifang you can find many stores selling handicrafts, but it takes a little time to find a store where the crafting actually happens on site. So it took some time, but as I walked, I practiced holding the mic and adjusting the input volume.

The first handicraft store I found was a silver decoration store. A guy was sitting on a chair, holding a lighter to fire the little silver rings; he was going to make a bracelet. I asked for his permission to record the process, and he said OK. I enjoyed recording it while watching him fire the silver rings. That was a pleasant sound, and I bet he enjoyed it while working, too. But the problem with the recording was that the background noise was too loud, because another salesman was yelling at the door of the store, advertising their silver decorations. I tried to put the shotgun mic as close as possible to get the sound of the fire and also the tinkling of the silver, but I didn't want to burn up the wind blocker.

So I needed to find another sound. As I walked along the streets, I was attracted by a tasty sound, so I went closer to have a look. A lady was making street food with shrimp, flour, and vegetables. Listen to the sound and you will hear her talking to the customers, her peddling, and also a man singing in the back while he prepared some vegetables.

 

Week 3: Bluetooth (Network Everything)

Our first step in completing this assignment was getting our Bluetooth connection working. We had a lot of problems doing so. Our first problem was the actual Bluetooth modem, which we later switched out.

Here is a video of the Arduino side of things working, with the values printing out on the serial monitor.
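In case it helps to picture that side of things: the Arduino sketch only has to print the five values our Processing code reads (speed, left, right, up, down), separated by tabs and ending with a newline, at 9600 baud. A minimal sketch roughly like the one below would produce that output; the pin numbers are just placeholders for illustration, not necessarily how we actually wired the buttons and potentiometer.

// Illustrative Arduino side (pin choices are placeholders, not our exact wiring).
const int potPin = A0;                                        // potentiometer controls ship speed
const int leftPin = 2, rightPin = 3, upPin = 4, downPin = 5;  // direction buttons

void setup() {
  Serial.begin(9600);                  // same baud rate the Processing sketch opens
  pinMode(leftPin, INPUT);
  pinMode(rightPin, INPUT);
  pinMode(upPin, INPUT);
  pinMode(downPin, INPUT);
}

void loop() {
  Serial.print(analogRead(potPin));     // speed, 0-1023
  Serial.print('\t');
  Serial.print(digitalRead(leftPin));
  Serial.print('\t');
  Serial.print(digitalRead(rightPin));
  Serial.print('\t');
  Serial.print(digitalRead(upPin));
  Serial.print('\t');
  Serial.println(digitalRead(downPin)); // println adds the newline serialEvent() waits for
  delay(50);                            // don't flood the serial buffer
}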

Even though we knew the values were being sent over Bluetooth, we were kind of confused, because suddenly the Processing game stopped working and even the circuits started acting strangely.

Eventually, with Scott's help, we realized that part of our Processing code was incorrect. We didn't need a whole chunk of it, because it turned out to be redundant.

After we deleted the code we needed to, we added several things to the game. First, we changed the background. Then, we added a new ship. Lastly, we added a start screen: the user presses the 'a' key to start the game. Adding these elements was fairly simple, but we did have some structural problems with our code that gave us some trouble. Having to practice this was good for us, though, because we had to play with our code for a while. The Bluetooth took the longest amount of time to figure out, and we realized the physical Bluetooth connection to the Arduino was much more stable, but overall this assignment was a good (and frustrating) challenge.

Here is a video of the final game:

17015384_10211464002536482_1541813188_o

Here is our final code:

import processing.serial.*;
String serial;
String values;
int speed, left, right, up, down;
int sensors[];
//int mysensors[];
Ship myShip; // create a spaceship
Serial port;
int xChange, yChange; // variables for the ship
int increment = 1; // variable to change the rate of movement
int speed2;
Star[] stars; // array of stars
int starNum = 20; // number of stars
PImage img1;
PImage img2;
boolean initialScreen = true;
void setup() {
img1 = loadImage("Earth_and_Moon.jpg");
img2 = loadImage("new_rocket.png");
size(800, 600);
printArray(Serial.list());
port= new Serial(this, Serial.list()[1], 9600);
// port.clear();

port.bufferUntil('\n');

myShip = new Ship(); // instantiate the ship
stars = new Star[starNum]; // new array of stars

for (int x=0; x<starNum; x++) {
stars[x] = new Star(); // instantiate the stars
}
}
void draw() {
if (initialScreen == true) {
initialScreen();
} else {
the_game();
}

//image(img1, 0, 0, width, height); // clear the background

//if (initialScreen == false) {

// the_game();
//}
//println(speed2, "/", serial);
}

void the_game() {
// directions
image(img1, 0, 0, width, height);

String words = "use the buttons to move, move potentiometer for ship speed";
fill(255);
text(words, 10, 30);

// loop through all the stars
for (int x=0; x< starNum; x++) {
stars[x].update(); // update their position
stars[x].collisionCheck(myShip.xPos, myShip.yPos); // check if colliding with the ship
stars[x].render(); // draw the stars
}

myShip.update(xChange, yChange); // update the ship’s position & shield size
myShip.render(); // render the ship

// reset vars if you want
//yChange = 0;
//xChange = 0;

// ************
// change the speed of the ship
// this will be replaced by serial code
// ************
if (keyPressed == true) {
int pressed = key;
if (pressed >= 49 && pressed <=57) {
increment = pressed - 48;
}
}
}

void serialEvent(Serial port) {
if (port.available()>4) {
serial = port.readStringUntil('\n');
print(serial);
if (serial!= null) {
serial= trim(serial);
int mysensors[] = int(split(serial, '\t'));
printArray(mysensors);
//int count=mysensors.length;
//for ( int i=0; i<count; i++) {
// sensors[i]= mysensors[i];
// //print(i+":"+mysensors[i]+"\t");
//}

speed=mysensors[0];
left=mysensors[1];
right=mysensors[2];
up=mysensors[3];
down=mysensors[4];
//println(speed, left, right);

if (up == 1) {
yChange = increment * -1 *speed/100;
}
if (down == 1) {
yChange = increment *speed/100;
}
if (left == 1) {
xChange = increment * -1 *speed/100;
}
if (right == 1) {
xChange = increment *speed/100;
}
}
}
}

// move the ship
// this will be replaced by serial code
void keyPressed() {
if (key == CODED) {
if (keyCode == UP) {
yChange = increment * -1;
}
if (keyCode == DOWN) {
yChange = increment;
}
if (keyCode == LEFT) {
xChange = increment * -1;
}
if (keyCode == RIGHT) {
xChange = increment;
}
}
}

void keyReleased() {
if (key == 'a') {
initialScreen = false;
}
}

//************** Star class
class Star {
float xPos, yPos, starSize, speed; // variables
boolean collision; // check for collision

// star constructor
Star() { // initial state
//fill(random(255), random(255), random(255), random(255));
speed = random(1, 10);
starSize = random(10, 100);
xPos = random(0, width);
yPos = random(100, width) * -1;
collision = false;
}

void update() { // update star position

yPos += speed;

if (yPos > height+starSize/2) {
yPos = random(100, width) * -1;
speed = random(1, 10);
starSize = random(10, 50);
xPos = random(0, width);
}
}

void collisionCheck(int _x, int _y) { // check for a collision

int shipX = _x;
int shipY = _y;

float dx = shipX - xPos;
float dy = shipY - yPos;
float d = sqrt(sq(dx)+sq(dy)); // distance between star and ship

if (d < starSize/2 + 10) { // if distance is less than the radius of the star & ship
collision = !collision; // there’s a crash
}
}

void render() {
// if there’s no collision
if (!collision) {
noStroke();
fill(220, 160, 0);
ellipse(xPos, yPos, starSize, starSize);
} else { // if there is a collision, supernova
strokeWeight(5);
stroke(255);
fill(220, 100, 0);
ellipse(xPos, yPos, starSize*1.5, starSize*1.5);
collision = !collision; // reset the collision state for the next iteration
}
}
}

//************** Ship class
class Ship {
int xPos;
int yPos;
int shieldSize;

Ship() { // ship constructor
xPos = width/2;
yPos = height-100;
shieldSize = 0;
}

void update(int _xDelta, int _yDelta) {
xPos += _xDelta;
yPos += _yDelta;
}

void render() {
if (yPos > height-10) {
yPos = height-10;
}
if (yPos < height-200) {
yPos = height-200;
}

if (xPos > width-10) {
xPos = width-10;
}
if (xPos < 10) {
xPos = 10;
}

image(img2, xPos, yPos);
}
}

void initialScreen() {
image(img1, 0, 0, width, height);
fill(0);
text("click A to start", 400, 400);
}

Week 4 Recording - by Emerald

To record this, I went back to Starbucks again, but this time I went at night, and there were only a few people in the Starbucks. The whole audio captures a waiter making the last several cups of coffee, while another waiter cleans the machine, dumps the waste, and clears up the whole store. In the recording, you can hear the sound of coffee being made, the sound of the ice crusher working, and the sounds of washing, cleaning up, and moving chairs. After all these things were done, the shopping mall was about to close, and although the Starbucks would stay open longer, it had also started preparing to close for the night.

Transition Diagram/Narrative Storyboard/Design Process (Gabriela Naumnik)

Interface State Transition Diagram

Project Partner: Sophia Noel

Screen Shot 2017-02-28 at 19.22.05 Screen Shot 2017-02-28 at 19.22.14 Screen Shot 2017-02-28 at 19.22.21 Screen Shot 2017-02-28 at 19.22.28

1st Photo shows: the first interface, registration interface Nr1, and registration interface Nr2 (the company number will be provided to employees; for consultants the first ID character is 1, for other employees it is 0; see the small sketch after this list)

2nd Photo shows: registration confirmation, login (after confirming membership), consultant view

3rd Photo shows: consultant profile (click to change), consultant time tracking window, consultant summary (+ send to email)

4th Photo shows: other employee main screen, other employee payment summary.
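To spell out the rule in the first photo's parentheses: the app only has to look at the first character of the company number to decide whether to show the consultant screens or the other-employee screens. Below is a tiny Processing-style sketch of that check; the printed view names and sample IDs are made up for illustration, not code from our prototype.

// Illustrative routing check; view names and sample IDs are placeholders.
void routeByCompanyNumber(String companyNumber) {
  if (companyNumber == null || companyNumber.length() == 0) {
    println("no company number entered yet");
    return;
  }
  char first = companyNumber.charAt(0);
  if (first == '1') {
    println("show consultant view");      // consultants: ID starts with 1
  } else if (first == '0') {
    println("show other-employee view");  // other employees: ID starts with 0
  } else {
    println("invalid company number");
  }
}

void setup() {
  routeByCompanyNumber("10482");  // made-up consultant ID
  routeByCompanyNumber("00731");  // made-up other-employee ID
}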

User Flow

Screen Shot 2017-02-28 at 19.47.45

Design Process

Screen Shot 2017-02-28 at 19.52.05

Narrative Storyboard

 Narrative_Storyboard

Update (added on March 1st): Here is the Interface State Transition Diagram with comments

Screen Shot 2017-03-01 at 15.25.18 Screen Shot 2017-03-01 at 15.25.25 Screen Shot 2017-03-01 at 15.25.34 Screen Shot 2017-03-01 at 15.25.41