Kinetic Interfaces – Week 12 OSC Weekly Assignment (Echo)

For this week's OSC assignment, I created a synchronised drawing board. The sender sketch is used to draw colourful lines, and the drawing is displayed on the receiver's screen at the same time. Here is the demo:

demo1


import oscP5.*;
import netP5.*;


OscP5 oscP5;
NetAddress myRemoteLocation;


void setup() {
  size(400,400);
  background(100);
  
  // listen on port 12000; the receiver runs on port 12001 on the same machine
  oscP5 = new OscP5(this,12000);
  myRemoteLocation = new NetAddress("127.0.0.1",12001);
}


void draw() {
  // nothing to draw here; mouseDragged() does the work, but draw() keeps the sketch running
}


void mouseDragged() {
  OscMessage msg = new OscMessage("/Echo");
   
  // normalise the current and previous mouse positions to 0..1
  float x = map(mouseX, 0, width, 0, 1);
  float y = map(mouseY, 0, height, 0, 1);
  float X = map(pmouseX, 0, width, 0, 1);
  float Y = map(pmouseY, 0, height, 0, 1);
 
  msg.add( x );
  msg.add( y );
  msg.add( X );
  msg.add( Y );

  oscP5.send( msg, myRemoteLocation );
  
  stroke(random(255),random(255),random(255));
  line(mouseX, mouseY, pmouseX, pmouseY);
}
And here is the receiver, which listens on port 12001 and redraws the incoming line segments:

import oscP5.*;
import netP5.*;


OscP5 oscP5;

// last received line segment, normalised to 0..1
float xValue = 0;
float yValue = 0;
float XValue = 0;
float YValue = 0;

void setup() {
  size(800, 500);
  background(100);

  oscP5 = new OscP5(this, 12001);
}


void draw() {
  // map the normalised values back to pixel coordinates and draw the segment
  int x = (int)map(xValue, 0, 1, 0, width);
  int y = (int)map(yValue, 0, 1, 0, height);
  int X = (int)map(XValue, 0, 1, 0, width);
  int Y = (int)map(YValue, 0, 1, 0, height);
  stroke(random(255), random(255), random(255));
  line(x, y, X, Y);
}

void oscEvent(OscMessage msg) {
  println("___");
  println("Pattern: " + msg.addrPattern() );
  println("Typetag: " + msg.typetag() );
  println();

  xValue = msg.get(0).floatValue();
  yValue = msg.get(1).floatValue();
  XValue = msg.get(2).floatValue();
  YValue = msg.get(3).floatValue();
  println("x" + xValue);
  println("y" + yValue);
  println("X" + XValue);
  println("Y" + YValue);
  println();
  }

Kinetic Interfaces – Final Project: Line Run (Billy & Echo)

Title: LINE RUN

Subtitle: A 3D running game played with the Kinect.

Team Members: Billy Zou & Echo Qu

Description:

LINE RUN is a 3D running game played with a Kinect. The player is asked to do different poses, such as squatting and jumping, to avoid obstacles and to collect stars that power a shield. Once the shield bar is full, the player gains one extra life. The goal of the game is to get as high a score as possible.

Conceptual Development:

Running games have become very popular recently, but people usually play them on their phones, which means keeping their heads lowered for a long time. We think it is more interesting and healthier to combine the Kinect with a running game, so that players can enjoy the game and get some exercise at the same time.

Technical Development

In terms of Processing, this was our first attempt at a 3D game, which is why we kept the visuals to simple lines. In terms of the Kinect, we used skeleton tracking to detect the players' poses.
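
As a rough illustration of how a pose can be read from the skeleton (a minimal sketch with assumed inputs, not our actual detection code), a squat or jump can be detected by comparing the head joint's current height with a standing baseline:

// Minimal sketch of pose detection from skeleton joints (assumed inputs).
// 'head' would come from the Kinect skeleton tracker; here it is hypothetical.
PVector head = new PVector();   // current head joint position in screen coordinates
float standingHeadY = 0;        // head height recorded while standing upright
boolean calibrated = false;

boolean isSquatting() {
  if (!calibrated) return false;
  // in screen coordinates y grows downward, so a squat moves the head below the baseline
  return head.y > standingHeadY + 100;   // threshold in pixels, tuned by testing
}

boolean isJumping() {
  if (!calibrated) return false;
  // a jump raises the head above the standing baseline
  return head.y < standingHeadY - 60;
}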

Demo

The Beginning Interface including Title and Instruction:

The Playing Interface:

The Ending Interface:

Demo:

My Work:

After I proposed the basic concept of the game, we built on and improved the idea together. Most of the coding work was done by Billy. I was assigned to test the running pose (the two hands on the sides of the screen), because the sizes and positions of the hand images needed to be adjusted often. As Processing doesn't play animated GIFs directly, we used an ArrayList and frameCount to cycle through a list of images quickly and imitate the feel of a GIF (see the sketch at the end of this post).

I designed the icons and items in this project.

Future Potential:

After the guest critique at the final presentation, we realised there are a lot of improvements we can make. Firstly, we should make it slightly harder to start the game, so that players don't begin without reading the instructions carefully. Secondly, we should explain how to play more clearly; a good way would be to slow the game down the first time the player meets an obstacle or a star at the beginning of the run and give a clear hint of what to do. For the obstacles that require jumping or squatting, we should also give more information about when to start the pose and when to stop. Thirdly, for a better playing experience, we could make the running speed increase over time.

On the whole, I am very satisfied with our final result. I really enjoyed the process of working with my partner Billy on this project, as I learned a lot from him! Last but not least, Kinetic Interfaces is a really interesting class and Moon is an amazing professor!!!

Here is the sketch that imitates a GIF by cycling through an ArrayList of images with frameCount:

ArrayList<Pose> poses = new ArrayList<Pose>();
PImage pose1;
PImage pose2;
PImage pose3;
PImage pose4;
PImage pose5;
PImage pose6;
PImage pose7;

void setup(){
  size(1920, 1080);
  imageMode(CORNER);
  background(0);

  pose1 = loadImage("pose1_.png");
  pose2 = loadImage("pose2_.png");
  pose3 = loadImage("pose3_.png");
  pose4 = loadImage("pose4_.png");
  pose5 = loadImage("pose5_.png");
  pose6 = loadImage("pose6_.png");
  pose7 = loadImage("pose7_.png");

  // build the frame list once; adding in draw() would grow the list every frame
  poses.add(new Pose(pose1));
  poses.add(new Pose(pose2));
  poses.add(new Pose(pose3));
  poses.add(new Pose(pose4));
  poses.add(new Pose(pose5));
  poses.add(new Pose(pose6));
  poses.add(new Pose(pose7));
}

void draw(){
  background(0);
  // show the next image every 4 frames to imitate a GIF
  Pose p = poses.get((frameCount / 4) % poses.size());
  p.display(0, 0);
}

class Pose{
  PImage posePic;

  Pose(PImage _posePic){
    posePic = _posePic;
  }

  void display(int x, int y){
    image(posePic, x, y);
  }
}

Kinetic Interfaces – Week 14 Final Project (Echo & Billy)

Our final project is a 3D Kinect running game. The concept behind it is an abstract but very immersive life journey. The player runs along a runway from a first-person perspective and needs to avoid obstacles and collect as many items as possible by posing, for example standing on one leg or stretching out the arms. We compare the obstacles to difficulties in our lives and the items to precious memories. The player needs to change poses frequently to avoid obstacles and collect items at the same time, so the game challenges the player's balance and reaction time. We also designed it so that collecting items changes the player's hands, which are the only part of the body visible during play, because we want to show the process of recovering self-identity. To make it more immersive, we will use a projector to map the interface of the game.
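
To illustrate the basic runway mechanic (a minimal sketch only, not the project's code), obstacles can be placed at increasing depth in P3D and moved toward the camera each frame, which reads as the player running forward:

// Minimal first-person runway illusion: obstacles approach the camera along z.
float[] obstacleZ = new float[5];

void setup() {
  size(800, 500, P3D);
  for (int i = 0; i < obstacleZ.length; i++) {
    obstacleZ[i] = -500 * (i + 1);          // spread the obstacles down the track
  }
}

void draw() {
  background(0);
  stroke(255);
  noFill();
  for (int i = 0; i < obstacleZ.length; i++) {
    obstacleZ[i] += 10;                     // obstacle moves toward the camera
    if (obstacleZ[i] > 0) obstacleZ[i] = -2500;  // recycle it far down the track
    pushMatrix();
    translate(width/2, height*0.6, obstacleZ[i]);
    box(200, 100, 20);                      // a simple line-drawn obstacle
    popMatrix();
  }
}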

Kinetic Interfaces – Week 11 Weekly Assignment

For this week's assignment, I combined the Kinect and buttons to create a very simple interaction: when the closest point is on a button, the button is filled with red. I hope it creates the feeling that the user has "touched" the button. I did this in my dorm and found it very difficult to test, as there was a lot of stuff in my room and the closest-point tracking was not very reliable.
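
A minimal sketch of the button part is below; it assumes the closest point has already been extracted from the depth data into closestX/closestY (those variables and the button size are placeholders for illustration, not my exact code):

// Hypothetical closest point taken from the Kinect depth image each frame.
float closestX = 0;
float closestY = 0;

// A simple rectangular button that turns red when the closest point is over it.
int btnX = 100, btnY = 100, btnW = 150, btnH = 80;

void setup() {
  size(640, 480);
}

void draw() {
  background(0);
  // ... update closestX / closestY from the depth data here ...

  boolean touched = closestX > btnX && closestX < btnX + btnW &&
                    closestY > btnY && closestY < btnY + btnH;

  if (touched) {
    fill(255, 0, 0);   // "touched": fill the button with red
  } else {
    noFill();
  }
  stroke(255);
  rect(btnX, btnY, btnW, btnH);

  // draw the closest point for debugging
  fill(0, 255, 0);
  ellipse(closestX, closestY, 10, 10);
}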


Kinetic Interfaces – Week 10 Kinect (Echo)

For this week's assignment, I planned to create a 3D model, like in Maya, using what we learned in class about the Kinect, but I found I was still confused about how to use point clouds on my own and couldn't create much beyond the sample code. I tried to add connecting lines to turn the point cloud into a web, but it became super slow and the result was not as good as I had hoped.
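
One thing that would probably help with the slowdown is connecting only every n-th point instead of all of them. The sketch below is just an illustration and assumes the depth points have already been collected into a depthPoints array (a hypothetical name, not the sample code's):

// Hypothetical array of 3D points taken from the Kinect depth image.
PVector[] depthPoints = new PVector[0];

// Connect every 'skip'-th point to the next one instead of all of them;
// a larger skip means far fewer line() calls per frame (requires the P3D renderer).
void drawCloudWeb(int skip) {
  stroke(255, 60);
  for (int i = 0; i + skip < depthPoints.length; i += skip) {
    PVector a = depthPoints[i];
    PVector b = depthPoints[i + skip];
    line(a.x, a.y, a.z, b.x, b.y, b.z);
  }
}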

Kinetic Interfaces – Midterm Project: Magic War (Echo & Billy)

Title:

Magic War

Subtitle:

A two-player battle game played with Leap Motions.

Description:

Magic War is a game in which two little magicians attack each other with different magic skills. The players use different hand gestures to make their characters move and battle each other. We intentionally designed it to be somewhat difficult: players must use both hands, which is a real challenge for their coordination, so they get some exercise while they play.

Ideation:

The mode of this game comes from our experience with many battle games and tower-defence games. It is a simple mode, but it leaves a lot of room for development.

Design:

The part we spent the most time designing was the gestures. It is important that players can easily remember the gestures and not mix them up while playing. So we gave the left hand three gestures that affect the player's own character (moving, storing energy and blocking damage), while the right hand has three gestures aimed at the other player, in other words for attacking, with three different magic skills. We also chose simple gestures: making a fist, pointing the index finger and pinching.

We also designed other details, for example that different skills are blocked to different degrees and that some skills can swallow the damage of other skills. Additionally, in terms of visual design, the characters and the magic effects were designed by me.

Technique:

The biggest problem we had was how to play the game with two Leap Motions, since connecting two of them to one laptop is not supported by Leap Motion. We did a lot of research and Billy finally figured out that Thunderbolt supports data transmission between two computers, and that it is fast enough for our game. He bought the cable online, and after many attempts he made it work! This big step is the basis of the whole project, and thanks to Billy's advanced knowledge of computer science we succeeded in getting the project started.

Lessons:

During the process of making the midterm project, I realised how weak my coding ability is; the contribution I could make to our project was just too limited. I really need to keep learning! Another important thing I learned from my partner is that there is always a way to solve a problem, and what you need to do is keep searching and trying! Until the moment we finished the whole project, I didn't expect we could make it so amazing!

Demo:

Kinetic Interfaces – Week 6 Homework: Leap Motion

Band Conductor

My assignment this week is a little program that lets the user experience the feeling of being a band conductor. The song and cartoon characters come from a Japanese animation called K-ON. Basically, I used several images as "buttons", and when the user's hand "touches" one of them, the audio track of a different instrument starts playing. I also detected a pinch by constraining the distance between fingertips, but in my project a pinch pauses the track instead. To keep the music from becoming a mess, I added a very simple hint, several green lights, to help the user know when each track should come in.

If I had more time, I would try to figure out how to make sure a pause doesn't affect the whole piece, meaning that whenever the user restarts any of the tracks, it plays from the correct position. While testing my code, I also felt that the Leap Motion is not sensitive enough to support more elaborate actions.
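
The pinch detection itself just thresholds the distance between two fingertip positions. Below is a minimal sketch, assuming the thumb and index tips have already been read from the Leap Motion library into PVectors (the variable names and threshold are assumptions):

// Hypothetical fingertip positions filled in from the Leap Motion library each frame.
PVector thumbTip = new PVector();
PVector indexTip = new PVector();

boolean isPinching() {
  // a pinch is simply the two tips coming closer than some small distance
  float d = dist(thumbTip.x, thumbTip.y, thumbTip.z,
                 indexTip.x, indexTip.y, indexTip.z);
  return d < 25;   // threshold in the tracker's units, tuned by testing
}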


Kinetic Interfaces – Week 5 Assignment

My idea for this week's homework is a little game of catching bubbles. I planned to make a game using the live camera and face detection. The principle is quite simple: the screen generates many bubbles and I move my face to catch them. My original idea was to use my mouth to "eat" the bubbles, but I found the mouth detection in OpenCV too insensitive for that. It was easy to get myself on screen with the camera and to generate bubbles, but judging whether my face touches a bubble was quite challenging.

I searched for a lot of reference code, and the only idea I thought I could try was detecting the movement of pixels (though I still feel really confused about it 🙁). However, the reference was quite old and used some functions that have since been deprecated, so I spent a lot of time modifying it before I could apply it to my own code. Although it works in a way (not completely up to my expectations), I still don't understand how "brightness()" is being used here, and the bubbles feel beyond my control. For example, I don't know why they only appear on the left half of the screen at the beginning. And when I tried to add sound, it became a terrible mess. Another issue is that, done this way, using the face detection function doesn't really make sense any more.


import processing.video.*;
import gab.opencv.*;
import java.awt.Rectangle;
import processing.sound.*;

SoundFile sound;
OpenCV opencv;
Capture cam;
//Rectangle[] faces;
PImage smallerImg;
int bubblenum;                  // how many bubbles have been caught
ArrayList<Bubble> bubbles;
int scale = 4;                  // down-scaling factor for the OpenCV detector

void setup(){
  size(640,480);
  
  sound = new SoundFile(this,"sound.wav");
  cam = new Capture(this,width,height);
  cam.start();
  
  opencv = new OpenCV(this, cam.width/scale, cam.height/scale);
  opencv.loadCascade(OpenCV.CASCADE_FRONTALFACE);

  smallerImg = new PImage(640,480);
  bubblenum = 0;
  bubbles = new ArrayList<Bubble>();
 
}

void draw(){
  // spawn a new bubble at a random x position at the top of the screen every frame
  bubbles.add(new Bubble((int)random(width), 0));
  if (cam.available() == true) {
    cam.read();
    cam.loadPixels();
    
     smallerImg.copy(cam, 
      0, 0, cam.width, cam.height, 
      0, 0, smallerImg.width, smallerImg.height);
    smallerImg.updatePixels();
  }
  image(cam,0,0);
  noTint();
  opencv.loadImage(smallerImg);
  //faces = opencv.detect();
  
  //if (faces != null) {
  //  for (int i = 0; i < faces.length; i++) {
  //    strokeWeight(2);
  //    stroke(255, 0, 0);
  //    noFill();
  //    rect(faces[i].x*scale, faces[i].y*scale, faces[i].width*scale, faces[i].height*scale);
  //}
  //}
  
  for (int i = 0; i < bubbles.size(); i++) {
    noStroke();
    Bubble b = bubbles.get(i);

    // update() returns 1 when the bubble is caught (enough bright pixels
    // under it) or has fallen off the bottom, so it gets removed
    if (b.update() == 1) {
      bubbles.remove(i);
      i--;
      //sound.play();
    }
  }

  text("Bubbles: " + bubblenum, 20, 40);
}

class Bubble{
  int bx, by, size;

  Bubble(int tempX, int tempY){
    bx = tempX;
    by = tempY;
    size = 20;
  }

  // returns 1 if the bubble should be removed (caught or off screen), 0 otherwise
  int update(){
    int movementAmount = 0;

    // count the bright camera pixels inside the bubble's bounding box;
    // brightness() > 127 stands in for "something light-coloured (e.g. my face)
    // is under the bubble" rather than real motion detection
    for (int y = by; y < by+size-1; y++){
      for (int x = bx; x < bx+size-1; x++){
        if (x < width && x > 0 && y < height && y > 0){
          if (brightness(smallerImg.pixels[x + y*smallerImg.width]) > 127){
            movementAmount++;
          }
        }
      }
    }

    if (movementAmount > 5){
      bubblenum++;      // caught!
      return 1;
    } else {
      by += 1;          // keep falling
      if (by > height){
        return 1;       // fell off the bottom
      }
      fill(random(255), random(255), random(255));
      ellipse(bx, by, size, size);
      return 0;
    }
  }
}

Kinetic Interfaces – Homework 3 Pixels (Echo)

For the pixels assignment, my original idea was to create something based on the in-class 2D noise example, because I really like that feeling. Building on that example, I thought of "galaxy" and decided to create a galaxy image.

I thought the 2D noise example could make a good sky background as long as I changed the colour a little. However, it is not that easy. To create a purple tone I used HSB colour mode, and I tried many times to get the best balance. Then I used techniques we learned before to create the stars. My original plan was to create a universe feeling, but gravity and attraction are difficult for me, so in the end I chose the feeling of shooting stars. Then I ran into a big problem: to get a "shooting" star I need to lower the transparency, but if I lower the transparency, the colour of the 2D noise becomes quite weird.

PImage img1;
int number=800;
Ball[] balls = new Ball[number];

void setup(){
  size(1000,800);
  noStroke();
  colorMode(HSB,360,100,100,100);
  
  // ARGB so each pixel can carry its own alpha (HSB is a colorMode, not an image format)
  img1 = createImage(width, height, ARGB);
 
  for(int i=0;i<number;i++){
    PVector loc = new PVector(random(width),random(height));
    float angle = random(TWO_PI);
    PVector dir = new PVector(cos(angle),sin(angle));
    float speed =random(1,3);
    balls[i] = new Ball(loc,dir,speed);
  
  }
}

void draw(){
  // translucent rectangle so the shooting-star trails fade out gradually
  fill(200,50,50,10);
  rect(0,0,width,height);
  
  int index1 = 0;
  img1.loadPixels();
  for(int y=0; y<img1.height; y++){
    for(int x=0; x<img1.width; x++){
      float freq = 0.01;
      // 2D noise, shifted by frameCount so the sky slowly drifts
      float noiseValue = noise((x+frameCount)*freq, (y+frameCount)*freq);

      // map the noise value into a purple-blue range (hue 200-280)
      float h = map(noiseValue, 0, 1, 200, 280);
      float s = map(noiseValue, 0, 1, 0, 80);
      float b = map(noiseValue, 0, 1, 0, 80);
      float a = 30;

      img1.pixels[index1] = color(h, s, b, a);

      index1++;
    }
  }
  
  img1.updatePixels();
  image(img1,0,0);

  // draw the shooting stars
  fill(255,155);
  for(int i=0; i<number; i++){
    balls[i].move();
    balls[i].checkEdges();
    balls[i].display();
  }
  
}

class Ball{
  PVector loc, dir, vel;
  float speed;

  Ball(PVector _loc, PVector _dir, float _speed){
    loc = _loc;
    dir = _dir;
    speed = _speed;
  }

  void move(){
    // steer the star with noise so its path curves smoothly over time
    float angle = noise(loc.x/500, loc.y/500, millis()/10000.0)*TWO_PI;
    dir.x = cos(angle);
    dir.y = sin(angle);
    vel = dir.copy();
    vel.mult(speed);
    loc.add(vel);
  }

  void checkEdges(){
    // respawn at a random position once the star leaves the screen
    if(loc.x<0 || loc.x>width || loc.y<0 || loc.y>height){
      loc.x = random(width);
      loc.y = random(height);
    }
  }

  void display(){
    ellipse(loc.x, loc.y, 2.5, 2.5);
  }
}

Kinetic Interfaces – Week 3 Button Homework (Echo)

My homework this week is called Color Disc. I got the inspiration from a colour wheel sketch by a friend who is taking the Processing Lab course at ACCD, and I wanted to create a dynamic one. After I succeeded in making a dynamic colour wheel, I tried to combine it with a "button": my original idea was that when the cursor stays on a certain colour I can pick that colour, and when I press the mouse something changes.

However, I really ran into problems when designing the "changes". I planned to draw some patterns using the colour I picked, but redrawing the background every frame covers the colour wheel, and because this part of the code can only live in draw(), I cannot leave what I draw on the page, which means everything disappears as soon as I release the mouse. I also tried noise() to draw random curves that simulate audio waveforms, but it didn't meet my expectations. And I failed to add a noise sound to the mosaic effect when clicking the black part, because there was always a sharp pop when it ended.

import processing.sound.*;

SoundFile sound1;
//SoundFile sound2;



float x, y;     // picker position, set in setup() because width/height aren't valid before size()
float deg;
int state = 0;
color clr;

float soundRate;

void setup(){
  size(500,500);
  colorMode(HSB,360,100,100);

  // width/height are only valid after size(), so place the picker at the centre here
  x = width/2;
  y = height/2;

  sound1 = new SoundFile(this,"melody.mp3");
  //sound2 = new SoundFile(this,"noise.mp3");

  sound1.play();
}


void draw(){
  background(0,80);
  deg++;
  
  //when the cursor is near the wheel, the picker follows it; otherwise it orbits the centre
   float distance1 = dist(mouseX, mouseY, width/2,height/2);
   if(distance1<200){
     x = lerp(x, mouseX, 0.5);
     y = lerp(y, mouseY, 0.5);
   }else{
     x = lerp(x, 150*sin(radians(deg))+width/2,0.1);
     y = lerp(y, 150*cos(radians(deg))+height/2,0.1);
   }
    
    //draw triangles to build the colour wheel, stepping the hue by 5 degrees
    for(int i=0; i<360; i+=5){
      float x1 = 200*sin(radians(i+5))+width/2;
      float y1 = 200*cos(radians(i+5))+height/2;
      float x2 = 200*sin(radians(i))+width/2;
      float y2 = 200*cos(radians(i))+height/2;

      //segments near the picker get a lower alpha value, so they fade out
      float distance2 = dist(x,y,x1,y1);

      strokeWeight(1);
      stroke(i,80,90,distance2);
      fill(i,80,90,distance2);
      
      beginShape();
      vertex(width/2,height/2);
      vertex(x1,y1);
      vertex(x2,y2);
      endShape(CLOSE);
      
      //draw lines to make the circle have more texture
      
      stroke(360);
      strokeWeight(3);
      line(x1,y1,x2,y2);
   
   }
   
   // draw a center circle to make the wheel hollow
   stroke(360);
   strokeWeight(2);
   fill(0);
   ellipse(width/2, height/2,150,150);
   
   
    // dots around the rim grow and brighten as the picker approaches them
    for (int i = 0; i < 360; i += 20) {
    float x3 = 200*sin(radians(i))+width/2;
    float y3 = 200*cos(radians(i))+height/2;
    
    float distance3 = dist(x,y,x3,y3);
    distance3 = map(distance3,0,200,100,-20);
    
    noStroke();
    fill(i, 80, 90, distance3);
    ellipse(x3, y3,distance3,distance3);
  }
  if(mousePressed){
      checkState();
      soundRate = random(0.6,1.5);
      
  }
   
 }
   
void checkState(){
    if(dist(x,y,width/2,height/2)<200 && dist(x,y,width/2,height/2)>150){
    display1();  
  }else{
    display2();
  }
}

void display1(){
    sound1.rate(soundRate);
      clr = get(mouseX,mouseY);
      
      stroke(clr);
      strokeWeight(3);
      noFill();
      float y=random(1000);
      beginShape();
      for(float x=0; x<width; x+=10){
      float z = (noise(x/100,y/300,random(1000))-0.2)*600;
      vertex(x,y+z);
      }
      endShape();
     
     
}

void display2(){
  // mosaic effect: cover the screen with tiny randomly-shaded squares
  int rectSize = 1;
  for (int y = 0; y < height; y += rectSize){
    for (int x = 0; x < width; x += rectSize){
      fill(random(255), 90);
      rect(x, y, rectSize, rectSize);
    }
  }
}