Project ‘Salam’ – Final Project for Interaction Lab

Abdullah Mobeen

Professor Marcela Godoy

Interaction Lab

Reminiscing over this semester of interactive learning, I remember how technical I thought Interaction Lab would be, mostly due to a dislike of electricity that dates back to my high school days. Now that I compare my stupid pet project with my final project, I realize that I actually did learn a lot this semester. In this post, I’m going to guide you through the 0s and 1s of my project and reflect on my efforts to complete it. Before you read the detailed implementation, I would like you to watch the video of my project so that you know what you are staying on this post for.

Background:

I remember having a chat with Rudi (the professor of Intro to Robotics) after my Interaction Lab midterm project, when he said that a great project is one that carries value and moves its users. So I decided to work on something along a social line that would help create awareness of a prevalent issue.

I named my project ‘Salam’, which literally means peace in Arabic. The idea was to model a virtual environment in which a user can get a glimpse of a refugee’s struggle while fleeing a war-stricken land. I chose to organize my project around the journey of a Syrian refugee after having a discussion with Professor Marcela.

As a Muslim, I felt really close to the cause, since I often find my faith, and the people who follow it, being questioned in the media whenever the refugee crisis is addressed. I felt the need to do something to destigmatize the representation of Muslims, and the refugee crisis seemed suitable for this reason. I know it is not easy to destigmatize something that has been built on baseless stereotypes for a long time, but I still decided to put my skills to good use.

To implement this environment, I created a dark-room effect and chose the Kinect to detect the user’s body gestures. As soon as the user steps in front of the Kinect, his/her body is detected, a lamp turns on, and a video of Syria before the war starts playing. At this point, the user does not expect any sudden change in the environment. The user is then prompted to make a peace sign, and as soon as he/she does, the Kinect detects the hand gesture and makes the lamp start blinking. The first video stops and, all of a sudden, a video of Syria during the war is displayed on the screen.

I downloaded different videos and then used Adobe Premiere CC to arrange and edit the videos.

Inspiration:

Hands Up! by Roopa and Atif (NYU ITP) – a virtual environment that models the interaction between people of color and the police in the light of recent events of police brutality.

Humanity Touch! (NYU ITP) – An environment that helps people interact with the computer on the basis of their identity.

Initial Ideas: 

Even though I had decided to work on a social project a long time ago, settling on a particular idea was never easy. My first idea was to create an environment where a user could experience what it feels like to be a Muslim at an airport. I planned on using two switches such that users would press both of them with their hands and consequently trigger the code, which would then play the comments Muslims usually have to listen to at an airport.

Equipment used:

  • Arduino Uno
  • Kinect v2
  • 90W filament bulb
  • Relay

Software used:

  • Arduino IDE
  • Processing 3.3
  • Adobe Premiere CC

Technical Implementation:

Using the Kinect was a huge challenge for me, as I had never used one before, but it was one of the main components my project relied on. I explored KinectPV2 – a Kinect library for Processing that encapsulates many of the Kinect’s functions. One of the challenges was getting the Kinect to detect not only the body but also hand gestures. Fortunately, I found a couple of built-in hand-state constants that made this job a lot easier, for example KinectPV2.HandState_Lasso and KinectPV2.HandState_Closed. The code worked for me and began showing the skeleton of whoever stood in front of the Kinect.
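
To give a sense of how little code the detection itself needs, here is a stripped-down sketch distilled from my full code further down; the println() calls are just placeholders for the lamp and video actions.

import KinectPV2.KJoint;
import KinectPV2.*;

KinectPV2 kinect;

void setup() {
  size(1920, 1080);
  kinect = new KinectPV2(this);
  kinect.enableSkeletonColorMap(true); // skeleton joints in color-image coordinates
  kinect.init();
}

void draw() {
  background(0);
  ArrayList<KSkeleton> skeletonArray = kinect.getSkeletonColorMap();
  for (int i = 0; i < skeletonArray.size(); i++) {
    KSkeleton skeleton = (KSkeleton) skeletonArray.get(i);
    if (skeleton.isTracked()) {
      // somebody is standing in front of the Kinect
      println("body detected");
      KJoint[] joints = skeleton.getJoints();
      // the 'lasso' hand state is the closest built-in match to a peace sign
      if (joints[KinectPV2.JointType_HandRight].getState() == KinectPV2.HandState_Lasso) {
        println("peace sign detected");
      }
    }
  }
}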

Then I edited the videos using Adobe Premiere CC and used Processing’s video library to load them. The videos are then played according to the body gesture of the user.
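
For anyone who has not used the video library before, this is roughly all it takes to play a clip. The file name here is just a placeholder (the real files sit in the sketch’s data folder), and movieEvent() is the same callback you will see at the bottom of my full sketch.

import processing.video.*;

Movie mv;

void setup() {
  size(1280, 720);
  imageMode(CENTER);
  mv = new Movie(this, "example.mp4"); // placeholder file name
  mv.play();
}

void draw() {
  background(0);
  image(mv, width/2, height/2, width, height); // draw the current frame, scaled to the window
}

// called whenever a new frame of any Movie is available
void movieEvent(Movie m) {
  m.read();
}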

For the Arduino part, Marcela helped me a lot, as my knowledge of circuits is limited. I got a 90W bulb with a lamp and a relay. Marcela helped me design a circuit in which the relay sits between the lamp and the Arduino, so that I could control the lamp from my computer.

 

Finally, I combined all these pieces using serial communication to give the user an immersive experience. The bulb lights up depending on the body gesture of the user and, consequently, different videos play once the user triggers the Kinect with a hand gesture.
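
To keep the two programs in sync, I used a tiny one-character protocol over the serial port. The sketch below is only an illustration of the Processing side, using the same Serial calls as my full code; the comments summarize what each character means to the Arduino code further down.

import processing.serial.*;

Serial myPort;

void setup() {
  String portName = Serial.list()[0]; // pick the entry that matches your Arduino
  myPort = new Serial(this, portName, 9600);
}

void draw() {
  // '1': body tracked, keep the lamp on (pre-war video playing)
  // '2': peace sign detected, blink the lamp (war video playing)
  // '3': war video finished, lamp stays on for 5 seconds, then switches off
  // '0': lamp off (the Arduino understands it, though my sketch never sends it)
  myPort.write('1'); // as an example, keep the lamp on
}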

What did I learn:

In the process of building this project, I pushed myself out of my comfort zone by doing something different. Many people discouraged me from going with this idea and told me that developing a game would be much more straightforward, and to some extent I agree with them. But, for me, working on something close to my heart and my identity was much more important than working on something straightforward. In short, I learned how to take risks and do what I believe I should do.

In terms of technicalities, I learned how to make use of the ingenious Processing libraries that are available. Initially, I tried doing everything manually, only to realize that it was too much work.

Watching Marcela help me with the circuit, I learned how to be creative with circuitry and also careful at the same time.

Since I had to edit the videos myself, I taught myself the basics of Adobe Premiere CC, edited the videos and their audio, and then added them to my project.

What did I fail to achieve:

I failed to come up with a good way to prompt the user to make the hand gesture that triggers the Kinect. In my project, I cut out a piece of paper and put it in front of the lamp to cast a silhouette of the hand gesture on the floor; this is how the user is asked to make the gesture. However, I wanted something much more subtle that would make the user follow the instructions easily.

What did I successfully achieve?

I successfully made the user interact with a computer in an ingenious way and brought out the emotion of surprise as soon as the user triggered the virtual environment. I feel proud of the fact that I was able to pull it off and move my users. During the presentation, I got many positive comments from my fellow students.

Future Development:

This idea of interaction for a social cause could be scaled to many different scenarios, modeling the experiences of other identities that are oppressed in society. I will work on different ways to prompt the user to make certain body gestures, and I will definitely try different physical settings, such as flashlights, to create an even stronger effect. I would also use the camera through Processing to see if that could give me a better way to prompt the user to make certain body gestures.

Arduino Code:

char val; // Data received from the serial port
int ledPin = 13; // Set the pin to digital I/O 13 (this pin drives the relay)

void setup() {
  pinMode(ledPin, OUTPUT); // Set pin as OUTPUT
  Serial.begin(9600); // Start serial communication at 9600 bps
}

void loop() {
  while (Serial.available()) { // If data is available to read,
    val = Serial.read();       // read it and store it in val
  }
  if (val == '2') {
    digitalWrite(ledPin, HIGH); // blink the lamp
    delay(100);
    digitalWrite(ledPin, LOW);
    delay(100);
  } else if (val == '1') {
    digitalWrite(ledPin, HIGH); // keep it on
  } else if (val == '0') {
    digitalWrite(ledPin, LOW); // keep it off
  } else if (val == '3') {
    digitalWrite(ledPin, HIGH); // keep it on for 5 seconds
    delay(5000);
    digitalWrite(ledPin, LOW); // then switch it off
    val = '4'; // any unused value, so this branch runs only once
  } else {
    digitalWrite(ledPin, LOW); // keep it off
  }
}

Processing Code:

import KinectPV2.KJoint; 
import KinectPV2.*; 
import processing.video.*; 
import processing.serial.*; 



KinectPV2 kinect;
Movie mv_refugee;
Movie mv_happy;
Serial myPort;

//pl = false;
boolean h = true;  // true while the "happy" pre-war video should play
boolean r = false; // true once the peace sign triggers the refugee (war) video

public void setup() {
  size(1920, 1080, P3D); // 3D renderer needed because translate() and line() are called with z coordinates

  kinect = new KinectPV2(this);

  kinect.enableSkeletonColorMap(true);
  kinect.enableColorImg(true);

  kinect.init();

  mv_refugee = new Movie(this, "refugee.mp4");
  //mv_refugee.play();
  mv_happy = new Movie(this, "syria.mp4");
  //mv_refugee.play();
  imageMode(CENTER);
  
  String portName = Serial.list()[0]; //change the 0 to a 1 or 2 etc. to match your port
  myPort = new Serial(this, portName, 9600);
}

public void draw() {
  background(0);

  //image(kinect.getColorImage(), 0, 0, width, height);

  ArrayList<KSkeleton> skeletonArray =  kinect.getSkeletonColorMap();

  //individual JOINTS
  for (int i = 0; i < skeletonArray.size(); i++) {
    KSkeleton skeleton = (KSkeleton) skeletonArray.get(i);
    if (skeleton.isTracked()) { // when the Kinect sees somebody

      //light up bulb
      KJoint[] joints = skeleton.getJoints();

      int col  = skeleton.getIndexColor();
      fill(col);
      stroke(col);
      drawBody(joints);

      //draw different color for each hand state
      drawHandState(joints[KinectPV2.JointType_HandRight]);
      drawHandState(joints[KinectPV2.JointType_HandLeft]);
    }
  }

  fill(255, 0, 0);
  text(frameRate, 50, 50);

  if (r == true) {
    myPort.write('2'); // tell the Arduino to blink the lamp
    fill(0, 0, 255);
    mv_refugee.play();
    image(mv_refugee, width/2, height/2, width, height); // (img, x,y, w,h)
  }
  if (r && mv_refugee.time() >= mv_refugee.duration()) { // the war video has finished
    r = false;
    myPort.write('3'); // lamp stays on for 5 seconds, then switches off
    println("3");
    exit();
  }
}

//DRAW BODY
public void drawBody(KJoint[] joints) {
  //play happy video
  //switch(pl)
  if (h == true) {
    myPort.write('1'); // keep the lamp steadily on
    println('1');
    fill(0, 0, 255);
    mv_happy.play();
    image(mv_happy, width/2, height/2, width, height);
    mv_happy.loop();

    drawBone(joints, KinectPV2.JointType_Head, KinectPV2.JointType_Neck);
    drawBone(joints, KinectPV2.JointType_Neck, KinectPV2.JointType_SpineShoulder);
    drawBone(joints, KinectPV2.JointType_SpineShoulder, KinectPV2.JointType_SpineMid);
    drawBone(joints, KinectPV2.JointType_SpineMid, KinectPV2.JointType_SpineBase);
    drawBone(joints, KinectPV2.JointType_SpineShoulder, KinectPV2.JointType_ShoulderRight);
    drawBone(joints, KinectPV2.JointType_SpineShoulder, KinectPV2.JointType_ShoulderLeft);
    drawBone(joints, KinectPV2.JointType_SpineBase, KinectPV2.JointType_HipRight);
    drawBone(joints, KinectPV2.JointType_SpineBase, KinectPV2.JointType_HipLeft);

    // Right Arm
    drawBone(joints, KinectPV2.JointType_ShoulderRight, KinectPV2.JointType_ElbowRight);
    drawBone(joints, KinectPV2.JointType_ElbowRight, KinectPV2.JointType_WristRight);
    drawBone(joints, KinectPV2.JointType_WristRight, KinectPV2.JointType_HandRight);
    drawBone(joints, KinectPV2.JointType_HandRight, KinectPV2.JointType_HandTipRight);
    drawBone(joints, KinectPV2.JointType_WristRight, KinectPV2.JointType_ThumbRight);

    // Left Arm
    drawBone(joints, KinectPV2.JointType_ShoulderLeft, KinectPV2.JointType_ElbowLeft);
    drawBone(joints, KinectPV2.JointType_ElbowLeft, KinectPV2.JointType_WristLeft);
    drawBone(joints, KinectPV2.JointType_WristLeft, KinectPV2.JointType_HandLeft);
    drawBone(joints, KinectPV2.JointType_HandLeft, KinectPV2.JointType_HandTipLeft);
    drawBone(joints, KinectPV2.JointType_WristLeft, KinectPV2.JointType_ThumbLeft);

    // Right Leg
    drawBone(joints, KinectPV2.JointType_HipRight, KinectPV2.JointType_KneeRight);
    drawBone(joints, KinectPV2.JointType_KneeRight, KinectPV2.JointType_AnkleRight);
    drawBone(joints, KinectPV2.JointType_AnkleRight, KinectPV2.JointType_FootRight);

    // Left Leg
    drawBone(joints, KinectPV2.JointType_HipLeft, KinectPV2.JointType_KneeLeft);
    drawBone(joints, KinectPV2.JointType_KneeLeft, KinectPV2.JointType_AnkleLeft);
    drawBone(joints, KinectPV2.JointType_AnkleLeft, KinectPV2.JointType_FootLeft);

    drawJoint(joints, KinectPV2.JointType_HandTipLeft);
    drawJoint(joints, KinectPV2.JointType_HandTipRight);
    drawJoint(joints, KinectPV2.JointType_FootLeft);
    drawJoint(joints, KinectPV2.JointType_FootRight);

    drawJoint(joints, KinectPV2.JointType_ThumbLeft);
    drawJoint(joints, KinectPV2.JointType_ThumbRight);

    drawJoint(joints, KinectPV2.JointType_Head);
  } else {
    mv_happy.stop();
  }
}
//draw joint
public void drawJoint(KJoint[] joints, int jointType) {
  pushMatrix();
  translate(joints[jointType].getX(), joints[jointType].getY(), joints[jointType].getZ());
  ellipse(0, 0, 25, 25);
  popMatrix();
}

//draw bone
public void drawBone(KJoint[] joints, int jointType1, int jointType2) {
  pushMatrix();
  translate(joints[jointType1].getX(), joints[jointType1].getY(), joints[jointType1].getZ());
  ellipse(0, 0, 25, 25);
  popMatrix();
  line(joints[jointType1].getX(), joints[jointType1].getY(), joints[jointType1].getZ(), joints[jointType2].getX(), joints[jointType2].getY(), joints[jointType2].getZ());
}

//draw hand state
public void drawHandState(KJoint joint) {
  noStroke();
  handState(joint.getState());
  //mv_happy.stop(); //=====================
  pushMatrix();
  translate(joint.getX(), joint.getY(), joint.getZ());
  ellipse(0, 0, 70, 70);
  popMatrix();
}

/*
Different hand state
 KinectPV2.HandState_Open
 KinectPV2.HandState_Closed
 KinectPV2.HandState_Lasso
 KinectPV2.HandState_NotTracked
 */
public void handState(int handState) {
  switch(handState) {
    /*default:
     //play happy video
     fill(0, 0, 255);
     mv_happy.play();
     image(mv_happy, 0, 0, width*1.25,height*1.25);
     mv_happy.loop();
     break; */
    //case KinectPV2.HandState_Open:
    //  fill(0, 255, 0);
    //  break;
    //case KinectPV2.HandState_Closed:
    //  fill(255, 0, 0);
    //  break;
  case KinectPV2.HandState_Lasso: // the 'lasso' state is the closest built-in match to a peace sign
    h = false; // stop the pre-war video
    r = true;  // play the war video and make the lamp blink
    break;
    //fill(0, 0, 255);
    //mv_refugee.play();
    //image(mv_refugee, width/2,height/2,width,height); // (img, x,y, w,h)

    //win gesture
    //do it here
    //play sad video

    //case KinectPV2.HandState_NotTracked:
    //  fill(255, 255, 255);
    //  break;
  }
}


public void movieEvent(Movie m) {
  m.read();
}

One thought on “Project ‘Salam’ – Final Project for Interaction Lab”

  1. Abdullah,

    I sent you my feedback on my email, but congrats again for the great jump you did from your midterm to your final project. I’m glad that you are happy with the results and that you were able to push yourself outside your comfort zone and do a great job.

    Have a great break!
