ToyDesign_Soft Pattern exercise.

For our in-class exercise, I thought it would be fun to create something that could be built in parts. In other words, a soft toy that would require an initial round of sewing, then stuffing, and then some more sewing to bind the parts together. Most of the projects in class seem to have been solely 2D patterns, so making something more 3D could be cool!

I decided on creating a carnivorous plant, as it consists of relatively simple parts that can be put together to achieve a 3D effect.

 

 

 

Above is a rough sketch of the patterns used for the plant. The leaves are to be cut 4 times. The long dome sections for the jaws were the hardest to approach, as the fabric I chose was thick and very fluffy, and I had 6 to prep (sewing these was a huge pain!). The stem was super easy, though turning it inside out proved to be rather challenging. As for the inner jaw, my favorite part, I measured the inside with a wire and cut according to the radius of said wire.

 

to be put together

Robotics_Deliverable5

 

Detailed Documentation: lucibufagens ( working title)

I want to begin with that feeling most people get when they see something out of the corner of their eye. In particular, something small, dark, fast, and close to the ground. I often compare that feeling to anxiousness and phobia. Your body stiffens up; your brain is on alert, expecting the worst. Then, out of the corner of your other eye, the little speck zooms into plain sight! Heart rates skyrocket!

I find it interesting that a large percentage of people fear pests or creepy-crawlies in some way or another. A close friend has a very low tolerance for creepy-crawlies, ants in particular. Ants simply freak my friend out. So much so that the presence of an ant will cause them to evacuate the room until the pest is disposed of, ideally in a humane or non-harming manner.

But why are people frightened by these tiny creatures? First-hand experience, texture, structure, or even psychological cues? I am not sure why. Personally, I never found them very frightening, but every now and then, especially when it's dark: BAM! My heart begins to pound!

On a different note, I find the bioluminescence of fireflies to be awe-inspiring. Up until recently, I had never actually experienced a firefly in real life. I had no idea they were actually a species of beetle. I was blown away when I saw them in person. The very first time I saw one, I immediately wanted to hold it; I was so drawn in. There was a moment in particular, after I had reached out to catch one, where I realized I was getting nervous: I was experiencing the very same sensation one gets when they see a creepy-crawly. Thus I panicked and the bug flew away.

How is it that one can be attracted and yet terrified by a tiny light in the dark?

My friend would say that I found beauty in something unpleasant.

This project hopes to further illustrate the sensation I got when interacting with the fireflies, by creating a robotic installation that mimics the behaviors of common household bugs.

By creating a swarm of tiny robots, I hope to replicate the common phobia of creepy-crawlies. At the same time, I hope to use the swarm (or colony) to reveal the interesting paths and strides the swarm builds. I plan to do so by equipping each bug with a smart LED that emits its designated color at certain times in order to emulate different pixels in an image. During the run time, a long exposure photograph will capture the “light paintings” the bug-bots create. Perhaps we will find beauty in these pests.
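The pixel-emulation idea could be scheduled roughly like this. This is an illustrative sketch in plain C++ rather than ATtiny code; the column-per-bot assignment and the `Rgb`/`colorForBot` names are my own assumptions for the sketch, not part of the actual project:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Hypothetical scheme: each bot is assigned one image column; at time step t
// it shows the pixel in row (t mod height), so over a full sweep the swarm's
// LEDs trace out the whole image for the long-exposure photograph.
struct Rgb { uint8_t r, g, b; };

Rgb colorForBot(const std::vector<std::vector<Rgb>>& image,
                int botColumn, int timeStep) {
    int height = static_cast<int>(image.size());
    int row = timeStep % height; // sweep down the column as time advances
    return image[row][botColumn];
}
```

In practice each bot would only store (or receive over Bluetooth) its own column's schedule, keeping the per-bot memory footprint tiny.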

[edits to be added]

Constraints:

Size- for this project to fully emulate bugs, each bot will need to be as small as possible. I don’t think the same effect will occur with large bots.

Power- powering such a small device will be rather difficult. Chips like the ATtiny can only handle 5 V.

Memory- programming an ATtiny85 will be difficult because the chip can only hold ~8 KB of program, so the program will have to be as efficient as possible.

–unexplored territory
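The memory constraint can be made concrete with a back-of-envelope sketch. The image size and palette scheme here are my own assumptions, not the project's: a 16×16 frame stored as 24-bit RGB costs 16·16·3 = 768 bytes, while 4-bit palette indices cost 16·16/2 = 128 bytes, two pixels per byte, which matters when program plus data must squeeze into ~8 KB:

```cpp
#include <cassert>
#include <cstdint>

// Pack two 4-bit palette indices into one byte (and unpack them again),
// quartering the storage versus 24-bit RGB at the cost of a 16-color palette.
uint8_t packPair(uint8_t firstIndex, uint8_t secondIndex) {
    return static_cast<uint8_t>(((firstIndex & 0x0F) << 4) | (secondIndex & 0x0F));
}

uint8_t unpackFirst(uint8_t packed)  { return packed >> 4; }
uint8_t unpackSecond(uint8_t packed) { return packed & 0x0F; }
```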

List of Parts

Bot

ATtiny85

-ATtiny programming shield (Nick has built a custom one)

-ATtiny prototyping USB

Sensors and electronics

-Light sensor (scatter effect, similar to roaches)

-IR sensor (distance / bug identifier) ↔ ultrasonic ranger ↔ flexi meter (bumper-car style)

-Bluetooth module (control: as simple as start and stop)

-smart LED (luminescence)

-tiny motors (locomotion)

Installation:

-Webcam

-Tripod

-DSLR

-dim room

 

Some Documentation:

{ I forgot to pick up jumper cables… }

 

Freshly soldered “kinderBot” (with motors, jumper cables, and hot glue… thanks, Rudi)

 

ATtiny shields and ATtiny

ATtiny close-up (finger in frame for scale comparison)

 

Sensors: Bluetooth, IR, Ultrasonic

 

 

 

Video: Nick and I had a programming session. Very informative.

 

 

 

 

 

Rough Software Diagrams and Notes

 

 

 

Robotics_Assignment6

Robotic-Fish

Initial research began in 1989 with MIT’s RoboTuna.

https://en.wikipedia.org/wiki/RoboTuna

It was made to imitate the locomotion of real fish using body-caudal fin propulsion. There have been hundreds of articles on robotic fish, but only a few research projects have achieved full movement.

Initial projects primarily studied locomotion. In contrast, most research efforts today are directed toward controlling behaviors, such as communication and navigation.

Quick image search:

https://www.google.com.hk/search?q=robotic+fish&tbm=isch&tbo=u&source=univ&sa=X&ved=0ahUKEwiB-sytwvPWAhUGn5QKHU9RCCAQ7AkIPw#imgrc=2wejvtIzecHSSM

 

http://www.robotic-fish.net/

Advances and uses outside of research:

Environmental sensing and exploration

https://www.sciencedaily.com/releases/2017/05/170510091553.htm

https://www.roboshoal.com/

https://www.designboom.com/technology/pollution-detecting-robotic-fish/

Many levels of fidelity with the Robo-fish:

https://www.wereblog.com/japanese-invents-life-like-robotic-fish

Robot fish in Maihama Tokyo Bay-

https://www.youtube.com/watch?v=wiK5fxV7ycI

Fish with lasers in South Korea

https://www.youtube.com/watch?v=-wBJPKiaxyg

Advancements at MIT

http://news.mit.edu/2014/soft-robotic-fish-moves-like-the-real-thing-0313

Robot fish toy

https://www.youtube.com/watch?v=31E8ywyUCrw

Robotics_concept_testing

For my final, I intend to create a proto-swarm of small robots. Ideally, the bots are to be as small as possible, perhaps under 10 cm^3 per bot. Creating such small “creatures” will prove very challenging, all while simulating insect behaviors.

To quickly summarize: observers will be given the opportunity to see the natural beauty of insect swarms. Via purposeful movements and group communication, the swarm will explore a dark terrain, which will then be mapped out (with the help of on-board smart LEDs) via long exposure photography, or “light paintings.” A keen eye might notice that the paintings represent an image of some kind. Whether that image is pre-determined or live footage is still to be decided at this stage.

After bouncing a few ideas around with IMA fellow Nick, we came to the conclusion that an ATtiny board would be the best fit. We noted a few disadvantages: low power and low memory. Efficiency will be a major design factor!

Locomotion became another hot topic, as ATtiny boards are not capable of delivering enough power for wheels with high friction coefficients.

The idea of using vibration motors was considered as a workaround; however, stability and controllability became a major concern.

 

Rudi and I spoke about many issues these bots could have, as well as many insights into how to approach the bots as efficiently as possible. More on that soon.

As for locomotion: tiny DC motors became a worthy solution, using their shafts to directly propel the bot.

On to testing.

Rough motor placements–


Actual test with a super light chassis – this is also one of the most annoying things I have ever made.

All seems to work; it is a matter of motor angles and wire connections. Further testing with different chassis is a must.

 

Pintbot, Winkbot, Kilobot, and a few others are robots with similar hardware, most prominently the motor-shaft-to-ground solution for locomotion.

Robotics_D4

The Physical Structure

 

Mirrors aside–for now.

Bugs are weird.

Robotic bugs would also be weird, but kinda neat.
Ideally, the bot would resemble an insect’s movements, physical appearance, and possibly behaviors as much as possible. Hence the long body and two primary motors for locomotion.

Robotics_A5

“From curious to clever, persistent to playful, he has personality x 10. He knows your name, face, and quirks. And best of all, he continues to evolve the more you hang out.”

Cozmo, a Pixar character in real life!

The tiny bot is designed to have as much personality as possible. Cozmo’s primary functions are environment recognition and facial recognition; in other words, Cozmo is actively learning to recognize the world around it (computer vision). Apart from that, Cozmo also seems to learn via audio recognition, human interaction, and problem-solving, making it a surprisingly smart AI, or at least that’s what the marketing team would have us believe.

~45 iterations to create the final product

Product info video

Constant app updates to further Cozmo’s capabilities.

The latest update allowed users to program Cozmo via a visual programming app.

So far, Cozmo is marketed toward kids ages 8-18, as most of its interactions/functions are game/play centered. But as the project continues to evolve, it might become aimed toward adults or even professionals.

$170.00 usd

Thought: what if Cozmo was open-sourced?

Assignment 4b: WFRII

It was very interesting to read about a simple idea, especially one that involves many years of development. The flutist robot first started in the early ’80s and was eventually “completed” in the late 2000s. Initially, the robot was a simple idea with a simple function: blow air and produce sound. But as development continued, every year was dedicated to mimicking a human flutist. This goes to show that even the simplest ideas, after many years of thought and perseverance, can materialize into a complex final product. And even once the product is “finalized,” there are still many areas that can be improved upon.

It also shows that it is best for us as designers/engineers to focus on small tasks one at a time in order to complete a large-scale project with many moving parts. Focus is key.

Assignment 4a: Final Scenario

Scenario for final:
Enter the dimly lit room.

Opposite the entrance is a large “grid” of circular mirrors (6×4). Each mirror is also equipped with warm-colored LEDs (perhaps on the front or on the back of the mirrors). The grid reacts to the presence of a person by individually swiveling each mirror via 2-DOF yaw and pitch, thus positioning each mirror’s face in a specific direction.

A keen eye will notice each mirror initially points away from the center of the grid, creating a convex shape, similar to a fisheye. Once the grid detects a person, each mirror does its best to create a concave shape with the focal point at the person’s face.

Why is it doing this? What is its purpose?

The initial position of the mirrors and their reaction position(s) are representations of how humans often perceive things, especially when there is very little light.

From a distance, we can see dim reflections of the world around us. As we close the gap between ourselves and the world, we find only ourselves in a dim environment. The question is: can we really only see ourselves? What about the rest of the world? Where should our focus be during dark times?
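The mirror-aiming math behind the concave pose could be sketched like this. The pan/tilt model and coordinate convention here are my own assumptions, not something specified for the installation: given a mirror's position on the grid and the detected face's position, compute the yaw and pitch that point the mirror's face straight at the target.

```cpp
#include <cassert>
#include <cmath>

// Yaw/pitch (degrees) that aim a mirror's normal from its grid position at a
// target point. z is "out of the wall" toward the viewer.
struct Angles { double yawDeg, pitchDeg; };

Angles aimAt(double mx, double my, double mz,   // mirror position (meters)
             double tx, double ty, double tz) { // target position (meters)
    const double kPi = 3.14159265358979323846;
    double dx = tx - mx, dy = ty - my, dz = tz - mz;
    double yaw   = std::atan2(dx, dz);                 // left/right swivel
    double pitch = std::atan2(dy, std::hypot(dx, dz)); // up/down tilt
    return {yaw * 180.0 / kPi, pitch * 180.0 / kPi};
}
```

Pointing each mirror's normal at the viewer is what centers their own reflection; the convex “rest” pose is the same math with targets placed behind the grid.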

super rough sketch

Deliverable 3: N-DOF Robot

Make an N-DOF robot. Document its software both with a flow chart and the code itself. State in your documentation where you would use filters or PID control and how you would tune them.

 

Ziggy with (2-3) DOF

New Parts

A Gif of some Ultrasonic work

For a PID system, we decided it would be best that all of Ziggy’s motor and drive functions have a control algorithm to adjust for acceleration differences due to battery consumption, terrain, and possible payloads (much like the elevator example from class). As for a smoother, adding one to the servo wouldn’t hurt, especially on start-up when the servo flicks to 10 degrees.
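A minimal sketch of both pieces: a PID controller for the drive motors and a one-line exponential smoother for the servo. The gains here are illustrative placeholders, not values tuned on Ziggy.

```cpp
#include <cassert>

// Textbook PID: correction = kp*error + ki*integral(error) + kd*d(error)/dt.
struct Pid {
    double kp, ki, kd;
    double integral = 0.0;
    double prevError = 0.0;
    // One control step of length dt; returns the drive correction.
    double update(double setpoint, double measured, double dt) {
        double error = setpoint - measured;
        integral += error * dt;
        double derivative = (error - prevError) / dt;
        prevError = error;
        return kp * error + ki * integral + kd * derivative;
    }
};

// One smoothing step for the servo: move a fraction alpha of the remaining
// way toward the target instead of flicking straight to it.
double smoothStep(double current, double target, double alpha) {
    return current + alpha * (target - current);
}
```

Calling `smoothStep` every loop iteration (alpha around 0.1, say) would ease the start-up flick to 10 degrees; the PID's `dt` would come from the drive loop's period.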

Rough Flowchart

 

 

 

 

 

CODE BELOW:

//LATE_NIGHT TWEAKS

//inspiration from //
// http://www.educ8s.tv

 

 

#include <Servo.h>
Servo servo;
int angle = 10;

int speedPin_M1 = 5; //M1 Speed Control
int speedPin_M2 = 6; //M2 Speed Control
int directionPin_M1 = 4; //M1 Direction Control
int directionPin_M2 = 7; //M2 Direction Control

const int pingPin = 9;
#define trigPin 12
#define echoPin 13

int distance = 100;

 

void setup() {

servo.attach(3);
servo.write(angle);
delay(2000);

Serial.begin(9600);
pinMode(trigPin, OUTPUT);
pinMode(echoPin, INPUT);
}

void loop() {

int distanceR = 0;
int distanceL = 0;
delay(40);

 

 

if(distance<=15)
{
carStop();
delay(100);
carBack(150,150);
delay(300);
carStop();
delay(200);
distanceR = lookRight();
delay(200);
distanceL = lookLeft();
delay(200);

if(distanceR>=distanceL)
{
carTurnRight(250, 0);

}else
{
carTurnLeft(0, 250);

}
}else
{
carAdvance(150,150);
}

//servo

// for(angle = 10; angle < 180; angle++)
// {
// servo.write(angle);
// delay(15);
// }
// // now scan back from 180 to 0 degrees
// for(angle = 180; angle > 10; angle--)
// {
// servo.write(angle);
// delay(15);
// }

//Ultra_1(top)

 

// establish variables for duration of the ping, and the distance result
// in inches and centimeters:
long duration2, inches, cm;

// The PING))) is triggered by a HIGH pulse of 2 or more microseconds.
// Give a short LOW pulse beforehand to ensure a clean HIGH pulse:
pinMode(pingPin, OUTPUT);
digitalWrite(pingPin, LOW);
delayMicroseconds(2);
digitalWrite(pingPin, HIGH);
delayMicroseconds(5);
digitalWrite(pingPin, LOW);

// The same pin is used to read the signal from the PING))): a HIGH pulse
// whose duration is the time (in microseconds) from the sending of the ping
// to the reception of its echo off of an object.
pinMode(pingPin, INPUT);
duration2 = pulseIn(pingPin, HIGH);

// convert the time into a distance
inches = microsecondsToInches(duration2);
cm = microsecondsToCentimeters(duration2);
distance = (int)cm; // update the global distance used by the avoidance check

Serial.print(inches);
Serial.print("in, ");
Serial.print(cm);
Serial.print("cm_1");
Serial.println();

delay(100);

 

if(inches >= 31){
carAdvance(150,150);
}
else{
carAdvance(250,250);
}
} // end of loop()

 

 

long microsecondsToInches(long microseconds) {
// According to Parallax’s datasheet for the PING))), there are 73.746
// microseconds per inch (i.e. sound travels at 1130 feet per second).
// This gives the distance travelled by the ping, outbound and return,
// so we divide by 2 to get the distance of the obstacle.
// See: http://www.parallax.com/dl/docs/prod/acc/28015-PING-v1.3.pdf
return microseconds / 74 / 2;
}

long microsecondsToCentimeters(long microseconds) {
// The speed of sound is 340 m/s or 29 microseconds per centimeter.
// The ping travels out and back, so to find the distance of the object we
// take half of the distance travelled.
return microseconds / 29 / 2;
}

 

 

///servo

int lookRight()
{
servo.write(50);
delay(500);

//Ultra Sound_2(front)
//mathfor example
long duration, distance;
digitalWrite(trigPin, LOW); // Added this line
delayMicroseconds(2); // Added this line
digitalWrite(trigPin, HIGH);
// delayMicroseconds(1000); – Removed this line
delayMicroseconds(10); // Added this line
digitalWrite(trigPin, LOW);
duration = pulseIn(echoPin, HIGH);
distance = (duration/2) / 29.1;

 

if (distance >= 200 || distance <= 0){
Serial.println("Out of range");
}
else {
Serial.print(distance);
Serial.println(" cm_2");
}
delay(500);

servo.write(115); // re-center the servo before returning
return distance;
}
int lookLeft()
{
servo.write(170);
delay(500);

//Ultra Sound_2(front)
//mathfor example
long duration, distance;
digitalWrite(trigPin, LOW); // Added this line
delayMicroseconds(2); // Added this line
digitalWrite(trigPin, HIGH);
// delayMicroseconds(1000); – Removed this line
delayMicroseconds(10); // Added this line
digitalWrite(trigPin, LOW);
duration = pulseIn(echoPin, HIGH);
distance = (duration/2) / 29.1;

 

if (distance >= 200 || distance <= 0){
Serial.println("Out of range");
}
else {
Serial.print(distance);
Serial.println(" cm_2");
}
delay(500);

servo.write(115); // re-center the servo before returning
return distance;
}

 

 

///carStuff

void carStop() { // Motor Stop
digitalWrite(speedPin_M2, 0);
digitalWrite(directionPin_M1, LOW);
digitalWrite(speedPin_M1, 0);
digitalWrite(directionPin_M2, LOW);
}
void carBack(int leftSpeed, int rightSpeed) { //Move backward
analogWrite (speedPin_M2, leftSpeed); //PWM Speed Control
digitalWrite(directionPin_M1, HIGH);
analogWrite (speedPin_M1, rightSpeed);
digitalWrite(directionPin_M2, HIGH);
}
void carAdvance(int leftSpeed, int rightSpeed) { //Move forward
analogWrite (speedPin_M2, leftSpeed);
digitalWrite(directionPin_M1, LOW);
analogWrite (speedPin_M1, rightSpeed);
digitalWrite(directionPin_M2, LOW);
}
void carTurnLeft(int leftSpeed, int rightSpeed) { //Turn Left
analogWrite (speedPin_M2, leftSpeed);
digitalWrite(directionPin_M1, LOW);
analogWrite (speedPin_M1, rightSpeed);
digitalWrite(directionPin_M2, HIGH);
}
void carTurnRight(int leftSpeed, int rightSpeed) { //Turn Right
analogWrite (speedPin_M2, leftSpeed);
digitalWrite(directionPin_M1, HIGH);
analogWrite (speedPin_M1, rightSpeed);
digitalWrite(directionPin_M2, LOW);
}

#include <Servo.h>
Servo servo;
int angle = 10;

int speedPin_M1 = 5;     //M1 Speed Control
int speedPin_M2 = 6;     //M2 Speed Control
int directionPin_M1 = 4;     //M1 Direction Control
int directionPin_M2 = 7;     //M2 Direction Control

const int pingPin = 9;
#define trigPin 12
#define echoPin 13

void setup() {

  servo.attach(3);
  servo.write(angle);
  // put your setup code here, to run once:
  Serial.begin(9600);
  pinMode(trigPin, OUTPUT);
  pinMode(echoPin, INPUT);
}

void loop() {




 //servo

  for(angle = 10; angle < 180; angle++)  
  {                                  
    servo.write(angle);               
    delay(15);                   
  } 
  // now scan back from 180 to 0 degrees
  for(angle = 180; angle > 10; angle--)    
  {                                
    servo.write(angle);           
    delay(15);       
  } 

//Ultra_1(top)


  // establish variables for duration of the ping, and the distance result
  // in inches and centimeters:
  long duration2, inches, cm;

  // The PING))) is triggered by a HIGH pulse of 2 or more microseconds.
  // Give a short LOW pulse beforehand to ensure a clean HIGH pulse:
  pinMode(pingPin, OUTPUT);
  digitalWrite(pingPin, LOW);
  delayMicroseconds(2);
  digitalWrite(pingPin, HIGH);
  delayMicroseconds(5);
  digitalWrite(pingPin, LOW);

  // The same pin is used to read the signal from the PING))): a HIGH pulse
  // whose duration is the time (in microseconds) from the sending of the ping
  // to the reception of its echo off of an object.
  pinMode(pingPin, INPUT);
  duration2 = pulseIn(pingPin, HIGH);

  // convert the time into a distance
  inches = microsecondsToInches(duration2);
  cm = microsecondsToCentimeters(duration2);

  Serial.print(inches);
  Serial.print("in, ");
  Serial.print(cm);
  Serial.print("cm_1");
  Serial.println();

  delay(100);





  
 //Ultra Sound_2(front)

  long duration, distance;
  digitalWrite(trigPin, LOW);  // Added this line
  delayMicroseconds(2); // Added this line
  digitalWrite(trigPin, HIGH);
//  delayMicroseconds(1000); - Removed this line
  delayMicroseconds(10); // Added this line
  digitalWrite(trigPin, LOW);
  duration = pulseIn(echoPin, HIGH);
  distance = (duration/2) / 29.1;


  if (distance >= 200 || distance <= 0){
    Serial.println("Out of range");
  }
  else {
    Serial.print(distance);
    Serial.println(" cm_2");
  }
  delay(500);
}







long microsecondsToInches(long microseconds) {
  // According to Parallax's datasheet for the PING))), there are 73.746
  // microseconds per inch (i.e. sound travels at 1130 feet per second).
  // This gives the distance travelled by the ping, outbound and return,
  // so we divide by 2 to get the distance of the obstacle.
  // See: http://www.parallax.com/dl/docs/prod/acc/28015-PING-v1.3.pdf
  return microseconds / 74 / 2;
}

long microsecondsToCentimeters(long microseconds) {
  // The speed of sound is 340 m/s or 29 microseconds per centimeter.
  // The ping travels out and back, so to find the distance of the object we
  // take half of the distance travelled.
  return microseconds / 29 / 2;
}

///carStuff

void carStop() {                //  Motor Stop
  digitalWrite(speedPin_M2, 0);
  digitalWrite(directionPin_M1, LOW);
  digitalWrite(speedPin_M1, 0);
  digitalWrite(directionPin_M2, LOW);
}
void carBack(int leftSpeed, int rightSpeed) {       //Move backward
  analogWrite (speedPin_M2, leftSpeed);             //PWM Speed Control
  digitalWrite(directionPin_M1, HIGH);
  analogWrite (speedPin_M1, rightSpeed);
  digitalWrite(directionPin_M2, HIGH);
}
void carAdvance(int leftSpeed, int rightSpeed) {     //Move forward
  analogWrite (speedPin_M2, leftSpeed);
  digitalWrite(directionPin_M1, LOW);
  analogWrite (speedPin_M1, rightSpeed);
  digitalWrite(directionPin_M2, LOW);
}
void carTurnLeft(int leftSpeed, int rightSpeed) {    //Turn Left
  analogWrite (speedPin_M2, leftSpeed);
  digitalWrite(directionPin_M1, LOW);
  analogWrite (speedPin_M1, rightSpeed);
  digitalWrite(directionPin_M2, HIGH);
}
void carTurnRight(int leftSpeed, int rightSpeed) {    //Turn Right
  analogWrite (speedPin_M2, leftSpeed);
  digitalWrite(directionPin_M1, HIGH);
  analogWrite (speedPin_M1, rightSpeed);
  digitalWrite(directionPin_M2, LOW);
}

VR and edgy!

Oogie (rift)

“Oogie is an exciting new kind of exploration game – the interactive documentary,” about an adventurous beetle.

6X9(rift/gear)

A VR simulation of solitary confinement

Adr1ft

“ADR1FT is an immersive FPX (First-Person Experience) that tells the story of an astronaut in peril. Floating silently amongst the wreckage of a destroyed space station with no memory and a damaged EVA suit. The sole survivor struggles to determine the cause of the catastrophic event, stay alive and return home safely to Earth.”

 

 

https://8i.com/

Volumetric video in VR