Deliverable 5: Final Update

Final Project Description: It is a DJ robot: a robot that dances and adds different tracks of music at the same time. I will place RFID cards at different locations, and I will provide plenty of black line so people can design their own tracks connecting the RFID cards together. People can choose which cards, and how many, they want to use; they don’t have to choose all of them.

Deliverable 5:-)

Jerry the Snail: loves to eat and sleep. Sounds relatable.

For ages 8-80 years old

For my final robot, I want to create a pet snail that imitates a real snail by capturing the actions of an actual one. I watched some snail videos to observe what they do and how they go about their day, and it was really interesting and helpful to integrate into the concept and execution of my project. From the videos I saw on YouTube, they are very slow creatures who have nothing else on their agenda but to eat.

I began by designing the shell of my snail, which is a classic snail design. I was struggling in the labs to get it just right because of complications, but I went to talk to Rudi and felt reassured to go for it and not take no for an answer. Upon completing the 3D printing of my shell, I decided to begin with the baseboard of the pet. I kept the bottom a simple structure and attached 4 wheels with traction, 4 DC motors, a mini servo, and batteries. I then covered it with a cloth to make it look like the body of a snail. The sensors I have used are: an ultrasonic sensor to measure distance and make sure he does not run into a wall or an object about 15 cm in front of him, an infrared sensor with LED lights to detect his environment, and a color sensor to detect a piece of lettuce once he moves to within about 10 cm of it.
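The sensor thresholds above boil down to a simple decision routine. Here is a minimal sketch in Python of that logic (the real build would run as a microcontroller sketch; the function name, inputs, and action labels are all hypothetical placeholders, only the 15 cm and 10 cm thresholds come from the write-up):

```python
def decide_action(distance_cm, sees_lettuce):
    """Pick Jerry's next move from two sensor readings.

    distance_cm: ultrasonic reading to the nearest object.
    sees_lettuce: True when the color sensor matches lettuce green.
    Thresholds (10 cm to eat, 15 cm to avoid) follow the description above.
    """
    if sees_lettuce and distance_cm <= 10:
        return "eat"       # lettuce within reach: stop and munch
    if distance_cm <= 15:
        return "turn"      # wall or object ahead: avoid it
    return "forward"       # clear path: keep crawling

# Quick check of the three cases
print(decide_action(30, False))  # forward
print(decide_action(12, False))  # turn
print(decide_action(8, True))    # eat
```

Note that a piece of lettuce spotted at 12 cm still triggers "turn" here; in practice the obstacle-avoidance rule would need an exception once the color sensor confirms food.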

When testing, I was able to have Jerry move back and forth and detect his environment. He is slow and steady, but hey, slow and steady wins the race!! He also reacts to the lights being turned off and on thanks to the infrared sensor. The only problems I ran into were fitting all the hardware inside Jerry and figuring out the color sensor and how he eats lettuce, but I’m not giving up, and Jerry will soon have his meal of delicious greens.

Robotics_Deliverable5

 

Detailed Documentation: lucibufagens (working title)

I want to begin with that feeling most people get when they see something out of the corner of their eye. In particular, something small, dark, fast, and close to the ground. I often compare that feeling to anxiousness and phobia. Your body stiffens up; your brain is on alert, expecting the worst. Out of the corner of your other eye, the little speck zooms into plain sight! Heart rates skyrocket!

I find it interesting that a large percentage of people fear pests or creepy-crawlies in some way or another. A close friend has a very low tolerance for creepy-crawlies, ants in particular. Ants simply freak my friend out. So much so that the presence of an ant will cause them to evacuate the room until the pest is disposed of, ideally in a humane or non-harming manner.

But why are people frightened by these tiny creatures? Firsthand experience, texture, structure, or even psychological cues? I am not sure why. Personally, I never found them to be very frightening, but every now and then, especially when it's dark: BAM! My heart begins to pound!

On a different note, I find the bioluminescence of fireflies to be awe-inspiring. Up until recently, I had never actually experienced a firefly in real life. I had no idea they were actually a species of beetle. I was blown away when I saw them in person. The very first time I saw one, I immediately wanted to hold it; I was so drawn in. There was a moment in particular, after I had reached out to catch one, when I realized that I was getting nervous, experiencing the very same sensation one gets when they see a creepy-crawly. Thus I panicked, and the bug flew away.

How is it that one can be attracted and yet terrified by a tiny light in the dark?

My friend would say that I found beauty in something unpleasant.

This project hopes to further illustrate the sensation I got when interacting with the fireflies by creating a robotic installation that mimics the behaviors of common household bugs.

By creating a swarm of tiny robots, I hope to replicate the common phobia of creepy-crawlies. At the same time, I hope to use the swarm (or colony) to reveal the interesting paths and strides the swarm builds. I hope to do so by equipping each bug with a smart LED that will emit its designated color at certain times in order to emulate different pixels in an image. During the run time, a long-exposure photograph will capture the “light paintings” the bug-bots create. Perhaps we will find beauty in these pests.
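The pixel-emulation idea can be sketched as a lookup: each bot is assigned one pixel of a target image and flashes that pixel's color at its scheduled moment during the exposure. A rough Python sketch of the mapping (the image data, function name, and 2-second spacing are invented for illustration):

```python
# Target "image" as a tiny grid of RGB colors (placeholder data)
image = [
    [(255, 0, 0), (0, 255, 0)],
    [(0, 0, 255), (255, 255, 0)],
]

def led_command(bot_id, width=2):
    """Map one bot to one pixel: which color to flash, and when.

    Bots are numbered row by row; each flashes 2 seconds apart
    so the long exposure layers them into the full picture.
    """
    row, col = divmod(bot_id, width)
    color = image[row][col]
    flash_at = bot_id * 2.0  # seconds into the exposure
    return color, flash_at

color, t = led_command(3)
print(color, t)  # (255, 255, 0) 6.0
```

Staggering the flash times like this would also keep any single frame of the scene dim, so only the accumulated exposure reveals the picture.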

[edits to be added]

Constraints:

Size- for this project to fully emulate bugs, each bot will need to be as small as possible. I don’t think the same effect will occur with large bots.

Power- Powering such a small device will be rather difficult. Chips like the ATtiny can only handle 5 V.

Memory- Programming on an ATtiny85 will be difficult because the chip can only hold ~8 KB of program memory, so the program will have to be as efficient as possible.

–unexplored territory

List of Parts

Bot

ATtiny85

-ATtiny programming shield (Nick has built a custom one)

-ATtiny prototyping USB

Sensors and electronics

-Light sensor (scatter effect, similar to roaches)

-IR sensor (distance / bug identifier) ↔ ultrasonic ranger ↔ flex sensor (bumper-car style)

-Bluetooth module (control: as simple as start and stop)

-Smart LED (luminescence)

-Tiny motors (locomotion)

Installation:

-Webcam

-Tripod

-DSLR

-dim room

 

Some Documentation:

{I forgot to pick up jumper cables…}

 

Freshly soldered “kinderBot” (with motors, jumper cables, and hot glue... ty Rudi)

 

ATtiny shields and ATtiny

ATtiny close-up (finger in frame for scale comparison)

 

Sensors: Bluetooth, IR, ultrasonic

 

 

 

Video: Nick and I had a programming session. Very informative.

 

 

 

 

 

Rough Software Diagrams and Notes

 

 

 

Assignment 6

When I did a Google search for bio-inspired robots, I came across Harvard University’s first autonomous, entirely soft robot. It is powered by a chemical reaction controlled by microfluidics and has no electronics. The project began at the Harvard school of engineering in 2016, when a group of researchers came together to give soft robotics a try. I found it very interesting because it is the beginning of what could be the future: replacing rigid hardware, boards, and batteries with robots that are entirely soft. The octobot contains fuel storage and is powered by gas under pressure. It has dexterity similar to an actual octopus and can perform similar movements. Inside the octobot is liquid fuel (hydrogen peroxide), which transforms into a large amount of gas that flows into the octobot’s arms and inflates them. All it takes is soft lithography, molding, and 3D printing. It may not perform many tasks, but it resembles an octopus physically and internally with no hard or rigid components. I think it is cool to note that this could be the future: hardware has the potential to be soft and to fully embrace versatility, as if it weren’t a robot anymore.

Robotics_Assignment6

Robotic-Fish

Initial research in 1989 with the RoboTuna by MIT.

https://en.wikipedia.org/wiki/RoboTuna

Made to imitate the locomotion of actual fish using body-caudal fin propulsion. There have been hundreds of articles on robotic fish, but only a few research projects have been capable of full movement.

Initial projects primarily studied locomotion. In contrast, most research efforts today are directed toward controlling behaviors, such as communication and navigation.

Quick image search:

https://www.google.com.hk/search?q=robotic+fish&tbm=isch&tbo=u&source=univ&sa=X&ved=0ahUKEwiB-sytwvPWAhUGn5QKHU9RCCAQ7AkIPw#imgrc=2wejvtIzecHSSM

 

http://www.robotic-fish.net/

Advances and uses outside of research:

Environmental sensing and exploration

https://www.sciencedaily.com/releases/2017/05/170510091553.htm

https://www.roboshoal.com/

https://www.designboom.com/technology/pollution-detecting-robotic-fish/

Many levels of fidelity with the Robo-fish:

https://www.wereblog.com/japanese-invents-life-like-robotic-fish

Robot fish in Maihama Tokyo Bay-

https://www.youtube.com/watch?v=wiK5fxV7ycI

Fish with lasers in South Korea

https://www.youtube.com/watch?v=-wBJPKiaxyg

Advancements at MIT

http://news.mit.edu/2014/soft-robotic-fish-moves-like-the-real-thing-0313

Robot fish toy

https://www.youtube.com/watch?v=31E8ywyUCrw

Robotics_concept_testing

For my final, I intend to create a proto-swarm of small robots. Ideally, the bots are to be as small as possible, perhaps smaller than 10 cm^3 per bot. Creating such small “creatures” will prove to be very challenging, all while simulating insect behaviors.

To quickly summarize: observers will be given the opportunity to see the natural beauty of insect swarms. Via purposeful movements and group communication, the swarm is to explore a dark terrain, which will then be mapped out (with the help of on-board smart LEDs) via long-exposure photography, or “light paintings”. A keen eye might notice that the paintings represent an image of some kind. Whether the images are pre-determined or based on live footage is still to be decided at this stage.
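The long-exposure mapping can be previewed in software before pointing a camera at the swarm: accumulate every position each bot's LED visits onto one canvas, the way an open shutter stacks light trails. A rough simulation (the function name and path data are invented for illustration):

```python
def light_painting(paths, width, height):
    """Accumulate bot paths into a single simulated 'exposure'.

    paths: one list of (x, y) grid positions per bot.
    Returns a grid counting how often each cell was lit, which is
    roughly what the open shutter records as brightness.
    """
    canvas = [[0] * width for _ in range(height)]
    for path in paths:
        for x, y in path:
            canvas[y][x] += 1
    return canvas

# Two bots tracing short trails on a 4x3 canvas; they cross at (2, 0)
paths = [[(0, 0), (1, 0), (2, 0)], [(2, 0), (2, 1), (2, 2)]]
exposure = light_painting(paths, 4, 3)
print(exposure)  # [[1, 1, 2, 0], [0, 0, 1, 0], [0, 0, 1, 0]]
```

Cells where trails overlap end up brighter, which is exactly the effect that should make the crossing paths of the swarm readable in the final photograph.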

After bouncing a few ideas around with IMA fellow Nick, we came to the conclusion that an ATtiny board would be the best fit. We noted a few disadvantages: low power and low memory. Efficiency will be a major design factor!

Locomotion became another hot topic, as ATtiny boards are not capable of delivering enough power for wheels with high friction coefficients.

The idea of having vibration motors was considered as a workaround; however, stability and controllability became a major concern.

 

Rudi and I spoke about many issues these bots could have as well as many insights as to how to approach the bots as efficiently as possible. More on that soon.

As for locomotion:

Tiny DC motors became a worthy solution, using their shafts to directly propel the bot.

On to testing.

Rough motor placements–


Actual test with a super-light chassis. This is also one of the most annoying things I have ever made.

All seems to work; it is a matter of motor angles and wire connections. Further testing with different chassis is a must.

 

Pintbot, Winkbot, Kilobot, and a few others are robots with similar hardware, most prominently the motor-shaft-to-ground solution for locomotion.

Deliverable 4 (ss9952)

This is my pet snail, Jerry. Jerry will have a 3D-printed shell that will include LED lights, both for aesthetic pleasure and because it will react to light with infrared sensors. Jerry will have 3 settings that are essential to keep him alive:

1.) night and day- LED lights light up when the environment is dark and turn off when the lights are on, so he knows when it is time to sleep. (infrared sensor)

2.) touch sensor- positive response for being petted & negative response if he hasn’t been petted in 5 minutes. (motion sensor)

3.) feeding Jerry- when you put algae up to 20 centimeters away, Jerry will get excited, and if you don’t feed him every 10 minutes he gets mad. (distance sensor)
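The three settings above read like a small state machine driven by timers and sensor readings. A sketch of how Jerry's mood could be tracked in Python (the function name, mood labels, and priority order are illustrative assumptions; the 5-minute, 10-minute, and 20 cm values come from the list):

```python
def jerry_mood(is_dark, seconds_since_pet, seconds_since_fed, algae_distance_cm):
    """Combine Jerry's three care rules into one mood check.

    Rules from the list above: sleep when dark, negative response
    if not petted for 5 min or not fed for 10 min, and excitement
    when algae comes within 20 cm.
    """
    if is_dark:
        return "sleeping"   # night mode: LEDs on, time to rest
    if seconds_since_fed > 10 * 60:
        return "mad"        # feeding overdue
    if seconds_since_pet > 5 * 60:
        return "grumpy"     # petting overdue
    if algae_distance_cm <= 20:
        return "excited"    # snack incoming!
    return "content"

print(jerry_mood(False, 60, 60, 100))  # content
print(jerry_mood(False, 60, 60, 15))   # excited
print(jerry_mood(True, 0, 0, 100))     # sleeping
```

Putting the "mad" check before the "grumpy" one is a design choice: hunger outranks loneliness, but any ordering would work as long as only one mood wins per tick.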

 

 

This is a really bad sketch but I envision the LED lights to be displayed this way.

 

This is the algae that will be detected by distance and color.

Robotics_D4

The Physical Structure

 

Mirrors aside–for now.

Bugs are weird.

Robotic bugs would also be weird, but kinda neat.
Ideally, the bot would resemble an insect’s movements, physical appearance, and possibly behaviors as much as possible. Hence the long body and two primary motors for locomotion.

Robotics_A5

“From curious to clever, persistent to playful, he has personality x 10. He knows your name, face, and quirks. And best of all, he continues to evolve the more you hang out.”

Cozmo, a Pixar character in real life!

The tiny bot is designed to have as much personality as possible. Cozmo’s primary functions are environment recognition and facial recognition; in other words, Cozmo is actively learning how to recognize the world around it (computer vision). Apart from that, Cozmo seems to also learn via audio recognition, human interaction, and problem-solving, making it a surprisingly smart AI, or at least that’s what the marketing team has led us to believe.

~45 iterations to create the final product

Product info video

Constant app updates to further Cozmo’s capabilities.

The latest update allowed users to program Cozmo via a visual programming app.

So far, Cozmo is marketed toward kids ages 8-18, as most of its interactions/functions are game/play centered. But as the project continues to evolve, it might become aimed at adults or even professionals.

$170.00 USD

Thought: what if Cozmo was open-sourced?