Interaction Lab Final Project (Spring 2017)
Collaborator: Esther Liu
For the final project, we decided to make a nap box that comforts people and connects them to their environment. Unlike the midterm project, where we knew exactly what the final piece should look like from the very beginning of our ideation, the final project was a different story: we kept changing and developing our ideas as we built.
Originally we simply wanted to do a real-time visualization of our surroundings through a simple robot to address environmental concerns. However, that was beyond the course scope and our current skill set. So we started thinking: what about creating something immersive that connects the person with the environment? Meanwhile, we noticed that many of our friends were complaining about constant pressure, lack of sleep, and poor concentration during the busy final weeks. After discussions with Moon and other friends, Esther and I finally decided to make a nap box that not only lets users temporarily get away from the hustle and bustle and recharge, but also raises their awareness and appreciation of the environment they are in through real-time animation, which actually goes back to our initial idea.
Materials and tools:
- acrylic paint
- software: Processing, Arduino, Illustrator
Part I: Programming
We prepared two modes of animated effects that the user can see and play with when they lie down and stare at the iPad screen above. In the first mode (starMode), the user can create their own star field through joystick interaction, as well as through sensor values received from the environment, such as brightness and loudness. My first step was to create a simple star field without any human or environmental interaction. The star field part is a little tricky, because Processing's syntax differs from p5.js, so I did some research online (Daniel Shiffman's tutorials are so useful!), wrote and tested the star class in p5.js first, and then "translated" the code into Processing. I am glad that it worked!
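For reference, here is a minimal Processing version of the star class, in the spirit of Shiffman's starfield tutorial (the star count and speed here are illustrative, not our exact values):

```processing
Star[] stars = new Star[400];

void setup() {
  size(640, 480);
  for (int i = 0; i < stars.length; i++) {
    stars[i] = new Star();
  }
}

void draw() {
  background(0);
  translate(width/2, height/2);   // stars fly out from the center
  for (Star s : stars) {
    s.update(10);                 // fixed speed for now; sensors come later
    s.show();
  }
}

class Star {
  float x, y, z;

  Star() {
    reset();
  }

  void reset() {
    x = random(-width, width);
    y = random(-height, height);
    z = random(1, width);
  }

  void update(float speed) {
    z -= speed;          // move the star toward the viewer
    if (z < 1) reset();  // recycle stars that pass the camera
  }

  void show() {
    // project the 3D position onto the 2D screen
    float sx = map(x / z, 0, 1, 0, width);
    float sy = map(y / z, 0, 1, 0, height);
    float r  = map(z, 0, width, 6, 0);   // nearer stars look bigger
    noStroke();
    fill(255);
    ellipse(sx, sy, r, r);
  }
}
```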
The second step was to feed the different sensor values into the existing code, so the star field does not just move randomly but responds to the surrounding environment (brightness and loudness) as well as to joystick control. You can view the star field as a reflection of the environment around you, and you can intervene in the visualization through the joystick. To visualize the surrounding environment, the Processing Sound and Video libraries are necessary. Studying the libraries and getting the volume and average brightness values from the laptop's mic and webcam was not as challenging as I expected. However, we ran into a problem sending multiple values from Arduino to Processing; we asked Moon for help and solved it, as sketched below.
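The usual approach is to send all the values as one comma-separated line from the Arduino and split that string in Processing. Here is a minimal sketch of that pattern (the port index, baud rate, and value order are assumptions):

```processing
import processing.serial.*;

Serial port;
int joyX = 512, joyY = 512, joyClick = 0;

void setup() {
  size(640, 480);
  // Serial.list()[0] is an assumption; print the list to find your port
  port = new Serial(this, Serial.list()[0], 9600);
  port.bufferUntil('\n');   // call serialEvent() once a full line arrives
}

void draw() {
  background(0);
  text(joyX + ", " + joyY + ", " + joyClick, 20, 20);
}

// On the Arduino side, loop() would print something like:
//   Serial.print(analogRead(A0)); Serial.print(',');
//   Serial.print(analogRead(A1)); Serial.print(',');
//   Serial.println(digitalRead(2));
void serialEvent(Serial p) {
  String line = p.readStringUntil('\n');
  if (line == null) return;
  String[] vals = split(trim(line), ',');   // e.g. "512,498,0"
  if (vals.length == 3) {
    joyX = int(vals[0]);
    joyY = int(vals[1]);
    joyClick = int(vals[2]);
  }
}
```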
Moreover, we added a sound file to the Processing sketch to create a meditative feeling and make the audiovisual effect more appealing. In short: the average brightness of the environment controls the brightness of the stars, the loudness of the environment controls the speed of the stars' movement, and the joystick controls the perspective of the star field through four values (up, down, left, right).
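Here is a simplified sketch of how those three mappings can be wired up with the official Sound and Video libraries (the ranges and variable names are illustrative):

```processing
import processing.sound.*;
import processing.video.*;

AudioIn in;
Amplitude amp;
Capture cam;
int joyX = 512, joyY = 512;   // updated by serialEvent() in the full sketch

void setup() {
  size(640, 480);
  in = new AudioIn(this, 0);
  in.start();
  amp = new Amplitude(this);
  amp.input(in);
  cam = new Capture(this, 320, 240);
  cam.start();
}

// average brightness of the webcam frame, 0..255
float averageBrightness() {
  if (cam.available()) cam.read();
  cam.loadPixels();
  if (cam.pixels.length == 0) return 0;   // no frame yet
  float sum = 0;
  for (int i = 0; i < cam.pixels.length; i++) {
    sum += brightness(cam.pixels[i]);
  }
  return sum / cam.pixels.length;
}

void draw() {
  background(0);
  // loudness of the room -> how fast the stars fly
  float speed = constrain(map(amp.analyze(), 0, 0.5, 2, 30), 2, 30);
  // brightness of the room -> how bright the stars are drawn
  float starBrightness = map(averageBrightness(), 0, 255, 60, 255);
  // joystick -> shift the vanishing point, i.e. the viewing perspective
  translate(width/2 + map(joyX, 0, 1023, -100, 100),
            height/2 + map(joyY, 0, 1023, -100, 100));
  // ...update and draw the stars using speed and starBrightness...
}
```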
For the second mode (randomMode), I drew inspiration from the screen savers on our laptops. It has similar functionality to starMode, but different visuals. At first I did not know how to create the visual; a reference I found online helped a lot. In this mode, brightness controls the shape of the pattern and loudness controls its size, while the joystick adjusts the sound volume (up + down) and the pattern's opacity (left + right). You can also stop and resume the music by clicking the joystick. I spent a lot of time adjusting the parameters (rad, shap_ind) to improve the visual and make the sensor values fit the controls.
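Here is a rough sketch of how a screensaver-style pattern can be built from sin() and frameCount; rad and shapeIndex stand in for the rad and shap_ind parameters above, with made-up constants:

```processing
void setup() {
  size(640, 480);
  background(0);
}

void draw() {
  // translucent overlay instead of a hard clear, so the pattern
  // leaves a soft trail (background() ignores alpha on the main canvas)
  noStroke();
  fill(0, 20);
  rect(0, 0, width, height);

  translate(width/2, height/2);
  float shapeIndex = 5;   // would be driven by room brightness
  float rad = 150;        // would be driven by room loudness
  stroke(255, 60);        // low alpha here is the opacity knob
  noFill();
  beginShape();
  for (float a = 0; a < TWO_PI; a += 0.02) {
    // the radius wobbles with both the angle and the frame count,
    // so the shape slowly rotates and "breathes"
    float r = rad + 60 * sin(shapeIndex * a + frameCount * 0.02);
    vertex(r * cos(a), r * sin(a));
  }
  endShape(CLOSE);
}
```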
Here is what a sample pattern looks like.
Switching modes was another small issue we faced. After I created the two modes, I found that even when I used different keys to call each mode's function, the two modes were always drawn on top of each other instead of replacing one another. I later figured out that I should not put the key detection in the draw() loop; instead, I needed a separate keyPressed() function that resets the background, plus a boolean value, mode, to control the switch. Below is a simplified version of the working code (the exact key bindings are illustrative):
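```processing
boolean mode = true;   // true -> starMode, false -> randomMode

void setup() {
  size(640, 480);
}

void draw() {
  if (mode) {
    // starMode drawing goes here
  } else {
    // randomMode drawing goes here
  }
}

// key detection lives outside draw(): keyPressed() fires once per press
void keyPressed() {
  if (key == 's') mode = true;
  if (key == 'r') mode = false;
  background(0);   // reset the canvas so the two modes don't stack
}
```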
Part II: Physical Installation
We designed the nap box's shape in Adobe Illustrator and cut five individual pieces on the laser cutter. After gluing them together, I covered the box with sketch paper and used acrylic paint to color the surface. I waited for it to dry and added more layers to deepen the color. Finally, I sprayed the box to keep the color from coming off. We also created a small box to hold the Arduino and joystick, since we wanted the user to interact with the visual effect through the joystick.
This is the way to use our nap box!
The view inside the box~
PPT Slide: https://drive.google.com/open?id=0B_DiZA6WY5uOU2IxQUVqWm9uYmc
Code download: https://drive.google.com/open?id=0B_DiZA6WY5uOZHZERTREbHZVSlk
Key techniques used in this project:
- serial communication (multiple sensor values, joystick control)
- Processing Video and Sound libraries (brightness tracking, sound analysis)
- object-oriented programming (Perlin noise, ArrayList, mapping, sin(), frameCount, etc.)
I would like to express my sincerest gratitude to Professor Moon for his guidance throughout the course and the inspiration he has given me. I am also grateful to Jiwon, Aven, and all the IMA fellows for their generous support on each assignment. I have learned so much from this semester's IMA courses, which really made my last semester at NYU Shanghai fulfilling and meaningful.