NOC – Week 1 – Bouncing Ball – Elaine

This is my first assignment during the semester, and I did not have this account back then…

For this project, I tried to create a bouncing ball that is, in some sense, controlled by the user. Press the up, down, left, and right arrow keys to steer the ball, and press the space key to stop the movement.
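A minimal sketch of how the key handling could be wired up in p5.js (illustrative only; `ball` and the speed increments are placeholders, not the exact code from the repo):

```javascript
// Illustrative sketch: arrow keys nudge the ball's velocity, space stops it.
// `ball` is assumed to have a p5.Vector `vel`.
function keyPressed() {
  if (keyCode === UP_ARROW)    ball.vel.y -= 1;
  if (keyCode === DOWN_ARROW)  ball.vel.y += 1;
  if (keyCode === LEFT_ARROW)  ball.vel.x -= 1;
  if (keyCode === RIGHT_ARROW) ball.vel.x += 1;
  if (key === ' ')             ball.vel.set(0, 0); // space stops the movement
}
```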

The project demo is here:

https://elaineang.github.io/NatureOfCode/Assignment1/

The complete source code can be found here:

https://github.com/ElaineAng/NatureOfCode/tree/master/Assignment1

 

NOC – Final Project – Your NYUSH – Elaine

For my final project, I made an interactive drawing panel that lets users drag and create whatever shape they want, within certain constraints. This project is meant to express the idea that with limited resources, we can create infinite possibilities. The project was inspired by a hiring site from Google Creative Lab (www.creativelab5.com). Check the link below for my project demo:

https://elaineang.github.io/NatureOfCode/Final/

Two basic operations:

  1. Click on an end point of “N Y U S H”, and after you see the control points (shown as little rectangles), you can drag these five letters into any shape.
  2. Click on any of the flying torches and drag them. A torch will stop moving once you click on it. If you want it to move again, hover the mouse over that torch and type “d”.

The complete source code is on my GitHub. Check it out here:

https://github.com/ElaineAng/NatureOfCode/tree/master/Final

(Check the git log to see my development process if you are interested 🙂 There were some weird experiments going on…)

The most interesting part of this project for me was providing users with a way of controlling different objects on the screen, while leaving them maximum flexibility to create their ideal “NYUSH”. I’ll briefly explain how I manage the letter and flying-torch interactions.

I’ll separate my work into the “letter” part and the “torch” part.

Letters:

Each of the letters “N”, “Y”, “U”, and “S” is controlled by 4 control points, and “H” is controlled by 6. These control points are statically defined during setup so that the letters initially appear normally on the screen. The screenshots below illustrate the idea of control points.

WechatIMG1 WechatIMG2 WechatIMG3

WechatIMG4           WechatIMG5

  • Create the control-drag effect:

In the main drawing loop, I keep computing the distance between the current mouse position and all of the control points. If the distance to a point is less than a certain threshold, I record which control point it is and which letter it belongs to and mark that letter as under control; if the user clicks on the point at the same time, the control points for that letter show up. For “N”, “Y”, and “H”, the letters are simply control points connected with lines (actually, patterns that form a line); for “U” and “S”, they are Bezier curves constructed from 4 control points using p5’s built-in Bezier function. When a mouseDragged() event occurs (meaning I am dragging a certain control point), I keep tracking the mouse position and updating the value of that point.
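A simplified sketch of that hit-testing and dragging logic (not the exact code from the repo; `letters`, the 10 px threshold, and the drawing code are illustrative):

```javascript
// Simplified sketch: hit-test control points against the mouse and drag them.
let letters = [];          // each letter stores its control points as p5.Vectors
let activePoint = null;    // the control point currently being dragged

function setup() {
  createCanvas(800, 600);
  // e.g. one placeholder letter with four statically defined control points
  letters.push({
    controlPoints: [createVector(100, 300), createVector(100, 100),
                    createVector(200, 300), createVector(200, 100)],
    underControl: false,
  });
}

function draw() {
  background(255);
  if (!mouseIsPressed) {                 // only re-pick a point when not dragging
    activePoint = null;
    for (let letter of letters) {
      for (let pt of letter.controlPoints) {
        if (dist(mouseX, mouseY, pt.x, pt.y) < 10) {  // 10 px threshold
          activePoint = pt;
          letter.underControl = true;    // this letter's control points show up
        }
      }
    }
  }
  for (let letter of letters) {
    // connect the control points (plain lines here; patterns/Bezier in the real thing)
    for (let i = 0; i < letter.controlPoints.length - 1; i++) {
      let a = letter.controlPoints[i], b = letter.controlPoints[i + 1];
      line(a.x, a.y, b.x, b.y);
    }
    if (letter.underControl) {
      for (let pt of letter.controlPoints) rect(pt.x - 4, pt.y - 4, 8, 8);
    }
  }
}

function mouseDragged() {
  if (activePoint !== null) activePoint.set(mouseX, mouseY);  // drag the point
}
```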

  • Create the fill pattern (ellipses or other shapes) between these control points:

We focus on filling a pattern between two control points. We have a start point and an end point. By representing the two points as vectors and subtracting one from the other, we get a direction. We pass in a ‘step’ parameter to define how many repeated patterns we want between the two end points, add some small adjustments for the length and width of the pattern, and we have our fill-pattern function below:
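A simplified sketch of what such a function can look like (the name `fillPattern` and the ellipse pattern are illustrative, not the exact code from the repo):

```javascript
// Draw `step` small patterns (ellipses here) evenly spaced between
// two control points. `start` and `end` are p5.Vectors.
function fillPattern(start, end, step, patW, patH) {
  let dir = p5.Vector.sub(end, start);                // direction from start to end
  for (let i = 0; i <= step; i++) {
    let pos = p5.Vector.add(start, p5.Vector.mult(dir, i / step));
    ellipse(pos.x, pos.y, patW, patH);                // one repetition of the pattern
  }
}
```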

Once we have such a function, it’s fairly straightforward to apply it to N, Y, and H, since those are all straight lines. It’s a little trickier for U and S. p5’s bezierPoint() function, given the Bezier control points and how far along the curve you want to go, returns a point on the curve, so you can generate as many intermediate points between the two terminal points as you want. I make use of this to define some points along the curve, and then fill each pair of these intermediate points the same way as a straight line.
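A sketch of how the curve filling can work with p5’s bezierPoint(), building on the illustrative fillPattern above (`samples` controls how many intermediate points are generated):

```javascript
// Sample a cubic Bezier defined by control points p0..p3 (p5.Vectors),
// then fill between consecutive samples the same way as a straight line.
function fillBezier(p0, p1, p2, p3, samples, step, patW, patH) {
  let prev = p0.copy();
  for (let i = 1; i <= samples; i++) {
    let t = i / samples;
    let x = bezierPoint(p0.x, p1.x, p2.x, p3.x, t);
    let y = bezierPoint(p0.y, p1.y, p2.y, p3.y, t);
    let cur = createVector(x, y);
    fillPattern(prev, cur, step, patW, patH);   // treat each short segment as a line
    prev = cur;
  }
}
```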

Torch:

For the torches, I loaded a pre-processed image with a specific width and height. Dragging a torch around is similar to dragging the control points on the letters. When I implemented dragging, I thought it might also be nice to implement a ‘free’ operation that allows a stopped torch to move again. I first tried re-clicking on the same torch (so that the first click stops it and the second makes it move again), but it turned out there were mouse-event conflicts in that implementation. I gave up trying to define the right mouse events and went for an easier approach: the keyboard.

Similarly, hovering the mouse over a torch marks it as under control and waiting for the “d” key to be typed. The torches move in a flocking system, so their directions are not fixed and depend on their neighbors.
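Roughly, the hover/stop/free interaction looks like this (a simplified sketch; `torches`, `stopped`, `flock()`, and `display()` are illustrative names):

```javascript
function draw() {
  for (let torch of torches) {
    torch.hovered = dist(mouseX, mouseY, torch.x, torch.y) < torch.w / 2;
    if (!torch.stopped) torch.flock(torches);   // flocking only while free
    torch.display();
  }
}

function mousePressed() {
  for (let torch of torches) {
    if (torch.hovered) torch.stopped = true;    // clicking a torch stops it
  }
}

function keyTyped() {
  if (key === 'd') {
    for (let torch of torches) {
      if (torch.hovered) torch.stopped = false; // "d" frees the hovered torch
    }
  }
}
```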

An interesting piece that I like: I put 20 torches on the screen, divided into 5 groups, so if you find the right 4 torches of the same size and position them well, you can create an NYUSH logo.

Background:

For the background adjustment, I adopt the HSL color mode and let the user adjust the H (hue) value. The code for the background is as below:
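In outline, it looks something like this (a simplified sketch; how the user changes the hue, arrow keys here, and the saturation/lightness values are my own illustrative choices):

```javascript
let bgHue = 200;                       // current H value (placeholder start)

function setup() {
  createCanvas(800, 600);
  colorMode(HSL, 360, 100, 100);       // switch to HSL color mode
}

function draw() {
  background(bgHue, 60, 20);           // fixed S and L, user-controlled H
  // ... letters and torches drawn here ...
}

function keyPressed() {
  if (keyCode === LEFT_ARROW)  bgHue = (bgHue + 350) % 360; // hue - 10
  if (keyCode === RIGHT_ARROW) bgHue = (bgHue + 10)  % 360; // hue + 10
}
```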

As Moon suggested, it might be fun to eventually create a multi-player drawing panel with some server-side code that stores and pushes each user’s state. This is the future direction I’m heading towards. Let me know if you have other cool ideas. 😀

Network Everything Final Project – Intrudead – Ale, Olesia, Elaine

Our Original Idea

After learning that someone took equipment from our class cart without our instructor’s permission, we decided to prevent this from happening again. Our solution was to create an alarm system that would tell our professor when someone has taken something from our class equipment without his permission. We imagined we could do this by sending our professor an email with a picture of the person who took the equipment. In addition, we would discourage anyone from stealing from our class cart by uploading pictures of the people who have stolen from us to a public website. We dubbed this website the “Hall of Shame”.

 

Our Final Product

I think we accomplished everything we wanted: we now have a fully set-up system that detects intruders. Let me explain how our project works. Essentially, we built a light-sensor circuit in which the sensor normally sits in constant darkness. If an intruder opens the box or the cart, the light sensor is exposed to light, which triggers several things: a picture of the intruder is taken, sent to Scott’s email, and uploaded to a website called the “Hall of Shame” (publicly available here), which stores the photos of all intruders. We also set up an RFID system and assigned a particular RFID tag to Scott. If Scott is the one who opens the cart, there is no need to send an email or post the photo to the website, so Scott can swipe his own tag, which cancels the email and the post to the blog. However, he has to hold the tag for 10 seconds in order to cancel.

The source code is available at https://github.com/ElaineAng/Final-Project-Net. Email us if you have access issues.

Here are two short videos showing how we expect it to work:

Here is a screenshot of some of our “Intruders” appearing in the email attachment.

intruderPic

Technical Difficulties

First, the RFID tag would not work all the time. It took us many days to figure out why the RFID system stopped working even though none of the code had changed. Once we figured it out, we had to connect the RFID system to the main Processing code via Bluetooth. There was no major hindrance there; it was just a matter of time. However, we lost the initial tag that had been assigned to Scott, so we had to replace it with another one and go through the whole setup process again. Thanks to the Arduino Mega board, which provides multiple serial communication interfaces, we are able to receive data from the RFID reader and send data to the Processing sketch over Bluetooth using a single board.

 

Setting up the email system was also a challenge for us. At first, we tried using Temboo to send emails. Although we could successfully send emails with Temboo and Processing, we weren’t able to send picture attachments, as it asked us for a URL and couldn’t attach pictures stored locally. Our second attempt was to use Processing with the javax.mail library. However, this didn’t work out, as the library is deprecated and the example code dates from 2007. Finally, we used Node and Processing to solve the issue. We basically ran a local server that runs an email script every time someone accesses a specific page. Processing becomes the client by making a GET request to that page, which triggers the Nodemailer script to send an email with a picture attached to it.
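The Node side is essentially a tiny HTTP server plus Nodemailer; a simplified sketch is below (the SMTP account, addresses, port, and file path are placeholders, not our real configuration):

```javascript
const http = require('http');
const nodemailer = require('nodemailer');

// Placeholder credentials and addresses.
const transporter = nodemailer.createTransport({
  service: 'gmail',
  auth: { user: 'our.project@example.com', pass: 'app-password' },
});

http.createServer((req, res) => {
  if (req.url === '/send') {                      // Processing makes a GET request here
    transporter.sendMail({
      from: 'our.project@example.com',
      to: 'professor@example.com',
      subject: 'Intruder detected!',
      text: 'Someone opened the cart.',
      attachments: [{ path: 'intruder.jpg' }],    // locally stored picture
    }, (err) => res.end(err ? 'error' : 'sent'));
  } else {
    res.end('ok');
  }
}).listen(3000);
```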

 

Posting the image to a website after it is taken was a bit of a challenge as well, since we needed the Processing sketch to post the picture in real time, and it was a new experience for each of us. We did not find any suitable APIs or image-hosting sites that do this for us, so we decided to write our own script. We found two open-source PHP scripts that do similar things and put them on a web server, which is configured to be accessible via a domain name. The index.php script handles reading a directory and displaying the images, and the upload-n-save.php script handles receiving posted images and saving them locally on the server.

 

What we would want to change

The only change we would want to implement is to upgrade the camera (e.g., use an external webcam or a serial camera attached to a Raspberry Pi) to increase the quality of the photos, because sometimes the embedded webcam would only capture parts of a person’s body or the box. This would improve the content of the website as well as help Scott identify the intruders and punish them. I guess this is the only change that would make a real difference.

NOC – Week 11 – Flocking – Elaine

For this assignment, I created a scene where you see through the darkness via a circle of light…

Check out the demo here, and try dragging the ellipse:

https://elaineang.github.io/NatureOfCode/Assignment9/

There are always 50 triangles within the circle, forming a flocking system, and if your mouse cursor is inside the circle, the triangles are pulled towards the mouse.
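The attraction is just an extra steering force on top of the flocking behaviors; roughly like this (a simplified sketch; `boids`, `circleCenter`, `circleRadius`, and `applyForce()` are illustrative names):

```javascript
function draw() {
  // ... the usual flocking update (separation, alignment, cohesion) ...
  let mouse = createVector(mouseX, mouseY);
  if (p5.Vector.dist(mouse, circleCenter) < circleRadius) {
    for (let boid of boids) {
      let pull = p5.Vector.sub(mouse, boid.position); // towards the mouse
      pull.setMag(0.05);                              // small attraction strength
      boid.applyForce(pull);
    }
  }
}
```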

The source code is here:

https://github.com/ElaineAng/NatureOfCode/tree/master/Assignment9

Inspiring networked alarm clock – Elaine

http://www.dailymail.co.uk/sciencetech/article-3316904/Waking-SMACK-dawn-Alarm-clock-slaps-face-bed.html

This is a classic Arduino example, and it’s like a mini Rube Goldberg machine. According to the article, the inventor connected the clock, an Arduino microcontroller, and a relay, housed in a tin, so that when the alarm clock goes off, it triggers a motor. The motor, which is connected to an arm, sends it whirling around…

I like this simply because my phone’s alarm never wakes me up, and this project 1) made me laugh so hard and 2) offers a promising solution for waking me up.

NOC – Week 10 – Seaweed with flow – Elaine

Check out the Demo here:

https://elaineang.github.io/NatureOfCode/Assignment8/

Click on the screen to generate a seaweed. It will stop growing after a certain time period.

The source code is here:

https://github.com/ElaineAng/NatureOfCode/tree/master/Assignment8

For future improvement, I want to get the shape of the seaweed under better control…

NOC – Week 8 – Spring – Elaine

So for this week, based on the spring example we did on Friday, I modified how dragging works a bit, added a minimum and maximum amount of spring stretch, and created an actual zig-zag spring shape… Something like this (try dragging the ball :D):

Demo: https://elaineang.github.io/NatureOfCode/Assignment7/

The source code is here:

https://github.com/ElaineAng/NatureOfCode/tree/master/Assignment7

I think it would be nice if I could draw the spring shape using sine waves instead of lines, but I failed…

Main part of the zigzag algorithm (let me know if you have better ways of implementing this):
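A simplified sketch of the perpendicular-offset idea (names like `drawZigZag`, `segments`, and `zigWidth` are illustrative, not the exact code from the repo):

```javascript
// Draw a zig-zag spring between `anchor` and `bob` (both p5.Vectors) by
// offsetting every other intermediate point along a perpendicular vector.
function drawZigZag(anchor, bob, segments, zigWidth) {
  let dir = p5.Vector.sub(bob, anchor);
  let perp = createVector(-dir.y, dir.x).setMag(zigWidth); // perpendicular offset
  let prev = anchor.copy();
  for (let i = 1; i <= segments; i++) {
    let next = p5.Vector.add(anchor, p5.Vector.mult(dir, i / segments));
    if (i < segments) {
      next.add(i % 2 === 0 ? perp : p5.Vector.mult(perp, -1)); // alternate sides
    }
    line(prev.x, prev.y, next.x, next.y);
    prev = next;
  }
}
```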

On second thought, calculating the spring rotation angle and using vector rotate might be a better approach…

Network Infrastructure in City and on Boat – Elaine

I went to Hong Kong and Vietnam during spring break, and I noticed two types of network infrastructure. One supports the Internet connection for a city, usually via physical wires, cables, and wifi routers; the other supports the Internet connection on a boat sailing at sea, which is isolated from land and not connected to any physical cable.

I found that network infrastructure looks similar across different cities; for example, there probably aren’t many differences between the infrastructure in Hong Kong and that in Shanghai. The physical cables are somewhat hard to locate, partly because most of them are buried underground, run inside walls, or are covered by other materials. Those exposed outside are easy to confuse with other wiring: I spotted some in the hostel I stayed in, but I am not sure whether they are used to provide an internet connection. There are lots of public wifi signs in the subway stations, but the physical routers behind them are harder to spot.

In the hostel: an RFID reader embedded in the door:

hotel0

Ethernet Port on the wall:

hotel3

Wires:

wire0 hostel2 hostel1wire1

Public wifi sign:

freewifi3 freewifi2 freewifi

Antennas on boats:

ship0 ship1 ship3 ship2

The wifi provided on the boat is not very reliable. A little digging into the underlying technology gave me some insight into why: wifi on a boat normally relies on satellites, and the connection becomes unreliable when something blocks the path between the antenna and the satellite (mountains, buildings, etc.). Also, when the ship changes direction quickly, the signal can be temporarily lost.

 

 

NOC – Midterm – Fancy Waves – Elaine

Midterm Project demo:  elaineang.github.io/NatureOfCode/Midterm/index.html

Try pressing the up, down, left, and right keys to control the movement of the wave.

Press the space key to break the wave into particles, and press it again to put them back together.

Press “r”, “g”, or “b” to change the color of the wave.

Check debug mode, and increase the radius to see how the wave is constructed. The radius determines the size of each particle (and, with it, how many particles fit on the canvas).

The source code repository is here:

https://github.com/ElaineAng/NatureOfCode/tree/master/Midterm

Here is a video demonstration of what it looks like:

The idea of simulating a wave movement comes from an in-class demo of a sine wave that follows the movement of the mouse. Initially, I only had the basic sine-wave shape (see here), but after I accidentally changed the horizontal speed and the background refresh so that it left a trace, I found the result surprisingly charming. So I modified it a little and created an eye-tricking 3D wave. The wave is constructed from a line of particles, each of which moves only vertically and is bound by a vertical force (let’s denote it Fp) pulling it back to its equilibrium position. The initial position of each particle is determined by (x, sin(x)). When you press up or down, the amplitude of the wave is changed by changing the vertical velocity of each particle. When you press left or right, the horizontal speed is reversed.

The total number of particles on the canvas is determined by the radius of each particle. The x position of each particle is carefully calculated based on the number of particles so that, as the wave moves horizontally, it always fits the canvas. When the x position of a particle becomes greater than the width of the canvas, it wraps around using a modulo operation.
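In code, the per-particle update and the wrap are roughly like this (a simplified sketch; `particles`, `eqY`, `xSpeed`, and the spring constant `k` are illustrative names):

```javascript
function draw() {
  background(0, 20);                       // low-alpha refresh leaves a trace
  for (let p of particles) {
    let fp = (p.eqY - p.y) * k;            // Fp: pull back to the equilibrium y
    p.vy += fp;
    p.y  += p.vy;

    p.x += xSpeed;                         // left/right arrows flip xSpeed's sign
    p.x = ((p.x % width) + width) % width; // wrap around the canvas width

    ellipse(p.x, p.y, p.r * 2, p.r * 2);
  }
}
```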

An interesting piece is the transition from the wave mode to the bouncing-ball mode and back. In wave mode, when you press the space key, each particle suddenly loses Fp and its current speed. At the same time, each is given a random y-axis speed between -12 and 12 and starts falling from its current position with that initial speed.

I back up each particle’s position and x speed before the transition from the wave to the bouncing balls, so that when I transition from the bouncing balls back to the wave, I can use the lerp function to create a nice trace effect. When a particle moves close enough (< 0.01 in my case) to its backed-up position, it is reset directly to that position and starts the wave effect again.
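The transition back is roughly like this for each particle (a simplified sketch; `backupX`/`backupY`, `mode`, and the 0.1 lerp amount are illustrative):

```javascript
function transitionToWave(p) {
  // ease the particle towards its backed-up wave position
  p.x = lerp(p.x, p.backupX, 0.1);
  p.y = lerp(p.y, p.backupY, 0.1);

  // once close enough, snap to the backup position and resume the wave
  if (dist(p.x, p.y, p.backupX, p.backupY) < 0.01) {
    p.x = p.backupX;
    p.y = p.backupY;
    p.mode = 'wave';                       // placeholder flag for "back in wave mode"
  }
}
```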

The color (red, green, or blue) maps each particle’s initial position to an RGB value, creating a gradually changing color effect.
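For example, in the red mode the mapping is roughly (a sketch; `p.initX` and the fixed channel values are illustrative):

```javascript
// Map a particle's initial x position to the red channel ('r' mode).
function particleColor(p) {
  let r = map(p.initX, 0, width, 0, 255);  // p.initX: the particle's initial x
  return color(r, 50, 50);                 // red varies, other channels fixed
}
```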

In debug mode, when we increase the radius of each particle, we can spot that there are still some bugs with the position during the x-axis wrap from the canvas width back to 0: two balls partially overlap. If you observe carefully, this slightly affects the shape of the wave in normal (non-debug) mode. Unfortunately, I am still unable to fix this part. Working on it…

Some future improvements could include different amplitudes on different parts of the wave, making the amplitude change with mouse movement, adding sound, and other fancier effects… Let me know if you have more creative ideas. 🙂