This week I combined what we’ve learned – particle systems and springs – and created a Skittles fountain. The colorful particles in the in-class example really reminded me of some of the Skittles ads I’ve seen.
I defined an array containing the five colors of Skittles and used random() to assign a color to each newly created particle. I also set the ellipses to have the same width but different heights, which creates a feeling of 3D Skittles. The particle source is at the draggable end of the spring.
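The color-picking logic could look something like the sketch below (plain JavaScript for illustration; the palette values and names are my assumptions, and in p5.js the pick could simply be `random(SKITTLE_COLORS)`):

```javascript
// Five Skittles-like colors (hex values are approximations).
const SKITTLE_COLORS = ["#e32636", "#ff8c00", "#ffd700", "#3cb043", "#6a0dad"];

// Pick a random palette color for a newly spawned particle.
function randomSkittleColor() {
  const i = Math.floor(Math.random() * SKITTLE_COLORS.length);
  return SKITTLE_COLORS[i];
}

// Fixed width, random smaller height gives the squashed, 3D-candy look.
function randomCandySize(w) {
  return { w: w, h: w * (0.4 + Math.random() * 0.6) };
}
```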
Instead of a lifespan, disappearance depends on the position of each particle: particles that fall below a certain height disappear. I also drew a tray that moves along with the mouse. Particles with a positive velocity on the y-axis (i.e. falling particles, not ascending ones) disappear once they touch the tray.
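The two removal rules above could be condensed into a single check like this (a sketch; the function and field names are my own, not the actual code):

```javascript
// A particle disappears if it fell below a cutoff height, or if it is
// falling (vy > 0) and has reached the tray that follows the mouse.
function shouldDisappear(p, floorY, tray) {
  if (p.y > floorY) return true;
  const onTray = p.x > tray.x && p.x < tray.x + tray.w && p.y >= tray.y;
  return p.vy > 0 && onTray;
}
```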
I tried to create an environment in which particles merge into others when they meet, or otherwise split into two smaller particles. I found it much more complicated and difficult than I expected.
I divided the task into three parts: merging, dividing, and the combination of the two. Coding this involves a lot of array modification and some (very confusing) knowledge about call by reference versus call by value, which drove me mad over the past few days.
What I have completed so far is shown below:
I finished the merging part. The number printed in the console indicates how many particles are left on the screen.
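One pass of the merging rule could be sketched like this (my reconstruction, not the exact code; I'm assuming merged particles conserve area when combining radii):

```javascript
// When two particles overlap, replace them with one particle whose area
// is the sum of the two. Looping backwards keeps splice() safe: removing
// element i never shifts the indices we have yet to visit.
function mergeOnce(particles) {
  for (let i = particles.length - 1; i >= 0; i--) {
    for (let j = i - 1; j >= 0; j--) {
      const a = particles[i], b = particles[j];
      const d = Math.hypot(a.x - b.x, a.y - b.y);
      if (d < a.r + b.r) {
        b.r = Math.sqrt(a.r * a.r + b.r * b.r); // conserve total area
        particles.splice(i, 1);                 // remove the absorbed one
        break;
      }
    }
  }
  return particles.length; // what gets printed to the console
}
```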
Later I talked with Moon, and he showed me the game Osmos, which is similar to what I had done to some extent. So I made a simple version of Osmos. There is a bigger yellow particle in the center of the canvas, the main character controlled by the player. All the other particles are smaller and given an initial velocity. The player can click around the bigger particle to make it release very small particles, which drives it forward in the direction opposite the cursor.
I added a little friction by multiplying velocity by 0.99, but the smaller particles nearly stopped moving shortly after. So I wrote a function called resetVel() to check the smaller particles and reassign a velocity to each one whose speed becomes too low.
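The friction plus resetVel() idea could look roughly like this (the 0.99 factor is from the sketch; the speed thresholds are my guesses):

```javascript
// Damp velocity a little every frame.
function applyFriction(p) {
  p.vx *= 0.99;
  p.vy *= 0.99;
}

// If a small particle has slowed below a threshold, give it a fresh
// random velocity so the scene keeps moving.
function resetVel(p, minSpeed = 0.05, newSpeed = 1) {
  const speed = Math.hypot(p.vx, p.vy);
  if (speed < minSpeed) {
    const angle = Math.random() * Math.PI * 2;
    p.vx = Math.cos(angle) * newSpeed;
    p.vy = Math.sin(angle) * newSpeed;
  }
}
```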
In this sketch the main particle, though it shares the same kind of properties with the other small ones, is different and more significant than the others, so I separated its operations from the for loop. (Speaking of for loops, I learned from Moon that when using splice() it is better to loop in reverse so as not to cause an index-out-of-bounds error.)
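Moon's reverse-loop tip in miniature (names are mine): splice() shifts every later element left, so a forward loop would skip the element right after each removal; counting down keeps the remaining indices valid.

```javascript
// Remove all flagged particles safely by iterating from the end.
function removeDead(particles) {
  for (let i = particles.length - 1; i >= 0; i--) {
    if (particles[i].dead) particles.splice(i, 1);
  }
  return particles;
}
```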
Ripples – millis() to control the radius of the ripple; sin() to control the opacity of the lighter bluish-green ellipses to create the pattern
Fish food attraction – lerp() to pull the fish; p5.Vector rotate() to alter the direction of velocity
Pebbles – vectors & forces; lerp() to decelerate to the average velocity
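The ripple math from the list above could look like this (a sketch of the idea; the growth rate and pulse frequency are assumptions, and in p5.js the timestamps would come from millis()):

```javascript
// Radius grows linearly with the time elapsed since the ripple started.
function rippleRadius(startMillis, nowMillis, speed = 0.05) {
  return (nowMillis - startMillis) * speed;
}

// Opacity pulses with sin(), mapped from [-1, 1] to an alpha in [0, 255].
function rippleAlpha(nowMillis) {
  return (Math.sin(nowMillis * 0.005) + 1) * 0.5 * 255;
}
```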
I implemented the three features separately, so it also took some time and effort to combine them into one sketch. Making the logic flow among the different parts clear is important and a little tricky. Apart from that, I used mouseClicked() in all three files, so I had to differentiate them and assign each a unique trigger. At first I wanted to trigger fish food with a left click and pebbles with a right click using mouseButton and an if statement, but it turned out that the right click wouldn’t work unless I replaced function mouseClicked() with if (mouseIsPressed). However, the latter keeps triggering ripples and executing the following code until the mouse button is released. So I added a keyPressed() function to switch between modes and used only the left click. (You can see the detailed code attached.)
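A bare-bones sketch of that mode-switching scheme (the key choices and function names here are my own, not the actual code): keyPressed() flips the mode, and a single left click triggers whichever mode is active.

```javascript
let mode = "fishfood";

// Called from keyPressed() in the p5.js sketch.
function handleKey(key) {
  if (key === "f") mode = "fishfood";
  else if (key === "p") mode = "pebble";
  return mode;
}

// Called from mouseClicked(); in the real sketch this body would call
// the fish-food or pebble routine instead of returning a label.
function handleClick() {
  return mode === "fishfood" ? "dropFood" : "dropPebble";
}
```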
Further improvements may include interactions among the fish and creating patterns or shapes with the fish school. I think it would be interesting if some weak forces existed between fish. In addition, applying different alpha values to the fish could help create a more realistic layered effect and expand the fishpond from 2D to 3D.
This week Kefan and I made a short pixilation animation. We borrowed the idea of flying on a broom and added a magic rug in addition to the magic broom. Although we turned on continuous shooting mode, it was still very difficult to catch the highest moments. Also, as I kept moving down the hallway, the camera needed to be moved forward constantly, so the framing and lighting changed a lot from photo to photo, which is a very tricky problem to handle in postproduction. We left out many photos that still couldn’t fit into our animation well after some modification, though some of them are really interesting to watch 🙁 I used FCP to convert the photos into a video at a set frame rate, and for this I think FCP wins out over Photoshop and Premiere.
My idea changed several times during the first week of final-performance preparation. I wanted to create a ukulele on a rug decorated as a ukulele fingerboard, with sensors under the rug, so that one could trigger sounds by stepping on the sensors as if really playing a ukulele. Antonius then suggested I use a Kinect for this project. But later in class I learned that the Kinect is difficult to use, and I couldn’t really count on its accuracy in detecting my position in a grid. So I changed my idea and decided to create an instrument based on interactions among people: I stand in the center and touch the performers surrounding me to generate different sounds. After some consideration I finalized my idea: a skin-to-skin choir.
I implemented my idea with a Makey Makey. The Makey Makey board is connected to my laptop, and wires go from the Makey Makey pins to every performer. I, as the conductor of the choir, am connected to GND. I have to say that the Makey Makey is very beginner-friendly and easy to get going.
(Playing around with the Makey Makey: a very simple graphite piano)
Instead of Processing, I used the keyboard piano in GarageBand and chose Chamber Choir as the timbre.
I also redid the key mapping of the Makey Makey board to adapt it to the keyboard piano in GarageBand.
I had six performers, and six long wires (about 3 meters each) were really hard to organize. To make my project less messy, I used two strands of three twisted wires each. I tried to tape the wire to my wrist, but found that it wasn’t very flexible and constrained my movement, so I bent the wire into a ring, which seemed to work better.
I feel that rehearsals were really important for my final performance and helped a lot in improving my instrument.
During the first few rehearsals I figured out how the wires should be set up on the ground.
At the same time some bugs popped up. For example, Miki, one of my performers, could trigger the sound herself without touching me. We fixed this problem temporarily with a plastic box.
Later we found that it sometimes happened to other performers as well. To solve it completely, I untwisted the wires further and organized them more neatly. After several trials, the problem was solved.
To ensure the circuit wouldn’t go wrong or get messy, I added a breadboard and put it, together with the Makey Makey, into a small box.
This week we were required to make an instrument that works with sampled sounds or live sound.
I downloaded several sound clips of typical construction tools, including hammers, shovels and pipes, from 100audio.com. Then, in Audacity, I cut some of them into very short clips that contain only one beat. For the background music I chose the sound of rubbing sandpaper; I cut out part of the original clip and made it into a 4-beat loop so it is easier to keep playing in the background.
In Processing, I created a simple sketch that plays the sampled sounds and shows visuals for them when certain keys are pressed. I planned to trigger sounds with a tilt switch, so I added an Arduino part to my project and used serial communication to connect the Arduino and Processing. I had kind of forgotten how to write Arduino code, so I referred to the “Button” example.
In addition, I wanted live input as well, so I referred to the Processing vocoder example. This synthesized sound effect worked well when I tested it with headphones as the output; however, it was totally messed up during the actual performance. The sound from the speaker fed back into the mic and ruined everything, so I had to delete that block of code in order to continue my performance with the sampled sounds.
As for the visuals, I originally made them appear only while the corresponding key was pressed. After my performance it was suggested that I let the visuals display longer so the audience could see them more clearly, so I modified my code a little using the millis() function and made each visual appear for 500 ms.
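The 500 ms rule boils down to a timestamp comparison like this (a sketch; in the actual Processing code the timestamps would come from millis()):

```javascript
const VISUAL_DURATION = 500; // ms each visual stays on screen

// Record triggeredAt when the key fires; keep drawing the visual
// while less than VISUAL_DURATION has elapsed.
function shouldShowVisual(triggeredAt, now) {
  return now - triggeredAt < VISUAL_DURATION;
}
```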
I built a very simple tilt-switch circuit and taped the tilt switch to a stick, so the hammer sound is triggered every time I strike it against the table.