4) Barco is a 2D hand-drawn animation that also blends into a 3D world. It is a story about discovery, about encountering something that you have never encountered before.
5) Sound design: Ben Lei
7) My story for the animation was first inspired by a book I was reading called Sapiens, which is about human evolution. I was fascinated thinking about the moments when humans learned, or realized, that doing things differently could bring new discoveries. For example, the first person who ever thought of floating something on water, and therefore being able to travel on water. I was planning to create a short about that moment as I imagined it would have been for the first Homo sapiens. My plan ended up shifting in the end: I took it more bare-bones and made it abstract, a discovery between two worlds, with the 2D and 3D mediums representing the discovery between man and water.
8) The technique used to make this animation was hand-drawn frame-by-frame animation with pencil, then scanned into After Effects. An eraser might also have been used, lol. After Effects was a huge player, and I used Trapcode Form to create the 3D assets and particles.
9) I learned so much, but the best thing was learning how to take something that exists in our world, like a pencil drawing, and give it life. I had only done 3D animations before, never more "artistic" explorations like pencil animation in 2D.
Awake Project Documentation
The project "Awake" is an experimental virtual reality experience that uses the HTC Vive headset and the body of the user as a controller. This was achieved using biological sensors and the Arduino, an open-source hardware platform. The goal was to create an experience where mind and body were both attended to in the virtual world; guided meditations and biological visualizations were the two instruments used to achieve this goal. With the advancement of technology, everyone talks about how it takes us away from being closer to others and from experiencing life. This has been a huge topic of controversy over the past ten years, and with this project I want to argue that technology can actually be used to enhance our experience.
The idea was to create a virtual reality experience in which users can use their bodies to affect the objects in their surroundings, visualizing their levels of calmness or concentration while being guided through meditations and mindfulness exercises. I was inspired by my journey through my four years in college, where I dedicated much of my time to technology but at the same time stayed very close to the spiritual side of things, taking some experimental mind and body classes in New York City. This was a way of bringing those two things together, allowing technology to ultimately help my spiritual guidance. I predicted that users of this experience would be able, even if just for a while, to take their minds off their daily loops.
At the beginning, when I was researching the hardware and software I needed for this project, I ran into some research by the MIT Media Lab in which they were using an EEG scanner, a sensor that detects electrical patterns in your brain and transforms them into data. When I first saw this I flipped out and thought what the majority of people would think: holy crap, I can use my brain to control things in a computer. But as is so often the case, things that seem too good to be true usually are. When I did some further research into Emotiv, the company that produces these EEG scanners, I found out that things were not as they seemed. The computers were not "reading your mind"; they were trying to identify similar patterns of electrical impulses when you thought about something specific. For example, you could raise a concentration parameter by thinking about concentration without actually concentrating. So that was the catch: it was not really reading whether I was concentrated, just whether I was thinking about it. My initial dreams of recording levels of concentration and calmness were destroyed. The idea had been to place the subject in a room in the middle of the galaxy, where their levels of concentration would let them strip away all physical things in the VR world and start floating through the galaxy.
What actually went down
So after my dreams were destroyed by what I now consider "fake" brain scanners, I started thinking about replacement ideas that were closely linked to concentration, and then I thought: HEARTBEAT! I started investigating the possibilities of using your heartbeat to modify the virtual world inside of VR. I finally found a sensor that could track heartbeats and was compatible with the Arduino Uno, which was perfect. The plan was to use the users' heart rate to affect certain objects in the virtual world, but I found that piping information from the Arduino into Unity was not as simple as I thought, so the project transformed into more of a heartbeat visualizer inside of VR. The rest of the experience was enhanced by guided meditations and virtual interactions in the space room.
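The core of a heartbeat visualizer like this is turning raw beat timestamps from the sensor into a BPM value that can drive visuals. As a minimal sketch (the exact sensor output and units are assumptions, not details from the project), BPM can be estimated by averaging the intervals between beat timestamps:

```python
from typing import List

def bpm_from_beats(beat_times_ms: List[int]) -> float:
    """Estimate beats per minute from a list of beat timestamps
    (in milliseconds) by averaging the inter-beat intervals."""
    if len(beat_times_ms) < 2:
        return 0.0
    intervals = [b - a for a, b in zip(beat_times_ms, beat_times_ms[1:])]
    avg_interval_ms = sum(intervals) / len(intervals)
    return 60000.0 / avg_interval_ms

# Beats exactly one second apart correspond to 60 BPM.
print(bpm_from_beats([0, 1000, 2000, 3000]))  # 60.0
```

Inside the VR scene, a value like this could then be mapped onto the scale or color of an object so that it pulses with the user's heart.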
For my photogrammetry project I decided to team up with Marjorie so we could help each other try to reproduce our own heads! It was much more difficult than we thought it would be, because with photogrammetry the object has to stay in the exact same place for every picture in order for the software to accurately read whatever it is you are scanning. When it came to our heads, it was very difficult to stay in a fixed position long enough to get a full set of pictures around the whole surface of the head. Therefore we did not get the most accurate readings, and our heads look pretty crazy! I tried to compensate for the areas that were full of holes with another model of a brain inside my head. This skill is AWESOME, and I definitely think it's an amazingly powerful tool for anyone who uses 3D in any type of work. The only setback is the price of the software and the amount of hardware it requires to get a design to render; apart from that, amazing tech!
My Blippar intervention was purely for fun; there was no big intention behind it, but I learned a lot doing it, especially how easy it is to do something like this and how effective it can be at really impressing others. For various uses, from art to advertising, it is such an amazing tool to have in your arsenal, and definitely one I will be using in the coming years of my life. Awesome app!! Here are some pics of the animation I made and put on Blippar.
For our pixilation, we decided to incorporate our strengths, 3D modeling and 3D animation, into the assignment, integrating a pixilated figure with a created 3D world. We played around with different 3D models, abstract shapes, and environments, and settled on portraying meditation, depicting the space of the mind composited behind the meditator.
We used the green screen in the Emerging Media Lab to capture stop-motion images of Marjorie. To achieve an illusion of levitation, we covered a tall chair with the green screen fabric and chroma keyed out the green in After Effects. The model of the two rings was created in Blender, then animated in After Effects and added to the scene, effectively compositing the 3D environment with the pixilated live action.
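We did the keying in After Effects, but the basic idea behind a chroma key is simple: a pixel is treated as background if its green channel clearly dominates red and blue. A minimal sketch of that per-pixel test (the threshold value is an assumption, not a setting we used):

```python
def is_green_screen(pixel, dominance=1.3):
    """Return True if the pixel's green channel dominates its red and
    blue channels, the basic test behind a simple chroma key.
    pixel is an (r, g, b) tuple with 0-255 channel values."""
    r, g, b = pixel
    return g > dominance * max(r, b, 1)

print(is_green_screen((30, 200, 40)))    # True: bright green backdrop
print(is_green_screen((180, 170, 160)))  # False: skin-like tone
```

Real keyers add soft edges and spill suppression on top of this, which is why the dedicated After Effects tools give much cleaner results.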
This animation was really fun to create because I merged what I have been learning in my traditional Chinese painting class with my animation course. I decided to use the classic character "Ren," which means "person" in Chinese, and have the character do the walk cycle because of its anthropomorphic form.
As for my research, I have been sincerely freaking out because I have not written anything in around two years. Despite this, I have finally organized myself and talked to Sakar, and I am much calmer about it. I plan to research the effect of sound, separating it into three different categories (noise, low frequencies, and high frequencies) and examining the effect that each type of sound has on the physiological and psychological aspects of the self. I have found a good pool of previous investigations on this topic, including experiments performed on animals using sound. I think this is important because we are often not very conscious of the full effect that sound has on our daily life and our well-being.
As for my interactive project, I am much calmer than I am about my research paper, although I find that funding is keeping me from advancing. It's pretty ridiculous that they are waiting until two weeks before the project is due to approve the budgets. Apart from that, I have already moved forward with buying my sensors. These sensors are heart rate monitors that are plug and play with Arduino and Processing, and I have gotten them working and displaying actual data in the Arduino serial monitor. I am now working on the communication between Arduino and Unity, the game engine I am using to set up my 3D scene; this will allow me to affect certain scripts in Unity in order to create the interactivity between the body and the tech. I am waiting for the capstone funding to buy the asset packages I need to construct my 3D scene inside Unity and the custom skybox I need to create the void.
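Getting the Arduino data into Unity mostly comes down to reading lines from the serial port and parsing them into numbers that scripts can use (in Unity itself this would be C# with System.IO.Ports.SerialPort). As a language-agnostic sketch of just the parsing step, assuming the Arduino sketch prints lines in a format like "BPM:72" (my assumption here, not the actual sensor protocol):

```python
from typing import Optional

def parse_bpm_line(line: str) -> Optional[int]:
    """Parse one serial line of the form 'BPM:72' into an integer,
    returning None for malformed or unrelated lines so that serial
    noise does not crash the scene."""
    line = line.strip()
    if not line.startswith("BPM:"):
        return None
    try:
        return int(line.split(":", 1)[1])
    except ValueError:
        return None

print(parse_bpm_line("BPM:72"))   # 72
print(parse_bpm_line("noise"))    # None
```

Tolerating garbage lines matters in practice, because serial reads often deliver partial or corrupted lines right after the port opens.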