Partner: Maudie Carey
For our final project, Maudie and I decided to do a continuation of our midterm. After receiving a lot of helpful feedback from user testing and learning new material in class, we realized we could really improve it.
What Our Midterm Was and What We Wanted to Change
Our midterm was a flower meant for people who wanted to take care of a plant but didn’t want to face the consequences of possibly killing a real one. In the midterm, the plant rotated based on the light values it was receiving while an animation of the sun played in the background. Additionally, the wilting flower rose when you “watered” it, accompanied by an animation of rain driven by the moisture values.
A lot of the feedback we heard about the project was that people couldn’t really see or focus on the animations because they were so engrossed in the physical reactions of the plant, and that it would be nice if the animations were more realistic. With this feedback in mind, Maudie and I decided to make the following changes for the final:
- change our animations to make them more realistic and work better
- take out the physical changes of the flower so users could focus more on the animations
- somehow add wind/sound into the project
- make movable curtains
- project our animations
Sadly, we were unable to add the last two changes to our project, but we did achieve the first three, which are discussed in the rest of this documentation.
Arduino and Moisture Sensor
- moisture sensor
- jumper wires
Because I handled most of the moisture sensor and rain aspects of the midterm, I mostly reused code from it. However, because I no longer needed a servo, I removed that code. In terms of coding, the work was very simple.
But as I worked with the moisture sensor, I was reminded of how fickle and annoying its values were. One of the main issues I ran into in the midterm was how inconsistent the moisture readings were: no one could tell how much water a user would spray on the plant, or how well I had dried the moisture sensor between uses. I really wanted to fix this problem for the final, and Maudie helped come up with the solution, which I’ll discuss later in the documentation.
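Our actual fix is covered later in the documentation, but as context for anyone hitting the same problem: one common way to tame jumpy analog readings like these is to average the last few samples and clamp them to the sensor’s range. Here is a rough sketch of that idea in plain Java; the 0–1023 range matches Arduino’s analogRead(), and the window size is an arbitrary stand-in.

```java
import java.util.ArrayDeque;

// Sketch of one common fix for jumpy analog readings: clamp each sample
// to the sensor's range, then report the average of a recent window.
public class MoistureSmoother {
    private final ArrayDeque<Integer> window = new ArrayDeque<>();
    private final int size;

    public MoistureSmoother(int size) { this.size = size; }

    // Feed in a raw reading, get back the average of the recent window.
    public int smooth(int raw) {
        int clamped = Math.max(0, Math.min(1023, raw));
        window.addLast(clamped);
        if (window.size() > size) window.removeFirst();
        int sum = 0;
        for (int v : window) sum += v;
        return sum / window.size();
    }

    public static void main(String[] args) {
        MoistureSmoother s = new MoistureSmoother(3);
        System.out.println(s.smooth(400)); // 400
        System.out.println(s.smooth(700)); // (400 + 700) / 2 = 550
        System.out.println(s.smooth(100)); // (400 + 700 + 100) / 3 = 400
    }
}
```

On an Arduino the same averaging can be done directly in loop() before sending the value over serial, so Processing only ever sees the smoothed number.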
For the work with Processing, my job was divided into two parts: fixing the water animation and creating an animation that tied together sound and wind.
Originally, Maudie and I thought we could incorporate wind into the project by having an animation of a flower sway when the user blew into the microphone. After discussing the idea, we ran into the question of how users would intuitively know to blow into the microphone. So we changed our idea to blowing away a pesky bug. After researching, I found that whiteflies are harmful to plants because they suck sap from the plant and make it sick. So I decided to use a picture of whiteflies that would be blown away but would continuously come back.
I started working on the code for making the flies move. What I had in mind was that the flies would come in from the edges of the screen, flying toward the center where the flower would be. Initially I was using an array to make multiple flies, but I didn’t know how to position and move each individual fly in a specific way. During recitation, I asked Tristan for help, and we had to use a lot of trigonometry to figure out the formula that makes the flies face the center.
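I won’t pretend to remember the exact formula, but the heart of it is atan2, which gives the angle from a fly’s position to any target point; that angle can then rotate the fly image in Processing. A stripped-down version in plain Java, where the screen size and positions are made-up values:

```java
// Sketch of the "face the center" math: atan2 gives the angle from a
// fly's position to a target point. In screen coordinates y grows
// downward, so an angle of PI/2 means "facing straight down."
public class FaceCenter {
    // Angle in radians from (x, y) toward (targetX, targetY).
    static double headingToward(double x, double y,
                                double targetX, double targetY) {
        return Math.atan2(targetY - y, targetX - x);
    }

    public static void main(String[] args) {
        double cx = 320, cy = 240; // hypothetical screen center
        // A fly entering from the left edge, level with the center,
        // faces straight right (angle 0).
        System.out.println(headingToward(0, 240, cx, cy));  // 0.0
        // A fly directly above the center faces straight down (~PI/2).
        System.out.println(headingToward(320, 0, cx, cy));
    }
}
```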
After getting the flies to face the direction I wanted, I needed help getting each fly to move toward the center. I set up an appointment with Antonius, and after I explained what I was doing, he told me I was going about the idea in a much more complicated way than necessary. He suggested that I use objects and classes in my code, and he worked through an example with me so I could understand. After the appointment, I looked at the code we wrote together and adapted it to fit what I needed with sound and flies. Although the animation was different from what I had in mind, it was much truer to how flies actually behave (which is super gross). Below is a video of getting the flies to work.
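I can’t reproduce Antonius’s example exactly, but the class-based version I ended up with had roughly this shape: each fly is an object that steps toward the flower every frame and gets pushed back toward the edge when the microphone level crosses a threshold. A simplified sketch in plain Java; Processing handles the actual drawing and microphone input, and all the numbers here are stand-ins:

```java
// Rough shape of the class-based fly logic: each fly steps toward a
// target, and a loud-enough sound pushes it back the way it came.
public class Fly {
    double x, y;
    final double targetX, targetY;
    final double speed;

    Fly(double x, double y, double targetX, double targetY, double speed) {
        this.x = x; this.y = y;
        this.targetX = targetX; this.targetY = targetY;
        this.speed = speed;
    }

    // Called once per frame: drift toward the flower, or flee if the
    // microphone level is above the (made-up) blow threshold.
    void update(double micLevel, double blowThreshold) {
        double angle = Math.atan2(targetY - y, targetX - x);
        if (micLevel > blowThreshold) {
            // Blown away: step away from the flower, faster.
            x -= Math.cos(angle) * speed * 4;
            y -= Math.sin(angle) * speed * 4;
        } else {
            x += Math.cos(angle) * speed;
            y += Math.sin(angle) * speed;
        }
    }

    public static void main(String[] args) {
        Fly fly = new Fly(0, 240, 320, 240, 2);
        fly.update(0.0, 0.5); // quiet: drifts toward the flower
        System.out.println(fly.x); // 2.0
        fly.update(0.9, 0.5); // blowing: pushed back toward the edge
        System.out.println(fly.x); // -6.0
    }
}
```

The nice part of the object-based approach is that an array of Fly objects just calls update() on each one per frame, instead of juggling parallel arrays of positions and angles.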
Now that I had finished the code for the flies, I sent it to Maudie, because the flies had to be placed on her screen since her screen was the sky.
The second part of my work was fixing the animation for the moisture sensor. Originally, I had an overhead view of grass with raindrops falling based on the moisture sensor values. Although I thought this animation was very pretty, I decided to scrap it altogether and replace it with one that looked like a cross-section of the flower pot, to give it a much more realistic and immersive look. The coding for this was fairly simple since it mostly involved images. I took pictures of the pot and of the IMA desks so that what was on the screen reflected real life. I also photoshopped some roots together and brought them into Processing. The only element that was not an image was the soil in the pot, which got darker the wetter the roots got. I also added a photo of rain that appears over the roots when the user starts watering the plant, giving an immediate indication that what they are doing is working.
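The soil effect boils down to mapping the moisture reading onto a blend between a dry color and a wet color; in Processing, map() and lerpColor() do exactly this in two lines. A plain-Java sketch of the same idea, where the 0–1023 range matches analogRead() and the RGB values are placeholders rather than our actual colors:

```java
// Sketch of the soil-darkening logic: map a moisture reading onto a
// blend between a dry soil color and a wet (darker) soil color.
public class SoilColor {
    // Linear interpolation between a and b by t in [0, 1].
    static int lerp(int a, int b, double t) {
        return (int) Math.round(a + (b - a) * t);
    }

    // Returns {r, g, b} for the soil given a raw moisture reading.
    static int[] soilColor(int moisture) {
        double t = Math.max(0, Math.min(1, moisture / 1023.0));
        int[] dry = {181, 137, 91}; // light brown (placeholder)
        int[] wet = {74, 48, 28};   // dark brown (placeholder)
        return new int[] {
            lerp(dry[0], wet[0], t),
            lerp(dry[1], wet[1], t),
            lerp(dry[2], wet[2], t),
        };
    }

    public static void main(String[] args) {
        int[] dryOut = soilColor(0);    // fully dry: {181, 137, 91}
        int[] wetOut = soilColor(1023); // fully wet: {74, 48, 28}
        System.out.println(dryOut[0] + " " + wetOut[0]); // 181 74
    }
}
```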
At this point, Maudie and I had finished the bulk of what we needed to do, so we got some people to test our project. One of our testers was, of course, Antonius, and he told us it was really nice that users could control the environment. One of his suggestions, though, was to have the environment change based on the moisture sensor values on Maudie’s screen as well, not just mine. So, following his advice, we got another moisture sensor for Maudie’s side, and she wrote code to make it rain on her screen when users watered the plant.