
BirthDeath

Partners: Maxwell, Vivian

Project Abstract

A live audiovisual performance in collaboration with Maxwell and Vivian. This performance explores the limits of the human body.

Project Description

BirthDeath is about the limits of the human body and about pushing those limits further and further until the body collapses. Maxwell and I created a real-time audiovisual performance around this concept.

With this performance, our purpose is simply to gradually provoke an overwhelming feeling in the audience, and even to raise their heart rate, as the overall pace and feel of the piece speed up. The performance is not necessarily meant to make you reflect on anything; it is simply a sensory experience.

Our inspiration for this piece came from our interest in working with dance performance and the body. Maxwell and I have both danced, and we really wanted to explore incorporating the body into our audiovisual performance. We originally wanted to use a heartbeat monitor and an accelerometer to modify values in our Max patch, but since this was too complicated for the little time we had, we decided not to use them. However, we kept the dance performance.

Perspective & Context

Our performance fits into the historical context of visual music and abstract film in the sense that we really wanted to create a correlation between the sound and the visuals, even though not every audio parameter was driving the visuals. Our communication during the performance was essential.

I think that nowadays, because of our constant need to maximize time and productivity, we push our bodies through the last bits of energy we have every day. We forget that our bodies have a limit and act as though we are invincible.

Development & Technical Implementation

From the beginning, Maxwell and I had a clear idea that they would work on the audio and I would work on the visuals. Maxwell created the audio patch alone, but in my case, I found that building my patch while Maxwell played around with the audio served as a guide for creating the visuals.

Part of the inspiration for this piece came from Maxwell’s and my interest in using sensors in the performance. We wanted to bring an Arduino into the piece so that we could use sensors such as a heartbeat monitor and an accelerometer. We researched different types of heartbeat monitors, but the only ones available to us were not reliable at all and were quite complicated to use. We considered buying a nicer sensor, but we still did not know how to use it and did not have enough time to make it work, so we decided to keep only the accelerometer. With Eric’s help, we got a patch that could send Arduino data to Max, so getting the accelerometer values was not too hard. However, attaching the accelerometer to Maxwell with a Bluetooth Arduino did not seem very reliable either, so we ultimately decided to drop the idea of using sensors altogether.
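
For reference, the kind of data flow we had in mind is sketched below in Processing (the tool I know best), even though Eric’s patch did the equivalent parsing inside Max and we never ended up using the sensor in the piece. The port index, baud rate, and the "x,y,z" message format are all assumptions for illustration.

```processing
// Illustration only: reading three comma-separated accelerometer values
// sent by an Arduino over serial and mapping the amount of motion to a shape.
import processing.serial.*;

Serial arduino;
float ax, ay, az;  // latest accelerometer readings

void setup() {
  size(400, 400);
  // open the first serial port in the list at 9600 baud (both are assumptions)
  arduino = new Serial(this, Serial.list()[0], 9600);
  arduino.bufferUntil('\n');  // call serialEvent() once per full line
}

void serialEvent(Serial s) {
  String line = s.readStringUntil('\n');
  if (line == null) return;
  String[] parts = split(trim(line), ',');  // expecting "x,y,z" from the Arduino
  if (parts.length == 3) {
    ax = float(parts[0]);
    ay = float(parts[1]);
    az = float(parts[2]);
  }
}

void draw() {
  background(0);
  // map the overall amount of motion to the size of a circle,
  // the way we imagined mapping it to values in the visuals
  float motion = dist(0, 0, 0, ax, ay, az);
  ellipse(width / 2, height / 2, motion, motion);
}
```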

Instead, since I still wanted to show a correlation between the heartbeat and the visuals, I made the amplitude of Maxwell’s audio determine the size of a 3D model of a heart and the redness of the screen at the beginning of the piece. This is essentially what I wanted to use the sensors for anyway, and it was definitely a much better and faster way of going about it.
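
Here is a rough sketch of that mapping, translated into Processing since the actual version lives inside my Max patch: the amplitude of the incoming audio drives both the redness of the background and the size of a circle standing in for the heart. The input channel and value ranges are assumptions, and the piece itself used the 3D heart model rather than a circle.

```processing
// Minimal sketch of the amplitude-to-size mapping, using the default audio input.
import processing.sound.*;

AudioIn in;
Amplitude amp;

void setup() {
  size(640, 480);
  noStroke();
  in = new AudioIn(this, 0);  // channel 0 of the default audio input (assumption)
  in.start();
  amp = new Amplitude(this);
  amp.input(in);              // analyze the live input
}

void draw() {
  float level = amp.analyze();                  // roughly 0.0 (silence) to 1.0 (loud)
  background(map(level, 0, 1, 20, 255), 0, 0);  // the louder, the redder the screen
  float size = map(level, 0, 1, 50, height);    // the "heart" grows with the amplitude
  fill(255);
  ellipse(width / 2, height / 2, size, size);
}
```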

As for the rest of the visual components, I originally meant to generate all of the visuals in Max. However, since I do not really know how to create graphics in Max, I decided to use screen recordings of sketches that I had previously created in Processing. In the patch, I switched between four videos: a red background, the 3D model of the heart, and two recordings of my Processing sketches. I applied effects such as rotation, zoom, scramble, and a multiplier. When it came to modifying the visuals live, the MidiMix was fundamental to the success of the piece; I cannot imagine getting the same results without it. It made all the values much easier to access and alter.
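
The sketch below is a rough Processing translation of how that part of the patch behaves, since the real version is a Max/Jitter patch rather than code: four video sources, knob-driven zoom and rotation, and a control that switches which video is showing. The file names, MIDI device index, and CC numbers are placeholders for illustration.

```processing
// Minimal sketch: switch between four clips and tweak zoom/rotation from MIDI knobs.
import processing.video.*;
import themidibus.*;

Movie[] clips = new Movie[4];
int current = 0;            // which of the four videos is showing
float zoom = 1, angle = 0;
MidiBus midi;

void setup() {
  size(1280, 720);
  String[] files = {"red.mp4", "heart.mp4", "sketch1.mp4", "sketch2.mp4"};  // placeholders
  for (int i = 0; i < 4; i++) {
    clips[i] = new Movie(this, files[i]);
    clips[i].loop();
  }
  midi = new MidiBus(this, 0, -1);  // MIDI input 0 (assumed to be the MidiMix), no output
}

void movieEvent(Movie m) {
  m.read();
}

// Knob movements arrive as control changes; map a few of them to parameters.
void controllerChange(int channel, int number, int value) {
  if (number == 16) zoom = map(value, 0, 127, 0.5, 3);           // assumed CC numbers
  if (number == 17) angle = map(value, 0, 127, 0, TWO_PI);
  if (number == 18) current = int(map(value, 0, 127, 0, 3.99));  // crude clip switch
}

void draw() {
  background(0);
  translate(width / 2, height / 2);
  rotate(angle);
  scale(zoom);
  imageMode(CENTER);
  image(clips[current], 0, 0);
}
```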

Overall, we had two different patches, one for the audio and one for the visuals. This meant we used two different laptops in the performance, and they interacted through our own improvisation and through the amplitude of the sound.

Performance

The performance was the first time we all ran through the piece. It went far better than we expected. I was terrified because I was not sure it was going to go well, and I did not know when the performance would end, so much of it was improvised on the spot, trying to make the visuals fit the sound and the dance.

In terms of what could have gone better, Maxwell kept walking on and off the stage to work on the audio, which we realized was not a great idea because it did not add anything to the piece and was in fact a bit distracting. So we decided to cut the dance from the performance and focus only on the audio and visuals.

Even though Maxwell and I made the biggest contributions to the project, Vivian was very helpful during the performance: if we wanted to keep the dance, someone had to control the sound, and Vivian took on that role.

Maxwell and I also had the opportunity to perform in Miki’s show at Extra Time Cafe & Lounge. Here is a picture of us during the performance.

Conclusion

Overall, I am very happy with how this project turned out. At first it seemed a bit chaotic because we did not really know which direction to go in, but we ended up figuring it out. Working with Maxwell was great; they did an amazing job with the sound, which really helped me develop my part of the project. And Vivian was very helpful during the performance because she was able to control the audio while Maxwell was dancing; I would not have been able to control the sound and the visuals at the same time. I really enjoyed this project and hope to create more live audiovisual performances in the future.

Being able to perform at Extra Time Cafe & Lounge was an amazing experience.
