Kinetic Interfaces – Final Project (Francie)

In my final project, I continued exploring the tile game for which I developed a basic prototype during the midterm (link to my midterm project documentation). Basically, the real-time image captured by the webcam is divided evenly into 16 tiles, with one of them removed to leave an empty space. Players can slide a tile that sits next to the empty space into it, swapping the tile and the empty slot. The goal of the game is to restore the scrambled tiles back into the original image.
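
To give a rough idea of the setup, here is a minimal Processing sketch of the drawing logic, not the exact code from my project (tileOrder and emptyIndex are illustrative names): the webcam frame is cut into a 4 × 4 grid, and each slot on screen shows whichever piece of the image is currently assigned to it, with one piece left blank.

```java
import processing.video.*;

Capture cam;
int cols = 4, rows = 4;
int[] tileOrder = new int[cols * rows];   // tileOrder[slot] = which piece of the image sits in that slot
int emptyIndex = cols * rows - 1;         // the last piece is removed to make the empty space

void setup() {
  size(640, 480);
  cam = new Capture(this, width, height);
  cam.start();
  for (int i = 0; i < tileOrder.length; i++) tileOrder[i] = i;
}

void draw() {
  if (cam.available()) cam.read();
  background(0);                           // the empty slot stays black
  int tw = width / cols, th = height / rows;
  for (int slot = 0; slot < tileOrder.length; slot++) {
    int src = tileOrder[slot];
    if (src == emptyIndex) continue;       // skip the removed piece
    int sx = (src  % cols) * tw, sy = (src  / cols) * th;   // where the piece comes from in the camera image
    int dx = (slot % cols) * tw, dy = (slot / cols) * th;   // where the slot is on screen
    copy(cam, sx, sy, tw, th, dx, dy, tw, th);
  }
}
```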

In my final project, I made major improvements to the interaction of the game. After learning about the Kinect, I replaced the keyboard controls with body movements. Prof. Moon helped me build the swipe gesture on top of the existing library. Basically, the Kinect has a depth sensor and is good at detecting the point closest to it. When we make a swipe gesture, we naturally reach our hand and arm out in front of our body, so the hand becomes the closest point to the Kinect. As a result, the movement of the closest point stays in sync with the direction of the swipe, and it can be tracked simply by watching how its coordinates change. The difference between the closest point in the current frame and in the previous frame tells us which way the point is moving, which in turn represents the direction of the swipe gesture.
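
Roughly, the logic looks like this. It is a simplified sketch rather than the library code Prof. Moon used: depthMap, depthW and depthH stand in for whatever depth data the Kinect library actually provides, and the threshold of 20 pixels is just an example.

```java
PVector closest = new PVector();       // closest point in the current frame
PVector prevClosest = new PVector();   // closest point in the previous frame

void updateClosestPoint(int[] depthMap, int depthW, int depthH) {
  int minDepth = Integer.MAX_VALUE;
  for (int y = 0; y < depthH; y++) {
    for (int x = 0; x < depthW; x++) {
      int d = depthMap[x + y * depthW];
      if (d > 0 && d < minDepth) {     // 0 usually means "no reading"
        minDepth = d;
        closest.set(x, y);
      }
    }
  }
  // The frame-to-frame difference tells us which way the hand is moving.
  float dx = closest.x - prevClosest.x;
  float dy = closest.y - prevClosest.y;
  if (abs(dx) > abs(dy)) {
    if (dx > 20)       println("swipe right");
    else if (dx < -20) println("swipe left");
  } else {
    if (dy > 20)       println("swipe down");
    else if (dy < -20) println("swipe up");
  }
  prevClosest.set(closest);
}
```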

When we were testing the swipe gesture, I ran into the question of how to repeat a movement in the same direction more than once. To be more specific, if I want to move the tiles upward several times in a row, I will swipe my hand up, pull it back, and swipe up again. However, the sensitive Kinect will record the small "pull it back" motion and read it as a "swipe down" based on the coordinate values. Therefore, I needed to set up a break between movements and let the computer know which ones should count. Prof. Moon quickly solved this problem by adding an interval between movements. The interval works like a timer: the next movement will not be triggered until it has counted down from its maximum to 0.
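
In code, the interval is little more than a counter that blocks new swipes while it is still counting down. Again this is a sketch with made-up names, not the exact code from the project:

```java
int cooldown = 0;             // frames left before the next swipe is accepted
final int COOLDOWN_MAX = 20;  // roughly a third of a second at 60 fps

void draw() {
  if (cooldown > 0) cooldown--;    // the timer counts from max down to 0
}

// Called whenever the closest point has moved far enough in one direction.
// Returns true only when the swipe should actually move a tile.
boolean acceptSwipe() {
  if (cooldown > 0) return false;  // still inside the break: ignore the "pull it back" motion
  cooldown = COOLDOWN_MAX;         // restart the timer for the next swipe
  return true;
}
```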

As for the randomization, I searched online and found an algorithm called the Fisher–Yates shuffle. Here is the sample code. At first I did not understand how it implemented the randomization. My friend Joe Shen helped me apply the structure to my tiles and explained the basic idea to me. The two parts below designate the targets for the shuffle.
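
The idea of the shuffle is to walk through the array from the end and swap each element with a randomly chosen element at or before it. Applied to the array that records which piece of the image sits in which slot, a minimal version looks like this (tileOrder is the same illustrative name as above):

```java
// Fisher–Yates shuffle over the tile order.
void shuffleTiles(int[] tileOrder) {
  for (int i = tileOrder.length - 1; i > 0; i--) {
    int j = (int) random(i + 1);   // random index between 0 and i, inclusive
    int temp = tileOrder[i];       // swap slots i and j
    tileOrder[i] = tileOrder[j];
    tileOrder[j] = temp;
  }
}
```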

In order to start the shuffle, I also needed to put the following code after the conditions. At first the shuffle happened once at the beginning of the game, but later I decided to add more interaction and tried to shuffle the tiles when the player stepped within a certain depth range. However, it was difficult to control the tracking of the closest point that way, so I attached the shuffle function to the swipe gesture instead. Once the player swiped upward several times in a row, the tiles would shuffle themselves randomly.
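
The trigger itself can be as simple as counting consecutive upward swipes. This sketch builds on the illustrative shuffleTiles() and tileOrder above, and the threshold of three swipes is just an example:

```java
int upSwipes = 0;              // how many upward swipes in a row
final int SHUFFLE_AFTER = 3;   // example threshold before the tiles reshuffle

void onSwipe(int dir) {        // dir uses Processing's UP/DOWN/LEFT/RIGHT constants
  if (dir == UP) {
    upSwipes++;
    if (upSwipes >= SHUFFLE_AFTER) {
      shuffleTiles(tileOrder); // reuse the Fisher–Yates function from above
      upSwipes = 0;
    }
  } else {
    upSwipes = 0;              // any other direction breaks the streak
  }
}
```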

Here is the demo video that I took during the semester show.

My project still has a lot of potential for development. As the guests mentioned during the final presentation, I could consider a better user interface beyond the plain real-time image. It would also be nice to animate the movement of the tiles with the lerp() function, as sketched below. Besides, some of my classmates pointed out an accuracy problem after playing the game, which needs attention and major improvement. With regard to the shuffle, I am thinking about a more intuitive movement to trigger it, so that it fits better with our pre-existing habits and knowledge.
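
The animation idea would be to give each tile a current position and a target position, and to lerp() between them every frame instead of jumping directly, roughly like this (an illustrative sketch, not implemented in the project yet):

```java
// Each frame the tile moves a fraction of the remaining distance to its target.
float tileX, tileY;        // where the tile is drawn right now
float targetX, targetY;    // where the tile should end up after a move

void updateTilePosition() {
  tileX = lerp(tileX, targetX, 0.2);   // 0.2 = move 20% closer each frame
  tileY = lerp(tileY, targetY, 0.2);
}
```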
