openFrameworks

Project: FOOD FIGHT!

Food Fight Final PowerPoint Presentation

Concept/Inspirations:

My concept was inspired by a visit to the Shadow Garden and Sand interactive installation at the Sony Wonder Technology Lab. As I exited, I caught a glimpse of an advertisement for Cloudy With A Chance of Meatballs 2, and the idea of creating an interactive installation around food took shape. The installation at the Sony lab used a projector, a camera, and a wall to create an environment where people could catch the pixels of sand with their shadows. I planned to translate that same connection between the audience and their shadow into my project, turning the player's gestural input into interactive output.

In my concept I pursued two-dimensional cartoon food because it looked more appetizing than realistic three-dimensional food graphics, and it evoked the childhood nostalgia of food fights. I aimed to work with graphics that expressed this childish style while using a depth sensor to drive a food fight simulation. The food graphics were created in Adobe Photoshop and Illustrator. The Kinect's infrared sensor tracks the hand-throwing gesture, engaging the player in a space where they can enjoy throwing food that splats on a mirrored screen. The combination of the Kinect's tracking and the mirroring of the player's actions on the display produces a play space where players connect with themselves through the replicated gestures of their own image on the screen.

My design goal was to create a fun food fight experience for the player without the mess.

First Iteration:

My first setup was conceived as a two-player game, with one player throwing food at another who dodged it. I created the mock-up below of how the two players would interact, and ran into the problem of how the dodging player would see the food being thrown: that player would have to face the wall to see the projected food the first player threw at them. Having the players not face each other seemed counterintuitive for a versus mode in which they were supposed to face off, as depicted in the side-view mock-up. The other issue was that the dodging player could not throw food back, so both players could not enjoy the core pleasure of throwing food at each other; one player dominated by attacking while the targeted player had no way to fight back.

[Image: two-player layout mock-up]

Another complication was setting up the room with a different sensor for each player. The thrower needed to be tracked with the Kinect to detect the throwing motion, and the dodger needed a separate sensor to track their shadow while the projector projected the falling food and cast the player's shadow. The side-view layout below shows how both players could be tracked by placing the sensors opposite each other. However, if either player walked into the other's tracking area, the play environment would break down because of tracking errors. This arrangement required a large room yet still created a limited play space.

[Image: side-view layout of the two-player setup]

 

Second Iteration:

With this conclusion, I redirected the project's focus to the core element I wanted to recreate: a single player throwing food at a target. This led to a new setup using openFrameworks, a Kinect, a laptop, and a minimal play space of about six feet, as shown below. The experience mirrors the player's image as they throw food at a circular target.

[Image: single-player throwing layout]

 

Early PlayTech Testing:

At PlayTech, I tested a very early version in which only the right-hand throw was tracked and the target repositioned to a random x and y position every time it was hit. The kids gave good feedback on the play experience and made good suggestions, such as adding the ability to throw with both hands.

[Image: screenshot from PlayTech testing]
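A rough sketch of that early hit-and-reposition behavior, written for openFrameworks; the variable names are illustrative rather than the project's actual identifiers:

```cpp
#include "ofMain.h"

// Early-build behavior from the PlayTech version: when a thrown food item
// reaches the target, move the target to a new random on-screen position.
// All names here are illustrative, not the project's actual identifiers.
bool checkTargetHit(const ofVec2f &foodPos, ofVec2f &targetPos, float targetRadius) {
    if (foodPos.distance(targetPos) < targetRadius) {
        targetPos.set(ofRandom(targetRadius, ofGetWidth()  - targetRadius),
                      ofRandom(targetRadius, ofGetHeight() - targetRadius));
        return true;   // caller can play the splat sound, draw the splatter, etc.
    }
    return false;
}
```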

Technologies:

Adobe Photoshop & Illustrator – I used these tools to draw the .png files of the food, splatters, and layouts.

openFrameworks, Kinect, & openNI – I used the openNI addon for openFrameworks to do skeleton tracking with the Kinect. The addon drew a skeleton frame and a mask of the detected user. I then altered the code to track the throwing motion of the right hand first, as a test, and worked through pseudocode for what I did and did not want tracked, such as the reverse motion when the hand winds up to throw. I isolated variables to store the previous and current position of the right hand, mapped the distance between those two points, and set a threshold on the speed and direction of the throw, which was then applied to the launched food object (in the early stages, a simple circle).

Once throwing was coded, the two-dimensional circle traveled along the z-axis (depth) away into the distance, which I realized did not create the perspective I intended. Since the player is mirrored on the display, I wanted the food to move toward the screen, so I reversed the axis it traveled on to make the circle appear to fly at the screen. I then added a circular white target for the player to throw at, set to redraw at a random x and y position on the play screen after being hit.

After those features were in place, I added the testers' suggested feature of throwing with both hands. With the threshold speed set to detect a forward throw, whichever hand moves faster launches the food, and the item projects out of that hand. The food item is chosen at random from an array that loads the available food images. When the food hits the target, a corresponding splatter is generated based on the food thrown, and a splat sound plays. The splatter appears where the food hit, the target moves to another location, and the splatter lingers for a few seconds before fading away so it does not obstruct the player's view. In the game version, the start and game-over screens use buttons that the player activates by touching them with their hand's x and y position.
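A minimal sketch of that throw detection, assuming the hand positions are supplied each frame from the openNI skeleton; the threshold value and the convention that a forward throw moves the hand toward the sensor (decreasing z) are my assumptions here, not the project's exact numbers:

```cpp
#include "ofMain.h"

// Sketch of the throw detection described above. Hand positions are assumed
// to come from the openNI skeleton each frame. The threshold value and the
// "forward means decreasing z" convention are illustrative assumptions.
ofVec3f prevRight, prevLeft;           // hand positions from the previous frame
const float throwThreshold = 900.0f;   // speed threshold, tracker units per second

// dt: seconds since the last frame, e.g. ofGetLastFrameTime()
void updateThrow(const ofVec3f &rightHand, const ofVec3f &leftHand, float dt) {
    // Speed of each hand: distance covered since the last frame over elapsed time.
    float rightSpeed = rightHand.distance(prevRight) / dt;
    float leftSpeed  = leftHand.distance(prevLeft)  / dt;

    // Only a fast forward motion counts; the reverse wind-up is ignored
    // because the hand is moving away from the screen on that pass.
    bool rightThrows = rightSpeed > throwThreshold && rightHand.z < prevRight.z;
    bool leftThrows  = leftSpeed  > throwThreshold && leftHand.z  < prevLeft.z;

    if (rightThrows || leftThrows) {
        // Whichever hand moved faster launches the food from its position.
        ofVec3f launchFrom = (rightSpeed >= leftSpeed) ? rightHand : leftHand;
        ofLogNotice("foodFight") << "throw from " << launchFrom;
        // From here the real code picks a random food image from the loaded
        // array and animates it from launchFrom toward the screen, along the
        // reversed z-axis direction described above.
    }

    prevRight = rightHand;
    prevLeft  = leftHand;
}
```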

 

Food Added:

[Image: screenshot of the added food items]

 Green Figure Added:

The masked figure's aesthetic clashed with the cartoon food graphics and splats. To replace it, I needed to draw a new figure based on the player's tracked body point positions. Using polygon shapes in openFrameworks, I drew a polygon-based stick figure with an oversized head to keep with the cartoon-like theme of the play experience. Below, the new play figure takes a bright green polygon form, staying neutral with respect to gender-coded colors and body shape so it suits my full audience range.

[Image: green polygon figure]
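A simplified sketch of how such a figure can be drawn from tracked joint positions with openFrameworks polygon calls; the real figure uses more of the skeleton's joints, and the proportions here are placeholders:

```cpp
#include "ofMain.h"

// Simplified sketch of the green polygon figure: a filled polygon body drawn
// through tracked joint positions plus an oversized head. The joints are
// assumed to be already mapped into screen coordinates.
void drawGreenFigure(const ofVec2f &head, const ofVec2f &leftHand,
                     const ofVec2f &rightHand, const ofVec2f &leftFoot,
                     const ofVec2f &rightFoot) {
    ofSetColor(0, 255, 0);          // bright green keeps the figure gender-neutral

    // Body as one closed polygon through the hands and feet.
    ofBeginShape();
    ofVertex(leftHand.x,  leftHand.y);
    ofVertex(rightHand.x, rightHand.y);
    ofVertex(rightFoot.x, rightFoot.y);
    ofVertex(leftFoot.x,  leftFoot.y);
    ofEndShape(true);

    // Oversized head to keep the cartoon-like theme.
    ofDrawCircle(head.x, head.y, 60);
}
```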

 

Final Presentation Set-up:

In my final presentation setup below, I connected my computer to a television screen, giving the player a larger display for their food fight experience. The Kinect sensor is placed in front of the television, and the computer running openFrameworks sits to the side of the television cart. This setup gives the player a better sense of existing inside the food-throwing environment.

[Photo: final presentation setup]

[Images: screenshots of the final game screens]

Project Links:

Game Play Experience (w/start & game over) – https://github.com/huynj316/foodFight-1.git

Play Experience only – https://github.com/huynj316/foodFight-MS.git

Research paper  – Zipped Research Paper

Personal website – http://www.gink-arts.com/food-fight-final/

Final Videos:

Screen recording:

Live Demo:

Future Aspirations:

Prospectively, I would like to build out a fuller virtual play experience with a cafeteria background and chattering children as background noise. The circular target would become another player, tracked by head position, with each Kinect sending position points back and forth between two computers through a Spacebrew connection. The Spacebrew connection would deliver each player's head position to the other player's openFrameworks sketch, so both players could enjoy the experience with their code running on openFrameworks and Spacebrew. Each screen would show a live feed of the other player's head position so they can throw at each other and dodge. The polygonal figures would be refined to better match the style of the cartoon food, and facial expressions would be added for hits and misses.
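A rough sketch of that planned data flow only: each player's sketch would publish its own head position every frame and subscribe to the other player's. The two functions below are placeholders for whatever the Spacebrew addon actually exposes, not real addon calls:

```cpp
#include "ofMain.h"

// Illustration of the planned two-player data flow. The publish/subscribe
// functions are placeholders standing in for the Spacebrew addon's API.
struct HeadPosition {
    float x, y;   // head position, normalized to 0..1 of the screen
};

// Placeholder publisher: in the real setup this value would go out over the
// Spacebrew connection on a named route (e.g. a hypothetical "headPosition").
void publishHeadPosition(const HeadPosition &head) {
    ofLogNotice("foodFight") << "publish head " << head.x << ", " << head.y;
}

// Placeholder subscriber callback: the remote player's head position arrives
// here and becomes the target this player throws at (or dodges from).
void onRemoteHeadPosition(const HeadPosition &head, ofVec2f &remoteTarget) {
    remoteTarget.set(head.x * ofGetWidth(), head.y * ofGetHeight());
}
```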
