This is a game developed in Java and Arduino, with real fruits, together with Simone Carcone and Giulia Dell’armi: the Let’s Make guys. We created it for the Let’s Make event.
This is a game developed in Java, with the Kinect and 3D models made in Maya, together with Simone Carcone, Giulia Dell’armi and Adriano Muraca: the Let’s Make guys. We created it for the Let’s Make inauguration, to entertain the people who came.
This is my first test with Kinect user skeleton tracking in Java, made for a game we’re building for the Let’s Make inauguration.
I took the coordinates of the head joint and used them to move the background, recreating a parallax effect.
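A minimal sketch of how the parallax idea could work, assuming the head joint is already projected into screen space: the head’s offset from a rest point is scaled down and applied to the background in the opposite direction, so the background appears to sit behind the player. The class, method names and scale factor here are my own, not the original code.

```java
// Hypothetical parallax sketch: shift the background opposite the
// head's displacement from its rest position, scaled by a factor.
public class Parallax {
    // Factor < 1 makes the background move slower than the head,
    // which is what sells the depth illusion. Tune per setup.
    static final float FACTOR = 0.3f;

    // headX/headY: head joint in screen space;
    // restX/restY: the head's calibrated rest position.
    public static float[] backgroundOffset(float headX, float headY,
                                           float restX, float restY) {
        return new float[] {
            -(headX - restX) * FACTOR,
            -(headY - restY) * FACTOR
        };
    }
}
```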
Using the hip and head joints I check whether the user jumps, crouches, or moves left or right. When a movement is detected, the software writes it on the window.
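The jump/crouch/left/right check could be done roughly like this (my guess at the approach, not the original code): compare the current head and hip joint positions against calibrated rest positions, with a threshold acting as a dead zone so skeleton-tracking jitter doesn’t register as movement.

```java
// Hypothetical movement detector: head height drives jump/crouch,
// hip x-position drives left/right, all relative to rest positions.
public class MovementDetector {
    static final float THRESHOLD = 0.15f; // meters; tune per setup

    public static String detect(float headY, float hipX,
                                float restHeadY, float restHipX) {
        if (headY - restHeadY > THRESHOLD)  return "JUMP";
        if (restHeadY - headY > THRESHOLD)  return "CROUCH";
        if (hipX - restHipX > THRESHOLD)    return "RIGHT";
        if (restHipX - hipX > THRESHOLD)    return "LEFT";
        return "IDLE"; // inside the dead zone: ignore noise
    }
}
```

The returned string is what would get written on the window.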
Yes, programming with the Kinect makes me sweat.
This is my latest facial performance capture test.
In the previous test I sent the x,y position of each marker inside the frame (720p) and assigned those position values directly to the cubes. Now I send the offset vector of each marker, calculated by subtracting the rest position from the new position.
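The change described above boils down to one subtraction per marker. A small sketch, with parallel arrays of [x, y] pairs as an assumed data layout (the real software may store markers differently):

```java
// Compute per-marker offset vectors: new position minus rest position.
public class MarkerOffsets {
    // positions and rest are parallel arrays, one [x, y] pair per marker
    public static float[][] offsets(float[][] positions, float[][] rest) {
        float[][] out = new float[positions.length][2];
        for (int i = 0; i < positions.length; i++) {
            out[i][0] = positions[i][0] - rest[i][0];
            out[i][1] = positions[i][1] - rest[i][1];
        }
        return out;
    }
}
```

Sending offsets instead of raw frame coordinates means the receiving side no longer depends on the 720p frame size, and a marker at rest always maps to a zero vector.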
In this new version of the software I created an interface to tell Maya how many markers, and which ones, I’m using in the current capture session.
I tried to recreate Weta’s system for facial performance capture using a helmet camera.
I wrote a simple piece of software in Java, based on color blob tracking, that searches for the markers placed on the face and sends their x,y positions to Maya. This data is sent over UDP, which is faster than TCP because there is no packet control; that’s why it’s used for streaming.
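The UDP side in Java is short. A minimal sketch, assuming a plain-text message format ("id:x,y" per marker) and a hypothetical port 5005 where a Maya-side script would listen; the actual message format and port in the original software may differ.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

// Hypothetical marker sender: encodes one marker's position as
// "id:x,y" text and fires it over UDP (no acknowledgement, no
// retransmission — lost packets are simply dropped).
public class MarkerSender {
    public static String encode(int id, float x, float y) {
        return id + ":" + x + "," + y;
    }

    public static void send(String msg, String host, int port) throws Exception {
        byte[] data = msg.getBytes(StandardCharsets.UTF_8);
        try (DatagramSocket socket = new DatagramSocket()) {
            socket.send(new DatagramPacket(data, data.length,
                    InetAddress.getByName(host), port));
        }
    }
}
```

Usage would be something like `MarkerSender.send(MarkerSender.encode(3, 12.5f, -4.0f), "127.0.0.1", 5005)` once per marker per frame; dropping the occasional packet is fine here, since the next frame overwrites it anyway.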
The hardware is a helmet camera I made myself from a skater helmet and an HD webcam.
It’s a very simple version because I wrote it in a short time, but I’m happy with it anyway.
The next step is to connect it to the Kinect and get z-depth in grayscale instead of plain white or black.
Little problem… I don’t have a Kinect. Sigh.