kinect user skeleton with java test 1

This is my first test of Kinect user skeleton tracking in Java, made for a game we're building for the Let's Make inauguration.

I take the coordinates of the head joint and use them to move the background, recreating a parallax effect.
With the hip and head joints I check whether the user jumps, lowers, or moves left or right.
When a movement is detected, the software writes it in the window.
Yes, programming with the Kinect makes me sweat :D
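The core of the idea boils down to something like this: a rough sketch, not the actual game code. The joint coordinates are assumed to arrive from the Kinect bindings as plain x/y floats with image-style axes (y grows downward), and the threshold values are placeholders.

```java
// Rough sketch: joint positions are assumed to come from the Kinect bindings
// as plain x/y floats, with image-style axes where y grows downward.
public class SkeletonMoves {

    // Parallax: shift the background proportionally to how far the head
    // is from the centre of the frame.
    static float parallaxOffset(float headX, float frameWidth, float strength) {
        return (headX - frameWidth / 2f) * strength;
    }

    // Rest pose, captured once while the user stands still.
    static float restHeadY, restHipX;

    // Placeholder thresholds, tuned by hand.
    static final float JUMP = 80f, LOWER = 80f, SIDE = 100f;

    // Classify the current movement from the head and hip joints.
    static String detectMove(float headY, float hipX) {
        if (restHeadY - headY > JUMP)  return "JUMP";   // head went up
        if (headY - restHeadY > LOWER) return "LOWER";  // head went down
        if (hipX - restHipX > SIDE)    return "RIGHT";
        if (restHipX - hipX > SIDE)    return "LEFT";
        return "IDLE";
    }

    public static void main(String[] args) {
        restHeadY = 240f; restHipX = 320f;            // pretend calibration values
        System.out.println(detectMove(140f, 320f));   // prints JUMP
    }
}
```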

FACIAL PERFORMANCE CAPTURE TEST 2

This is my latest facial performance capture test.

In the previous test I sent the x,y position of each marker inside the frame (720p) and assigned those position values directly to the cubes. Now I send the offset vector of each marker, calculated by subtracting the rest position from the new position.
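In code the change is roughly this (a sketch with made-up names, not the real classes):

```java
// Sketch: the rest position is stored once per marker while the face is
// relaxed; each frame then sends the offset instead of the raw 720p pixels.
class Marker {
    final String name;
    float restX, restY;   // neutral/rest pose, captured once
    float x, y;           // current tracked position in the frame

    Marker(String name) { this.name = name; }

    float offsetX() { return x - restX; }
    float offsetY() { return y - restY; }
}
```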

In this new version of the software I created an interface that tells Maya how many markers, and which ones, I'm using in the current capture session.

marker_interface
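The exact message format isn't shown here, but the idea is to send the selected marker names to Maya once, before streaming the per-frame offsets. A hypothetical sketch of such a "session header" over the same UDP channel could look like this:

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;
import java.util.List;

// Hypothetical "session header": tells the Maya-side script how many markers
// to expect, and their names, before the per-frame offsets start streaming.
public class MarkerHeader {
    public static void sendMarkerList(List<String> markers, String host, int port) throws Exception {
        String msg = "MARKERS " + markers.size() + " " + String.join(",", markers);
        byte[] data = msg.getBytes(StandardCharsets.UTF_8);
        try (DatagramSocket socket = new DatagramSocket()) {
            socket.send(new DatagramPacket(data, data.length, InetAddress.getByName(host), port));
        }
    }
}
```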

Facial Performance capture test 1

I tried to recreate Weta's system for facial performance capture using a helmet camera.

rise-planet-apes-motion-capture-andy-serkis

I wrote a simple piece of software in Java, based on color blob tracking, that searches for the markers placed on the face and sends their x,y positions to Maya. The data is sent over UDP, which is faster than TCP/IP because packets are not checked or retransmitted; that's why it's used for streaming.
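The blob-tracking part can be reduced to a color threshold plus a centroid. This is a heavily simplified sketch of that idea (one blob only, with a crude "is blue" test), not the actual tracker; the resulting positions would then be packed into a small UDP datagram per frame, much like the header packet shown above.

```java
import java.awt.image.BufferedImage;

// Very simplified version of the idea: scan the webcam frame for "blue enough"
// pixels and average their coordinates to get one blob centroid. The real
// tracker handles several markers; this only shows the color-threshold concept.
public class BlueBlob {
    // Returns {x, y} of the blue centroid, or null if nothing matched.
    public static float[] findCentroid(BufferedImage frame) {
        long sumX = 0, sumY = 0, count = 0;
        for (int y = 0; y < frame.getHeight(); y++) {
            for (int x = 0; x < frame.getWidth(); x++) {
                int rgb = frame.getRGB(x, y);
                int r = (rgb >> 16) & 0xFF, g = (rgb >> 8) & 0xFF, b = rgb & 0xFF;
                if (b > 120 && b > r + 40 && b > g + 40) { // crude "is blue" test
                    sumX += x; sumY += y; count++;
                }
            }
        }
        return count == 0 ? null : new float[] { sumX / (float) count, sumY / (float) count };
    }
}
```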

The hardware is a helmet camera I made myself, using a skate helmet and an HD webcam.

helmet_camera

The markers are blue pieces of scotch tape for now, but I'm working on a version with a blue makeup pencil (stolen from my girlfriend :D) and a 3D head in place of the cubes.
