Visuals From Motion
Luka Tsulaia
The goal of this project is to build a live, interactive visual experience driven by body-movement detection. To achieve this, I plan to use a Microsoft Kinect device and Unity3D. First I will implement the integration between the MS Kinect SDK and Unity3D, including importing the silhouette and skeleton data streams. Then I will implement visuals using textures, particle effects, and/or lighting.
Planned Features:
- Import human silhouette and skeleton data streams into Unity
- Implement masking of the background and scrolling textures
- Implement hand and face detection
- Add particle effects to hands and faces
- Add music and Fast Fourier Transform (FFT) beat detection
- Implement gesture recognition and wire it to the visuals
- Bonus: overall polishing
A similar, already-implemented project: https://vimeo.com/221538677
Milestone 1 (2.03)
--Missed--
Milestone 2 (16.03)
- Set up the MS Kinect SDK and Unity interaction
- Import human silhouette and skeleton data streams into Unity
- Implement a masking shader using UnityUI
- Add textures to the silhouettes and background
Results:
Demo Video #1: https://youtu.be/Xc91abtd-2U
First, I decided to approach this project by building loosely coupled components. For this purpose I am using Zenject, a dependency injection framework for Unity3D. It will keep the project flexible and stable as the number of components grows.
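The value of the dependency-injection setup is that visual components depend on interfaces rather than on Kinect-specific code, so data sources can be swapped (e.g. for a fake source during testing). The Python below is a hypothetical, language-agnostic sketch of that idea, not Zenject API; all class names are invented for illustration.

```python
# Illustration of constructor (dependency) injection: components depend
# on an abstract interface, and a composition root wires concrete
# implementations together. In the real project Zenject installers play
# the composition-root role; the names below are invented.

class IBodySource:
    """Abstract source of skeleton frames."""
    def next_frame(self):
        raise NotImplementedError

class FakeBodySource(IBodySource):
    """Stand-in for the Kinect stream, useful for testing visuals."""
    def next_frame(self):
        return {"hand_left": (0.1, 0.2, 1.5)}

class SkeletonRenderer:
    """Depends only on the IBodySource interface, not on Kinect code."""
    def __init__(self, source: IBodySource):
        self.source = source

    def render_once(self):
        frame = self.source.next_frame()
        return sorted(frame.keys())

# Composition root: swap FakeBodySource for a real Kinect-backed source
# without touching SkeletonRenderer.
renderer = SkeletonRenderer(FakeBodySource())
print(renderer.render_once())  # ['hand_left']
```

This is the same loose coupling that lets the visuals be developed and tested without a Kinect plugged in.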
In this iteration I implemented streaming of the Body Skeleton and Body Index data streams from the Kinect SDK into Unity3D. However, I haven't yet decided how to implement skeleton rendering; this will be one of the tasks for the next milestone.
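The Body Index stream is what makes the silhouette masking possible: in the Kinect SDK, each body-index pixel holds a value of 0–5 for a tracked body and 255 for background. The real project applies this per-pixel test in a shader; the Python below is only a sketch of the per-pixel logic.

```python
# Sketch of background masking from a Kinect body-index frame.
# Pixels 0-5 mark tracked bodies; 255 marks background.

BACKGROUND = 255

def silhouette_mask(body_index_frame):
    """Return 1.0 where a body is present, 0.0 for background pixels."""
    return [0.0 if px == BACKGROUND else 1.0 for px in body_index_frame]

frame = [255, 255, 0, 0, 1, 255]
print(silhouette_mask(frame))  # [0.0, 0.0, 1.0, 1.0, 1.0, 0.0]
```

In the shader version, the resulting mask multiplies the silhouette texture so that scrolling textures only appear where a body is detected.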
Milestone 3 (30.03)
- Add visualization of the Kinect body joint data
- Polish the background textures and add a scrolling animation
- Add face and hand tracking
Results:
Demo Video #2: https://youtu.be/Ptdd4RQUVoI
I've implemented body joint visualization using simple 3D spheres. I have also been experimenting with a test lighting scene, using the Progressive Lightmapper, realtime lights, and the Unity Post-Processing Stack asset. The final goal on this side is to control the lights with your hands.
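Controlling a light with a hand reduces to mapping a hand-joint coordinate to a light parameter. A minimal sketch of that mapping, assuming Kinect camera-space coordinates in meters and an invented height range (the range values are assumptions, not measured):

```python
# Hypothetical mapping from hand height (Kinect camera-space Y, meters)
# to a light intensity in [0, 1]. The y_min/y_max range is an assumed
# comfortable reach range, to be tuned in practice.

def hand_height_to_intensity(hand_y, y_min=-0.5, y_max=1.0):
    """Linearly map hand height to light intensity, clamped to [0, 1]."""
    t = (hand_y - y_min) / (y_max - y_min)
    return max(0.0, min(1.0, t))

print(hand_height_to_intensity(0.25))  # 0.5: hand at mid-range
print(hand_height_to_intensity(2.0))   # 1.0: clamped at full intensity
```

In Unity the output would be assigned to a Light component's intensity each frame; the same clamp-and-lerp pattern applies to color or post-processing parameters.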
With body skeleton data it is easy to track hand joint positions, but I haven't yet worked on proper coordinate mapping between the Kinect, camera, and Unity world spaces.
Milestone 4 (13.04)
--Missed--
Milestone 5 (27.04)
--Missed--
Milestone 6 (11.05)
- Implement proper coordinate mapping
- Continue working on the lighting scene
I added proper mapping from world space to camera space and implemented skeleton rendering in camera space.
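The core of this mapping is projecting a 3D camera-space joint onto the 2D viewport. The sketch below assumes a simple pinhole camera model with invented field-of-view and aspect values; the actual project delegates this to Unity's camera, so treat the parameters as assumptions.

```python
# Sketch of projecting a camera-space joint (meters, camera at origin,
# Z pointing forward) to normalized viewport coordinates in [0, 1]^2,
# using a pinhole model. FOV and aspect ratio are assumed values.
import math

def project_to_viewport(point, vertical_fov_deg=60.0, aspect=16 / 9):
    """Project a camera-space (x, y, z) point to viewport coordinates."""
    x, y, z = point
    if z <= 0:
        return None  # behind the camera, not visible
    f = 1.0 / math.tan(math.radians(vertical_fov_deg) / 2.0)
    ndc_x = (x / z) * f / aspect   # normalized device coords in [-1, 1]
    ndc_y = (y / z) * f
    return ((ndc_x + 1.0) / 2.0, (ndc_y + 1.0) / 2.0)

# A joint straight ahead of the camera lands at the viewport center.
print(project_to_viewport((0.0, 0.0, 2.0)))  # (0.5, 0.5)
```

In Unity the equivalent one-liner is `Camera.WorldToViewportPoint`, which is why rendering the skeleton in camera space made the joints line up with the silhouette.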
Milestone 7 (25.05)
- Add particle effects
- Add music and Fast Fourier Transform beat detection
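The planned beat detection can be sketched as follows: take the FFT of each short audio frame, sum the energy in the low-frequency band, and flag a beat when that energy clearly exceeds the recent average. The Python below uses a naive standard-library DFT; the frame size, band split, and threshold are assumptions to be tuned, and the real project would use an optimized FFT from an audio library.

```python
# Sketch of energy-based beat detection over an FFT spectrum.
# Naive O(n^2) DFT from the standard library, fine for a sketch.
import cmath
import math

def dft_magnitudes(samples):
    """Magnitude spectrum for the first half of the DFT bins."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) for k in range(n // 2)]

def is_beat(frame, history_energies, threshold=1.5):
    """Flag a beat when this frame's low-band energy clearly exceeds
    the average of recent frames. Returns (beat_flag, this_energy)."""
    mags = dft_magnitudes(frame)
    low = sum(m * m for m in mags[: max(1, len(mags) // 8)])
    if not history_energies:
        return False, low
    avg = sum(history_energies) / len(history_energies)
    return low > threshold * avg, low

# Three quiet frames followed by a loud frame at the same frequency
# (2 cycles per 64 samples, i.e. in the low band):
quiet = [0.01 * math.sin(2 * math.pi * 2 * t / 64) for t in range(64)]
loud = [1.0 * math.sin(2 * math.pi * 2 * t / 64) for t in range(64)]
history = []
for f in (quiet, quiet, quiet):
    _, energy = is_beat(f, history)
    history.append(energy)
beat, _ = is_beat(loud, history)
print(beat)  # True
```

Each detected beat would then trigger a visual event, such as a particle burst or a light pulse.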