The team at Laan Labs used the String Augmented Reality SDK to display real-time 3D video and audio recorded from the Kinect. libfreenect, from the http://openkinect.org/ project, was used to record the data coming from the Kinect. A textured mesh was created from the calibrated depth and RGB data for each frame and played back in real time. A simple depth cutoff allowed them to isolate the person in the video from the walls and other objects.
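A minimal sketch of the capture side, assuming the libfreenect C API: a depth callback masks out every sample beyond a fixed cutoff so that only the foreground person feeds the textured mesh. The frame size, the raw 11-bit threshold of 750 (roughly one metre), and the mesh-builder hook are illustrative assumptions, not Laan Labs' actual code.

```c
/* Hedged sketch: capture Kinect depth with libfreenect and apply a simple
 * depth cutoff. Threshold and frame geometry are assumptions. */
#include <stdint.h>
#include <stdio.h>
#include "libfreenect.h"

#define FRAME_W 640
#define FRAME_H 480
#define CUTOFF  750   /* raw 11-bit depth threshold (assumed), roughly ~1 m */

/* Depth callback: zero out samples beyond the cutoff so only the
 * foreground person contributes to the textured mesh. */
static void depth_cb(freenect_device *dev, void *v_depth, uint32_t timestamp)
{
    uint16_t *depth = (uint16_t *)v_depth;
    for (int i = 0; i < FRAME_W * FRAME_H; i++) {
        if (depth[i] > CUTOFF)
            depth[i] = 0;          /* discard background samples */
    }
    /* ...hand the masked depth frame to the mesh builder here... */
}

int main(void)
{
    freenect_context *ctx;
    freenect_device  *dev;

    if (freenect_init(&ctx, NULL) < 0 || freenect_open_device(ctx, &dev, 0) < 0) {
        fprintf(stderr, "could not open Kinect\n");
        return 1;
    }

    freenect_set_depth_callback(dev, depth_cb);
    freenect_set_depth_mode(dev,
        freenect_find_depth_mode(FREENECT_RESOLUTION_MEDIUM, FREENECT_DEPTH_11BIT));
    freenect_start_depth(dev);

    /* Pump USB events; each completed frame triggers depth_cb. */
    while (freenect_process_events(ctx) >= 0)
        ;

    freenect_stop_depth(dev);
    freenect_close_device(dev);
    freenect_shutdown(ctx);
    return 0;
}
```

The RGB stream would be captured the same way via freenect_set_video_callback and textured onto the mesh built from the masked depth grid.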
Using the String SDK, they projected the reconstructed scene back onto a printed image marker in the real world. They also experimented with actively removing the image marker from the view, filling it in with camera data from the areas surrounding the marker.
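The marker removal is only described in passing, and String's API is not shown here, so the sketch below stays generic: given the marker's screen-space bounding rectangle (however the tracker reports it), the marker pixels are overwritten by blending the camera rows just above and below the rectangle. The function name and parameters are assumptions for illustration, not String's or Laan Labs' implementation.

```c
/* Hedged sketch of the marker-hiding idea: fill the marker's bounding
 * rectangle in the camera image with colors interpolated from the rows
 * immediately above and below it. A crude inpainting, enough to make the
 * printed marker fade into the surrounding surface. */
#include <stdint.h>

static void hide_marker(uint8_t *rgb, int width, int height,
                        int x0, int y0, int x1, int y1)
{
    if (y0 <= 0 || y1 >= height) return;   /* need border rows to sample */

    for (int y = y0; y < y1; y++) {
        float t = (float)(y - y0) / (float)(y1 - y0);  /* 0 at top, 1 at bottom */
        for (int x = x0; x < x1; x++) {
            for (int c = 0; c < 3; c++) {
                uint8_t top    = rgb[3 * ((y0 - 1) * width + x) + c];
                uint8_t bottom = rgb[3 * (y1 * width + x) + c];
                rgb[3 * (y * width + x) + c] =
                    (uint8_t)((1.0f - t) * top + t * bottom);
            }
        }
    }
}
```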