Contents

Preface  vii

1. What Is the Kinect?  1
   How Does It Work? Where Did It Come From?  1
   Kinect Artists  10

2. Working with the Depth Image  43
   Images and Pixels  44
   Project 1: Installing the SimpleOpenNI Processing Library  45
   Project 2: Your First Kinect Program  53
   Project 3: Looking at a Pixel  61
   Converting to Real-World Distances  67
   Project 4: A Wireless Tape Measure  69
   Project 5: Tracking the Nearest Object  76
   Projects  84
   Project 6: Invisible Pencil  86
   Project 7: Minority Report Photos  96
   Exercises  107

3. Working with Point Clouds  109
   What You’ll Learn in This Chapter  110
   Welcome to the Third Dimension  111
   Drawing Our First Point Cloud  113
   Making the Point Cloud Move  119
   Viewing the Point Cloud in Color  126
   Making the Point Cloud Interactive  129
   Projects  139
   Project 8: Air Drum Kit  139
   Project 9: Virtual Kinect  157
   Conclusion  182

4. Working with the Skeleton Data  185
   A Note About Calibration  192
   Stages in the Calibration Process  193
   User Detection  194
   Accessing Joint Positions  202
   Skeleton Anatomy Lesson  209
   Measuring the Distance Between Two Joints  219
   Transferring Orientation in 3D  228
   Background Removal, User Pixels, and the Scene Map  246
   Tracking Without Calibration: Hand Tracking and Center of Mass  254
   Projects  263
   Project 10: Exercise Measurement  265
   Project 11: “Stayin’ Alive”: Dance Move Triggers MP3  279
   Conclusion  299

5. Scanning for Fabrication  301
   Intro to Modelbuilder  306
   Intro to MeshLab  313
   Making a Mesh from the Kinect Data  316
   Looking at Our First Scan  322
   Cleaning Up the Mesh  324
   Looking at Our Corrected Model  331
   Prepping for Printing  333
   Reduce Polygons in MeshLab  333
   Printing Our Model on a MakerBot  336
   Sending Our Model to Shapeways  340
   Conclusion: Comparing Prints  342

6. Using the Kinect for Robotics  345
   Forward Kinematics  347
   Inverse Kinematics  367
   Conclusion  378

7. Conclusion: What’s Next?  379
   Beyond Processing: Other Frameworks and Languages  380
   Topics in 3D Programming to Explore  384
   Ideas for Projects  388

A. Appendix  393

Index  411
Preface

When Microsoft first released the Kinect, Matt Webb, CEO of design and invention firm Berg London, captured the sense of possibility that had so many programmers, hardware hackers, and tinkerers so excited: “WW2 and ballistics gave us digital computers. Cold War decentralization gave us the Internet. Terrorism and mass surveillance: Kinect.”

Why the Kinect Matters

The Kinect announces a revolution in technology akin to those that shaped the most fundamental breakthroughs of the 20th century. Just like the premiere of the personal computer or the Internet, the release of the Kinect was another moment when the fruit of billions of dollars and decades of research that had previously only been available to the military and the intelligence community fell into the hands of regular people.
While this development may seem wide-ranging and diverse, it can be summarized simply: for the first time, computers can see. While we’ve been able to use computers to process still images and video for decades, simply iterating over red, green, and blue pixels misses most of the amazing capabilities that we take for granted in the human vision system: seeing in stereo, differentiating objects in space, tracking people over time and space, recognizing body language, etc. For the first time, with this revolution in camera and image-processing technology, we’re starting to build computing applications that take these same capabilities as a starting point. And, with the arrival of the Kinect, the ability to create these applications is now within the reach of even weekend tinkerers and casual hackers.