"The Midas Touch" - Human-Computer Interaction
Project Leader: Professor Dan Vogel
The hope for camera-based input devices like the Microsoft Kinect and Leap Motion is that interacting with computers becomes more natural. But in practice, waving your arm and pointing your finger to navigate something like Netflix can leave a lot to be desired. In this mini-project, we’ll work on a fundamental Human-Computer Interaction problem associated with all computer vision-based input: the “Midas Touch” problem. Like the Greek myth in which everything King Midas touches turns to gold, with computer vision everything you do may be interpreted as input, even if you’re just waving at a friend. The challenge is to find techniques that work well with noisy computer-vision tracking, aren’t too tiring, and are easy and fast to perform. We’ll review current approaches and go through a mini bootcamp in computer vision coding so we can prototype and test new interaction techniques, and experience firsthand what Human-Computer Interaction research is like.
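One widely used mitigation we might prototype is dwell-based selection: a "touch" only counts as input when the tracked pointer stays nearly still for a set time, so casual movement (like waving at a friend) is ignored. The sketch below is purely illustrative, not part of the project materials; the function name, coordinate convention, and thresholds are assumptions chosen for the example.

```python
import math

def dwell_select(samples, radius=0.05, dwell_time=0.8):
    """Return timestamps at which a dwell selection fires.

    samples: list of (t, x, y) pointer samples in normalized coordinates.
    A selection fires when the pointer stays within `radius` of the point
    where the current dwell started for at least `dwell_time` seconds.
    (Illustrative thresholds; real values would be tuned in user testing.)
    """
    selections = []
    anchor = None  # (t, x, y) where the current dwell period began
    for t, x, y in samples:
        if anchor is None:
            anchor = (t, x, y)
            continue
        t0, x0, y0 = anchor
        if math.hypot(x - x0, y - y0) > radius:
            anchor = (t, x, y)      # pointer moved: restart the dwell timer
        elif t - t0 >= dwell_time:
            selections.append(t)    # held still long enough: select
            anchor = None           # require a fresh dwell for the next selection

    return selections

# A steady hover fires a selection; continuous waving fires nothing.
hover = [(0.0, 0.5, 0.5), (0.3, 0.505, 0.5), (0.6, 0.5, 0.505), (0.9, 0.5, 0.5)]
wave = [(0.0, 0.1, 0.1), (0.3, 0.2, 0.2), (0.6, 0.3, 0.3)]
print(dwell_select(hover))  # one selection at t = 0.9
print(dwell_select(wave))   # no selections
```

The trade-off this exposes is exactly the one the project studies: a longer dwell time rejects more accidental input but makes the interaction slower and more tiring, which is why it must be tested with real users and noisy tracking.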