While Microsoft has offered Kinect to Windows developers for some time, the company has never really created a way to use the motion sensor to control a Windows PC with hand gestures. That situation has now changed thanks to a new Microsoft Research project in Cambridge.
According to ZDNet, which got to see a demo of the project earlier this week, the research team has created ways for the Kinect sensor to let users perform a number of actions on both Windows 7 and 8 with hand movements. The story claims it took the researchers 18 months to develop the software for these features.
The software allows a person to use a number of different hand gestures to perform actions on Windows. For example, a user can maximize a window on the screen by clenching a fist in the air to "grab" the window and then moving the hand toward the top of the keyboard. Another gesture uses a pinch of the thumb and forefinger to bring up the Windows 8 search menu.
The prototype shown this week had the Kinect sensor placed directly above a keyboard. However, the article adds that the Microsoft Research team has another setup in which the sensor sits at a 45-degree angle above the keyboard, which might allow it to be placed on top of, or inside, a monitor.
The effort sounds similar to the Leap Motion device, which launched earlier this summer and lets users interact with Windows 8 via motion gestures. Unfortunately, there's no word on when, or even if, Microsoft's research efforts will result in an actual Kinect-based gesture control product for Windows PCs.
Source: ZDNet | Image via ZDNet