
Microsoft researchers looking into broader applications of hand tracking technology

Researchers at Microsoft are looking into the ways in which hand tracking and gesture recognition technology can be used in a wide variety of fields. 
June 27, 2016


The ultimate goal of the research, according to Microsoft, is to allow people to interact with the technology in more natural ways than ever before.

"How do we interact with things in the real world? Well, we pick them up, we touch them with our fingers, we manipulate them," said Jamie Shotton, a principal researcher in computer vision at Microsoft’s Cambridge, UK, research lab. "We should be able to do exactly the same thing with virtual objects. We should be able to reach out and touch them."

Microsoft notes that these technologies are still evolving, but the computer scientists and engineers working on these projects believe they are on the cusp of making hand and gesture recognition technology practical enough for mainstream use, similar to how computer vision technology is used to recognize faces in photos.

"If we can make vision work reliably, speech work reliably and gesture work reliably, then people designing things like TVs, coffee machines or any of the Internet of Things gadgets will have a range of interaction possibilities," said Andrew Fitzgibbon, a principal researcher with thecomputer vision group at the UK lab.

Fitzgibbon and his fellow researchers believe that, for this vision to come to fruition, the technology must track hand motion precisely and accurately while using as little computing power as possible, so that people can use their hands as they normally would and consumer gadgets can respond accordingly.

Microsoft’s computer vision team has developed a project called Handpose, which combines breakthroughs in methods for tracking hand movement with an algorithm that dates back to the 1940s, when computing power was far less available and far more expensive. The system tracks hands quickly and accurately, and can run on a regular consumer gadget.
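The article does not name the 1940s-era algorithm, and the sketch below is not Microsoft’s implementation. It is only a minimal illustration of the general idea of fitting a parametric hand model to camera observations with a classic least-squares optimizer (here a Levenberg-Marquardt-style solver via scipy.optimize.least_squares); the four-parameter "pose" and the toy keypoints are made up for the example and stand in for a real hand model and real depth data.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy "hand model": pose = [tx, ty, tz, scale]; it maps a fixed set of
# canonical keypoints (e.g., knuckle positions) into camera space.
CANONICAL_KEYPOINTS = np.array([
    [ 0.00, 0.000, 0.0],
    [ 0.02, 0.080, 0.0],
    [ 0.00, 0.090, 0.0],
    [-0.02, 0.085, 0.0],
    [-0.04, 0.070, 0.0],
])  # meters, illustrative values only

def predict_keypoints(pose):
    """Apply the pose parameters to the canonical keypoints."""
    tx, ty, tz, scale = pose
    return CANONICAL_KEYPOINTS * scale + np.array([tx, ty, tz])

def residuals(pose, observed):
    """Difference between predicted and observed keypoints, flattened."""
    return (predict_keypoints(pose) - observed).ravel()

# Pretend these observations came from a depth camera's hand detector.
true_pose = np.array([0.10, -0.05, 0.60, 1.1])
observed = predict_keypoints(true_pose) + np.random.normal(0, 0.001, (5, 3))

# Levenberg-Marquardt least-squares fit, starting from a rough guess.
fit = least_squares(residuals, x0=[0.0, 0.0, 0.5, 1.0],
                    args=(observed,), method="lm")
print("estimated pose:", fit.x)
```

The appeal of this kind of solver is exactly what the article hints at: it was designed in an era of scarce computing power, so it converges in a handful of cheap iterations, which is what lets tracking run in real time on consumer hardware.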

"We’re getting to the point that the accuracy is such that the user can start to feel like the avatar hand is their real hand," Shotton said.

Additionally, researchers and engineers in Microsoft’s Advanced Technologies Lab in Israel are investigating ways in which developers could create tools that would enable people to communicate with their computer, utilizing the same types of hand gestures used in everyday life. The goal of the project, called Project Prague, is to provide developers with basic hand gestures, such as one that switches a computer off. The system utilizes machine learning to train systems to recognize motions, and runs using a retail 3D camera.
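Microsoft has not published Project Prague’s internals here, so the following is only a rough, assumed illustration of the workflow the article describes: training a machine-learning classifier to map hand data from an off-the-shelf 3D camera to gesture labels. The keypoint layout, the gesture labels, the random placeholder data, and the choice of a random-forest classifier are all assumptions made for the sake of the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Illustrative only: each sample is a flattened set of 21 hand keypoints
# (x, y, z), as a depth-camera SDK might report them, labeled with a gesture.
rng = np.random.default_rng(0)
n_samples, n_keypoints = 600, 21
X = rng.normal(size=(n_samples, n_keypoints * 3))
y = rng.integers(0, 3, size=n_samples)  # e.g., 0=open hand, 1=fist, 2="switch off"

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Train a simple classifier to map keypoint features to gesture labels.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

With random placeholder data the held-out accuracy will sit near chance; the point is only the shape of the train-then-classify pipeline, in which recordings of real gestures captured by a 3D camera would replace X and y.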

Learn more about Handpose and Project Prague on the Microsoft blog.

Share your vision-related news by contacting James Carroll, Senior Web Editor, Vision Systems Design



About the Author

James Carroll

Former VSD Editor James Carroll joined the team in 2013. Carroll covered machine vision and imaging from numerous angles, including application stories, industry news, market updates, and new products. In addition to writing and editing articles, Carroll managed the Innovators Awards program and webcasts.

