
Published on April 17, 2015


Tracking hand movement is far more difficult than basic skeletal tracking, but that's exactly what researchers at Microsoft are accomplishing. The system, called Handpose, could revolutionize communication, emergency safety, video games, and more. Microsoft published a blog post by Allison Linn explaining its development and potential uses.

I admit a personal interest in this topic. Some of my best friends are part of the signing community, and a device that could translate American Sign Language in real time would change people's worlds. Handpose isn't there yet, but the technology could get there someday.

Another use for Handpose is controlling a robotic hand remotely. In a video accompanying the blog post, researcher Jonathan Taylor explains how it could be used in situations too dangerous for rescuers to enter themselves.

Handpose enables more realistic hand tracking. Linn's blog post explains that "Handpose uses a camera to track a person's hand movements. The system is different from previous hand-tracking technology in that it has been designed to accommodate much more flexible setups. That lets the user do things like get up and move around a room while the camera follows everything from zig-zag motions to thumbs-up signs, in real time."

One more use discussed in Taylor's video is browsing through pages in a way similar to what's seen in 'Minority Report.' The Kinect introduced some hand gestures and swipe controls, but Handpose allows gestures as specific as pointing a finger.

This technology extends beyond real-world applications. Virtually controlled hands would add a new dimension to video games and virtual reality.

Creating such a system is extremely complicated, even compared to the Kinect's body tracking. Comparing the two, Taylor explained that "Tracking the hand, and articulation of the hand, is a much harder problem."

Hands are complicated. Having studied American Sign Language, I can say firsthand that hands move quickly and subtly, and gestures are easy to confuse. Handpose has to contend with this because, as Linn explains, "we can move them in subtle and complex ways, which can result in fingers that are hidden from the camera. Even fingers that can be seen are difficult to differentiate from each other."

Researchers developed Handpose using a combination of 3-D hand modelling and machine learning to understand hand movements.
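The article doesn't go into the algorithm itself, but the general idea behind model-fitting hand tracking can be sketched in a few lines. The toy example below (not Microsoft's actual method; all names and numbers here are illustrative) fits the joint angles of a simplified planar "finger" model to an observed fingertip position by minimizing the squared distance with numerical gradient descent:

```python
import numpy as np

def fingertip(angles, lengths):
    """Forward kinematics for a planar finger: each joint adds its
    angle to the running orientation and extends the chain by its
    segment length."""
    pos, theta = np.zeros(2), 0.0
    for a, l in zip(angles, lengths):
        theta += a
        pos = pos + l * np.array([np.cos(theta), np.sin(theta)])
    return pos

def fit_pose(target, lengths, steps=1000, lr=0.05, eps=1e-4):
    """Fit joint angles so the model fingertip matches the observed
    point, by gradient descent on the squared distance (gradients
    estimated with forward differences)."""
    angles = np.zeros(len(lengths))
    for _ in range(steps):
        base = np.sum((fingertip(angles, lengths) - target) ** 2)
        grad = np.zeros_like(angles)
        for i in range(len(angles)):
            bumped = angles.copy()
            bumped[i] += eps
            cost = np.sum((fingertip(bumped, lengths) - target) ** 2)
            grad[i] = (cost - base) / eps
        angles -= lr * grad
    return angles

# Hypothetical setup: three finger segments and one observed fingertip.
lengths = np.array([1.0, 0.8, 0.5])
target = np.array([1.2, 1.1])
angles = fit_pose(target, lengths)
err = np.linalg.norm(fingertip(angles, lengths) - target)
```

A real system like Handpose works in 3-D with a full articulated hand mesh, depth-camera input, and machine-learned initialization, but the core loop is the same shape: propose a pose, render or project the model, measure the mismatch against the observation, and adjust.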

Radu Tyrsina

Radu Tyrsina has been a Windows fan ever since he got his first PC, a Pentium III (a monster at that time).

For most kids his age, the Internet was an amazing way to play and communicate with others, but he was deeply impressed by the flow of information and how easily you could find anything on the web.

Prior to founding Windows Report, this curiosity about digital content enabled him to grow a number of sites that helped hundreds of millions of people find the answers they were looking for faster.
