Security Innovation: Shoe Recognition
By Tim Moran | Posted 2012-03-14
There’s been quite a bit of talk about facial-recognition technology, which automatically identifies an individual user by comparing selected facial features to a database of such features. It is often used as a security mechanism, but now it’s making its way into marketing.
For instance, Facebook is testing a new feature called "Tag Suggestions," which scans images uploaded by users and matches those images with existing users through facial recognition software. This can help Facebook build a detailed demographic for individuals.
But have you looked at your shoes lately? And have you considered that your footwear could be used for computer security? Researchers at the Human-Computer Interaction group at Germany’s Hasso Plattner Institute have—and, as a result, they’ve created Bootstrapper. They believe that, instead of focusing on the face, they can zero in on shoes to identify users of touch-based tabletop computers, such as Microsoft’s Surface.
According to the research team: “While users are interacting with the table, Bootstrapper observes their shoes using one or more depth cameras mounted to the edge of the table. It then identifies users by matching camera images with a database of known shoe images.”
Using a prototype built with a Kinect depth camera, the German group explains that Bootstrapper matches each touch on the table to a shoe based on the orientation of the touching hand. This works, they say, because shoes offer large, distinct features, such as color, and because shoes naturally align with the ground, giving the system a well-defined perspective that reduces ambiguity. In a small study, they report that Bootstrapper recognized participants from a database of 18 users with 89 percent accuracy.
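The core matching idea—comparing an observed shoe against a database of known shoe images and returning the best-scoring user—can be sketched as a simple nearest-neighbor lookup. This is purely illustrative, not the researchers’ code: the feature vectors, the cosine-similarity metric, and every function name below are assumptions standing in for whatever image features and matcher Bootstrapper actually uses.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors (stand-in metric)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def identify_user(observed, shoe_db, threshold=0.9):
    """Return the user whose enrolled shoe features best match the
    observed features, or None if no match clears the threshold."""
    best_user, best_score = None, threshold
    for user, features in shoe_db.items():
        score = cosine_similarity(observed, features)
        if score > best_score:
            best_user, best_score = user, score
    return best_user

# Hypothetical enrollment database: one toy feature vector per user
# (a real system would extract features from depth-camera images).
shoe_db = {"alice": [0.9, 0.1, 0.0], "bob": [0.1, 0.8, 0.3]}
print(identify_user([0.88, 0.12, 0.02], shoe_db))  # best match: alice
```

The threshold is what lets such a system reject unknown shoes rather than always naming its closest enrolled user—a necessity for any recognizer used as a security mechanism.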