Apple patents AR touch detection using depth-mapping cameras and ML

As the iPhone and iPad have amply demonstrated, much of Apple’s current hardware depends on accurate detection of direct touch inputs — a finger resting against a screen or, in the Mac’s case, on a trackpad. But as people come to rely on augmented reality for work and entertainment, they’ll need to interact with digital objects that aren’t equipped with physical touch sensors, so Apple has today patented a key technique to detect touches using depth-mapping cameras and machine learning.

By patent standards, Apple’s depth-based touch detection system is fairly straightforward: External cameras work together in a live environment to create a three-dimensional depth map, measuring the distance of an object — say, a finger — from a touchable surface, then determining when the object touches the surface. Critically, the distance measurement is designed to be usable even when the cameras change position, relying in part on training from a machine learning model to discern touch inputs.
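To make the core idea concrete, here is a minimal sketch of depth-based touch detection — not Apple’s implementation, just the general approach the patent describes: estimate a fingertip’s position from a depth map, measure its distance to a known surface plane, and infer a touch when that distance falls below a small threshold. The function names, threshold value, and coordinate conventions are illustrative assumptions.

```python
# Illustrative sketch of depth-based touch detection (not Apple's actual method).
# Assumes the cameras have already produced a 3D fingertip position and that the
# touchable surface has been fit to a plane; both names below are hypothetical.

import numpy as np

TOUCH_THRESHOLD_MM = 10.0  # assumed distance below which contact is inferred


def point_to_plane_distance(point: np.ndarray, plane: np.ndarray) -> float:
    """Signed distance from a 3D point to a plane (a, b, c, d),
    where ax + by + cz + d = 0 and (a, b, c) is unit length."""
    normal, offset = plane[:3], plane[3]
    return float(np.dot(normal, point) + offset)


def detect_touch(fingertip_xyz: np.ndarray, surface_plane: np.ndarray) -> bool:
    """Return True when the fingertip is close enough to the surface to count
    as a touch. In the patented system, the fingertip position would come from
    the depth map and the plane from a prior reconstruction of the surface."""
    distance_mm = abs(point_to_plane_distance(fingertip_xyz, surface_plane))
    return distance_mm <= TOUCH_THRESHOLD_MM


# Example: a fingertip 4 mm above a tabletop lying in the z = 0 plane.
fingertip = np.array([120.0, 85.0, 4.0])    # millimeters, camera-derived
tabletop = np.array([0.0, 0.0, 1.0, 0.0])   # plane z = 0
print(detect_touch(fingertip, tabletop))    # True
```

Because the distance is computed in the surface’s own coordinate frame, the comparison keeps working even if the cameras move, which is where the patent leans on machine learning to keep the touch decision robust.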

Illustrations of the technique show three external cameras working together to determine the relative position of a finger, a concept that may seem somewhat familiar to users of Apple’s triple-camera iPhone 11 Pro models. Similar multi-camera arrays are expected to appear in future Apple devices, including new iPad Pros and dedicated AR glasses, enabling each to determine finger input simply by depth mapping a scene and applying ML knowledge to judge the intent of changes in the finger’s position.
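The “applying ML knowledge to judge intent” portion can be pictured as a small classifier that looks at how the fingertip-to-surface distance evolves over consecutive frames and decides whether the motion was a deliberate tap or a hover. The features, toy training data, and model choice below are assumptions for illustration only, not details taken from the patent.

```python
# Hedged illustration: classify a short window of fingertip distances as a
# deliberate touch or a hover, instead of relying on a fixed threshold alone.
# Features, training samples, and the logistic-regression model are assumptions.

import numpy as np
from sklearn.linear_model import LogisticRegression


def features(distances_mm: np.ndarray) -> np.ndarray:
    """Summarize a short window of fingertip-to-surface distances."""
    return np.array([
        distances_mm.min(),                  # how close the finger got
        distances_mm[0] - distances_mm[-1],  # net approach over the window
        np.abs(np.diff(distances_mm)).max()  # fastest single-frame change
    ])


# Toy training windows (millimeters per frame): taps end near the surface,
# hovers stay well above it.
taps = [np.array([60, 35, 12, 3, 2]), np.array([50, 28, 9, 1, 4])]
hovers = [np.array([60, 55, 50, 48, 47]), np.array([40, 42, 38, 36, 39])]

X = np.array([features(w) for w in taps + hovers])
y = np.array([1, 1, 0, 0])  # 1 = touch intended, 0 = hover

model = LogisticRegression().fit(X, y)

# Classify a new window of camera-derived distances.
window = np.array([55, 30, 10, 2, 3])
print("touch" if model.predict([features(window)])[0] else "hover")
```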

Armed with this technology, future AR glasses could eliminate the need for physical keyboards and trackpads, replacing them with digital versions that only the user can see and interact with. They could also anchor user interfaces to other surfaces, such as walls, conceivably enabling a secure elevator that could be operated, or sent to specific floors, only via AR buttons.

Patent US10,572,072 was granted to Apple today, credited to Sunnyvale-based inventors Lejing Wang and Daniel Kurz. It was originally filed at the end of September 2017 and, atypically for Apple, includes photos of actual testing of the technique, indicating that the company’s AR and depth camera research is ongoing and isn’t merely theoretical. Apple CEO Tim Cook has suggested that AR will be a major business for the company going forward, and reports have offered various timetables for the release of dedicated Apple AR glasses.
