The new iPad Pro’s LIDAR sensor is an AR hardware solution in search of software – The Verge
Apple announced a new iPad Pro yesterday, and one of the biggest additions was a new LIDAR system on the rear camera, which Apple argued was the missing piece for revolutionary augmented reality applications.
LIDAR — which stands for “Light Detection and Ranging” — isn’t a new technology. Driverless cars in particular have been relying on the laser sensors for years to detect objects and build 3D maps of their surroundings in near real time as a way of “seeing” other cars, trees, and roads.
Apple’s miniaturized scanner isn’t quite at that level, but the company says that it’ll be able to measure the distance to objects up to five meters (about 16 feet) away. It claims that by combining the depth information from the LIDAR scanner with camera data, motion sensors, and computer vision algorithms, the new iPad Pro will be faster and better at placing AR objects and tracking the location of people.
The new sensor is Apple’s latest attempt to make AR a key part of its apps and software, an effort the company has been working on since at least 2017, when it first introduced its ARKit platform for developing augmented reality iOS apps.
Since then, barely an iOS update or iPhone launch has gone by without some sort of overhyped AR demo, whether it be Minecraft, a multiplayer game, or a cooperative Lego experience.
And with each announcement of software updates or improved processors, cameras, or graphics engines has come the implicit promise: now is the time that augmented reality apps will really take off.
But none of that changes the fact that, right now, there still aren’t a lot of compelling reasons to actually use augmented reality apps on a mobile device beyond the cool, tech-demo-y purposes that already exist. AR apps on iOS today are a thing you try out once, marvel at how novel an idea it is, and move on — they’re not essential parts of how we use our phones. And nearly three years into Apple’s push for AR, there’s still no killer app that makes the case for why customers — or developers — should care.
Maybe the LIDAR sensor really is the missing piece of the puzzle. Apple certainly has a few impressive tech demos showing off applications of the sensor, like its Apple Arcade Hot Lava game, which can use the depth data to more quickly and accurately model a living room and generate the gameplay surface. There’s a CAD app that can scan a room and build a 3D model of it to show how additions will look. Another demo promises accurate measurements of your arm’s range of motion.
The fact that Apple is debuting the sensor on an iPad doesn’t help the case, either. While Apple has been rumored to be working on a proper augmented reality headset or glasses for years — the kind of product that could make augmented digital overlays a seamless part of your day-to-day life — the iPad (in 11-inch and 12.9-inch sizes) is effectively the opposite of that idea. It’s the same awkwardness as the man who holds up an iPad to film an entire concert; holding a hardcover-book-sized display in front of your face for the entire time you’re using it just isn’t a very natural use case.
It’s possible that Apple is just laying the groundwork here, and more portable LIDAR-equipped AR devices (like a new iPhone or even a head-mounted display) are on their way in the future. Maybe the LIDAR sensor is the key to making more immersive, faster, and better augmented reality apps. Apple might be right, and the next wave of AR apps really will turn the gimmicks into a critical part of day-to-day life.
But right now, it’s hard not to look at Apple’s LIDAR-based AR push as another hardware feature looking for the software to justify it.