Do Lots of Cameras Make Your iPhone or Samsung Galaxy Smartphone Any Smarter? – The National Interest Online
Remember when smartphones had digital cameras? These days, it seems more like we all carry around digital cameras that, in an emergency, can also make phone calls. Like fashion photographers on a shoot, we accumulate so many pictures that we never even look at most of them. The ones that make the cut get shrunk to one-twentieth their original size when we post them to social media. No one wants your 10-megapixel photo, unless you happen to have caught a terrorism suspect reflected in a store window 100 yards away, CSI-style.
But the smartphone camera arms race shows no signs of letting up. The 2019 iPhone 11 Pro has three rear-facing (picture-taking) cameras; the new Samsung Galaxy S20 has four. Plump for the S20 Ultra, and you can really annoy your friends by sending them your 108-megapixel photos. Well, you’ll have to send them links, since Gmail won’t even take photos over 25 megabytes. And if four cameras aren’t enough for you, the five-camera Nokia 9 has already been on the market for more than a year.
What’s so smart about all those cameras? Not much — yet.
But the combination of multiple and ultrawide cameras found on the new flagship phones is a built-in toolkit for future augmented reality (AR) applications. Most people are already familiar with virtual reality (VR), which is fast becoming standard-issue for new video games. While VR immerses users in a wrap-around computer-generated environment, AR superimposes computer-generated elements on top of real-world images.
Science fiction movies have long depicted AR as floating images stamped on lenses that can be worn like ordinary eyeglasses. A company called Mojo Vision even claims that it will soon be able to do AR on a contact lens. A primitive version of this kind of AR already exists in Google Glass and Microsoft HoloLens, which are more like wearable computers than true AR platforms. They’ve proved more useful as wearable webcams than as tools for placing their wearers into a blended reality of natural vision and computer-generated imagery (CGI).
The new tri- and quad-lens smartphones have the potential to flip the industry-standard approach to AR. Their cameras record images that are both wide and deep, much more three-dimensional than a typical snapshot. Snap one of these phones into a VR headset, and you’ll be able to walk around quite comfortably, viewing the world through your smartphone’s wide-angle, depth-perceiving cameras. The front-facing (selfie) camera could even be programmed to track your eye movements, bringing distant objects into focus as you shift your attention around the screen.
Then, instead of adding CGI on top of a natural image, these new phones will be able to insert CGI into your camera image. It will be like a VR overlay on the real world. Forget about drawing in the air with your index finger or watching teeny tiny YouTube videos out of the corner of your eye. These phones will be able to walk you through architectural drawings, displaying yet-to-be-installed walls and wiring over the raw girders of a building under construction. And you’ll be able to walk through the entire construction site without tripping over so much as a stray rivet, because you’ll be seeing everything there is to see — through your phone display.
Back when VR was no more than fantasy, people imagined that it would be delivered through some kind of yet-to-be-invented holographic technology. It turned out that a smartphone with a screen and some motion sensors was all you needed. Now we’re going to see the same kind of evolution with AR. It won’t be projected onto high-tech glasses, and it certainly won’t require advanced contact lenses or corneal implants. It will go live with a smartphone in a headset — but a smartphone with some really cool cameras.
Salvatore Babones is an adjunct scholar at the Centre for Independent Studies and an associate professor at the University of Sydney.