LiDAR Is the iPhone 12 Pro’s Secret Weapon

Without the right app and task, the Apple iPhone 12 Pro’s LiDAR scanner is little more than an oddity

The LiDAR sensor is the smallest black circle. Photo: Lance Ulanoff

New technology is nothing but an abstraction until you use it. For most people, the iPhone 12 Pro features just such an abstraction: LiDAR, or Light Detection and Ranging. Apple started adding these sophisticated sensors to its iPad Pro line last spring and then to its iPhone 12 Pro line in October.

My experience with LiDAR goes back to 2004, the first time I saw self-driving car technology at Carnegie Mellon University. Researchers there were proudly displaying the car that “won” the DARPA Grand Challenge, a remote, 142-mile desert course (no one actually finished the race, but CMU’s Red Team went the farthest). It was at CMU that roboticists explained to me how the car used, among other sensors, LiDAR, which bounces laser light off surfaces and measures the time the light takes to return to the scanner. From those measurements it builds an accurate 3D rendering of the surrounding space and, in the CMU vehicle’s case, a virtual map to navigate by.
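The math behind that “time of flight” trick is simple enough to fit in a few lines. Light covers roughly 30 centimeters per nanosecond, so the distance to a surface is just the speed of light multiplied by half the round-trip time. Here’s a small illustrative sketch (my own code, not Apple’s or CMU’s):

```swift
import Foundation

// Speed of light in meters per second.
let speedOfLight = 299_792_458.0

// Distance to a surface from a single LiDAR pulse: the light travels
// out and back, so use half of the measured round-trip time.
func distance(fromRoundTripTime seconds: Double) -> Double {
    speedOfLight * seconds / 2.0
}

// A pulse that returns after 20 nanoseconds implies a surface about 3 meters away.
print(distance(fromRoundTripTime: 20e-9)) // ≈ 2.998 m
```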

Since then, I’ve seen LiDAR on a variety of vehicles including, once, an Apple Maps car that, I’m guessing, was using it to create precise visual maps of city streets.

I’ve noticed, however, that despite its presence on the iPhone 12 Pro, Apple doesn’t mention this expensive piece of hardware in iPhone 12 Pro commercials. Tech media reviewers know that the LiDAR scanner helps the iPhone 12 Pro and Pro Max focus in low-light situations. Basically, it can see in the dark, making Portrait mode photography possible at night and in low light.

During the iPhone 12 unveiling last fall, Apple said the scanner would take augmented reality (AR) to the next level by building room scans and allowing for precise placement of AR objects.

When consumers think of AR, though, it’s mostly cute Snapchat filters that wrap to the contours of their faces and, perhaps, games that let them place realistic dinosaurs or Lego virtually in their homes.

Fun, but not necessarily useful. LiDAR, however, has the potential to be so much more than a toy.

Recently, my wife and I redid our den, tearing up the old flooring, repairing the walls, and painting everything white or slate blue. We also threw out all our old furniture and have been struggling ever since to find the right replacement pieces.

On a recent shopping trip, we finally found a loveseat that we thought fit the color, size, style, and fabric we wanted. We pulled off the throw pillows the salespeople had piled on the furniture, and my wife started taking iPhone pictures. As she did, I remembered the iPhone 12 Pro and its LiDAR scanner in my pocket.

When I first got the phone, I searched for apps that took advantage of the powerful scanner and found the simply named 3D Scanner App. My first experience with it, trying to scan my poorly lit living room, was underwhelming, but I thought it might be useful here in the showroom. I opened the app and chose a “Low Res” scan. The results were terrible. I switched to “High Res” and, after hitting the record button, started moving around the loveseat, capturing as many of its contours as I could; the scan appeared on screen as a sort of purple mesh draped over every surface of the loveseat and the showroom.

The 3D Scanner app scanning screen. Credit: 3D Scanner App
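I have no inside knowledge of how 3D Scanner App is built, but on LiDAR-equipped iPhones this kind of live mesh capture is exposed to developers through ARKit’s scene reconstruction, so something roughly like the sketch below is presumably happening under the hood (the class and variable names here are my own):

```swift
import UIKit
import ARKit
import RealityKit

final class ScanViewController: UIViewController, ARSessionDelegate {
    let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)
        arView.session.delegate = self

        // Scene reconstruction requires a LiDAR-equipped device.
        guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else { return }

        let config = ARWorldTrackingConfiguration()
        config.sceneReconstruction = .mesh   // build a triangle mesh of the room
        arView.session.run(config)

        // Draw the captured mesh as a wireframe overlay, much like the
        // purple mesh on the app's scanning screen.
        arView.debugOptions.insert(.showSceneUnderstanding)
    }

    // ARKit delivers the growing scan as ARMeshAnchor updates.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let meshAnchor as ARMeshAnchor in anchors {
            print("mesh chunk with \(meshAnchor.geometry.vertices.count) vertices")
        }
    }
}
```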

As my wife watched with increasing impatience, I walked in front of and around the loveseat, at one point blocking the way of other shoppers, who laughed off my oddball activity while my wife gritted her teeth and told me to move it.

I tapped the record button again and the app began calculating. If you ever want to tax the iPhone 12 Pro’s powerful A14 Bionic chip, create a complex 3D scan. To be fair, the scan consisted of 347,000 vertices and 116,000 faces. It took almost 30 seconds to render and add the proper texture (a faithful recreation of the furniture’s fabric and color), but by the end, I had a realistic 3D rendering of the loveseat that I could spin and resize.

I noticed that the scan also included bits of the end tables beside the loveseat, as well as the floor below and the wall behind it. I started digging into the 3D Scanner App and found “Edit Scan” hidden under “More” (evidence that this is a powerful, if somewhat poorly designed, app).

The editor on the left and my textured scan on the right. Credit: 3D Scanner App

The editor put my scan on a 3D plane that reminded me of a virtual set from the movie Tron: a light gray grid on a sea of darker gray, with blue and red x- and y-axes running through the center of my scan. After some noodling around, I realized I could use the editor to crop the original scan along the x-, y-, and z-axes to remove extraneous material, including the wall, floor, and end tables. I repeated this process with a few other loveseat candidates.
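Under the hood, that kind of crop is conceptually simple: the scan is ultimately a cloud of vertices, and trimming along the x-, y-, and z-axes just means keeping the points that fall inside an axis-aligned box. A rough sketch of the idea (my own illustrative code, not how 3D Scanner App actually does it):

```swift
import simd

/// Keep only the vertices that fall inside an axis-aligned bounding box,
/// dropping the floor, walls, and neighboring furniture caught in the scan.
func crop(vertices: [SIMD3<Float>],
          min lower: SIMD3<Float>,
          max upper: SIMD3<Float>) -> [SIMD3<Float>] {
    vertices.filter { v in
        v.x >= lower.x && v.x <= upper.x &&
        v.y >= lower.y && v.y <= upper.y &&
        v.z >= lower.z && v.z <= upper.z
    }
}

// Example: a point on the loveseat survives; a point on the wall a few
// meters away gets cropped out.
let points: [SIMD3<Float>] = [SIMD3(0.2, 0.5, 0.1), SIMD3(3.0, 0.5, 0.1)]
let loveseatOnly = crop(vertices: points,
                        min: SIMD3(-0.8, 0.0, -0.45),
                        max: SIMD3(0.8, 1.0, 0.45))
print(loveseatOnly.count) // 1
```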

At home, I opened the app and used the AR button to place my scans in position in the den. A readout above the AR loveseat showed me if I inadvertently scaled the couch above or below 100% (it would be pointless to preview a miniature or oversized loveseat in the room). When I showed my wife how she could not only see the loveseat in place but also walk around it, she was impressed. (I know this because she shared the news with my mother-in-law.)

It looks pretty good in this space. Credit: 3D Scanner App
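I don’t know exactly how the app implements its AR placement, but if you export a scan as a USDZ model, placing it at true size is straightforward with Apple’s RealityKit: load the model, anchor it to a horizontal plane, and leave its scale at 1.0, which is that 100% readout. A minimal sketch under those assumptions (“loveseat” is a made-up file name):

```swift
import RealityKit

// Somewhere in an AR view controller, with `arView` already on screen...
func placeScan(in arView: ARView) {
    // "loveseat.usdz" is a hypothetical export of the showroom scan.
    guard let loveseat = try? Entity.loadModel(named: "loveseat") else { return }

    // A scale of 1.0 means 100%: the virtual couch matches its real-world size.
    loveseat.scale = SIMD3<Float>(repeating: 1.0)

    // Anchor the model to a horizontal plane ARKit finds (the floor).
    let anchor = AnchorEntity(plane: .horizontal)
    anchor.addChild(loveseat)
    arView.scene.addAnchor(anchor)
}
```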

The app let me rotate and place the couch anywhere in the room, and I noticed how the 3D model accurately reflected my real den’s lighting conditions. The app also makes use of Apple’s ARKit people occlusion, which meant my wife could walk in front of the virtual couch and pretend to sit on it. I even took the AR loveseat and placed it on top of my living room couch to compare their sizes.

My wife sitting on her virtual seat. Credit: 3D Scanner App
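People occlusion, by the way, is something an app opts into rather than builds from scratch. Assuming a standard ARKit world-tracking session, turning it on looks roughly like this (a sketch, not the app’s actual code):

```swift
import ARKit
import RealityKit

func enablePeopleOcclusion(on arView: ARView) {
    let config = ARWorldTrackingConfiguration()

    // Ask ARKit to segment people and use depth so a real person can stand
    // in front of (and appear to occlude) the virtual loveseat.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        config.frameSemantics.insert(.personSegmentationWithDepth)
    }

    arView.session.run(config)
}
```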

I know there are apps, like IKEA Place, that already let you place AR furniture in your home, but I wasn’t shopping there, and not every retailer has an app or the resources to create 3D renders. A pocket LiDAR scanner like the one on the iPhone 12 Pro, plus the right app, means they don’t need to. Sure, the process is still a bit DIY, but it wasn’t that hard to figure out, and even my wife admits the results are kind of amazing.

Now, how does Apple put that in a commercial?

