Why iPhone 12’s LiDAR Scanner Will Be a Game Changer for Augmented Reality Apps
By Tammy Slaughter
The highly anticipated iPhone 12, set to make its debut this fall, has been generating early interest due to one potential new feature: the LiDAR scanner.
Leaked designs show the much-discussed tri-camera setup joined by a fourth sensor designed to accurately measure real-world depth, drastically improving performance in augmented reality apps.
This new ‘time-of-flight sensor’ should be a complete game changer for the iPhone 12 models and beyond.
The LiDAR scanner can measure the exact distance to surrounding objects up to 5 meters away at nanosecond speeds. Combined with the same computer vision algorithms found in the iPad Pro’s A12Z Bionic processor, this should give iPhone users a much more detailed understanding of a scene and drastically reduce the setup time normally required in AR apps.
Finally, it would seem that augmented reality is about to have its “iPhone moment” (if these leaked designs have anything to say about it)!
The Latest iPad Pro Featured a New LiDAR Sensor, Too
So, it would come as little surprise if Apple did reveal another LiDAR sensor on the iPhone 12.
It is true that, in some ways, augmented reality is not a technology most people are pining for, or are even very familiar with just yet. You can chalk this up to the high cost of even entry-level VR technology today, and to the fact that its performance has traditionally been hamstrung by the 4G LTE network (enter 5G, stage left).
But the fact that it’s one of the headlining features of the new iPad Pro could mean that’s about to change. These dedicated AR sensors may be laying the foundation for what Apple has planned for the iPhone and beyond.
Empowering App Developers To Go All-In On AR
It’s pretty rare for Apple to drop any hints about what they’re interested in as a company before making an official announcement.
But CEO Tim Cook has been talking up augmented reality for years — long before Apple even announced ARKit, the framework it launched in 2017 for developers looking to build AR-enabled apps for the iPhone and iPad.
Apple is launching ARKit 3.5 with a new tool called the Scene Geometry API to help developers take advantage of the iPad Pro’s new LiDAR sensor. It will empower apps to create a 3D map of a space, differentiating between floors, walls, ceilings, windows, doors, and seats. This will enable apps to quickly create a digital facsimile of a room for object occlusion — making digital objects appear to slot into a scene partially behind real objects.
Thanks to “Instant AR” support, those digital objects can be placed automatically within a space, without users needing to wave the tablet around and give its cameras the parameters of the space.
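To make this concrete, here is a minimal sketch of how a developer might opt into the new scene reconstruction in ARKit 3.5. This is an illustrative example, not Apple’s sample code; it assumes a LiDAR-equipped device running iOS/iPadOS 13.4 or later, and the class and view names are ours.

```swift
import UIKit
import ARKit
import RealityKit

// A minimal sketch: enable LiDAR-backed scene reconstruction so ARKit
// builds a classified 3D mesh of the room (walls, floors, seats, etc.).
class ScanViewController: UIViewController, ARSessionDelegate {
    let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)
        arView.session.delegate = self

        let config = ARWorldTrackingConfiguration()
        // Scene reconstruction is only available on LiDAR-equipped
        // devices, so check for support before enabling it.
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
            config.sceneReconstruction = .meshWithClassification
        }
        // Let real-world surfaces occlude virtual content.
        arView.environment.sceneUnderstanding.options.insert(.occlusion)
        arView.session.run(config)
    }

    // ARKit delivers the reconstructed geometry as ARMeshAnchors; with
    // classification enabled, each face is tagged (wall, floor, ceiling,
    // door, window, seat, ...), which is what the Scene Geometry API
    // uses to differentiate the parts of a space.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let meshAnchor as ARMeshAnchor in anchors {
            print("New mesh anchor with \(meshAnchor.geometry.faces.count) faces")
        }
    }
}
```

Note how little code is involved: the heavy lifting (meshing, classification, occlusion) happens inside ARKit, which is why existing AR apps can benefit from the sensor without changes.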
The idea is that a furniture app like IKEA’s should be able to better understand the dimensions of the space next to your couch, so it could more easily suggest decor that fits within that space. The new sensor will also help apps more easily and quickly calculate a person’s height.
The new LiDAR scanner will enable more accurate 3-axis measurements.
The best part? It will automatically benefit previously developed apps, without the need for any code changes.
At its core, the new LiDAR sensor is all about making augmented reality apps smarter, and we are here for that at Chop Dawg.
Learning from Google’s Past VR Mistakes
Apple is showing some caution (and savvy) in its rollout of the LiDAR sensor by releasing it on the iPad Pro before bringing it to the iPhone — arguably its flagship product.
This will help app users gain a better understanding of what LiDAR does and why it matters to them.
Even more importantly, it gives developers time to create incredible new apps harnessing this technology before bringing the feature to the iPhone.
According to Business Insider:
Apple’s decision to launch AR hardware years after it debuted ARKit makes a lot of sense, and could potentially help it succeed where Google has failed in the past when it comes to bringing augmented reality to smartphones.
Google shut down its Tango project — which brought augmented reality tech to a limited number of smartphones through specialized cameras and sensors — in 2018 as it transitioned to ARCore, a toolset that lets developers bring AR apps to existing Android phones without specific hardware.
Ultimately, it makes more sense to bring AR apps to phones users already own, rather than requiring users to purchase a specific smartphone to get those capabilities.
This move will also lessen the cost of entry to AR technology for the everyday app user, and allow it to grow exponentially.
Apple seems to have followed this approach with ARKit, and it went one step further by waiting until AR apps had made their way to the iPhone before launching specialized hardware for AR — giving developers a few years to learn how to create AR apps that resonate with iPhone users first.
More Mixed Reality On The Horizon For iOS Apps
Apple’s new iPad Pro is its first gadget built around augmented reality, and the first sign that Apple is creating dedicated hardware for high-quality AR experiences. That tells us a lot about what’s next for Apple’s product line.
This could potentially be laying the groundwork for future products long-rumored to be in the company’s pipeline, like an Apple VR headset and AR smart glasses.
What appreneurs, companies, and developers decide to do with that LiDAR scanner moving forward will be incredibly important, especially if Apple truly is planning to bring a similar LiDAR sensor to its 2020 iPhones (as reports and rumors have so far suggested).
We’ve been on the verge of an augmented reality boom for years, and 2020 is shaping up to be the year it finally happens.
According to Business Insider, the International Data Corporation reported last November that worldwide spending on augmented and virtual reality solutions will reach $18.8 billion in 2020, a notable 79% increase from the $10.5 billion previously projected for 2019.
During the initial ramp-up period, that growth will be led by the commercial sector, with retail and manufacturing expected to outspend the rest.
But the new iPad Pro could be instrumental in starting to demonstrate the true value of AR and VR to the everyday user, too.
Of course, Covid-19 could lead to a delayed Apple event this year.
Apple reveals its iPhones every fall, usually around September. But the event could be delayed this year if the design isn’t finished in time, or if there are potential stock shortages.
There’s nothing definite happening on that front right now, so hopefully Apple can still reveal its next-generation iPhone 12 on time, despite the chaos currently being caused globally by the coronavirus pandemic.
Keep your eyes peeled on that front, and we will be sure to update you on the blog once this long-awaited feature has been officially announced!
Update as of 4/27 – The Verge has reported Apple is running about a month behind on production for the iPhone 12. No word yet on what impact this will have on the rollout this fall.
Update as of 6/4 – Forbes has reported the iPhone 12’s production is about 4–6 weeks behind the usual schedule. There are reports the supply chain is gearing up to start manufacturing the iPhone 12 this July.
About ChopDawg.com: Since 2009, we have helped create 300+ next-generation apps for startups, Fortune 500s, growing businesses, and non-profits from around the globe. Think Partner, Not Agency.