AR may be about to get its killer application
Editor’s note: This article comes from the WeChat public account “qubit” (ID: QbitAI), author: Xiao Cha.
Just this morning, Apple released the iOS/iPadOS 13.4 update. Alongside the routine improvements, there is a seemingly inconspicuous addition: ARKit 3.5.
If you followed the iPad Pro released last week, you have probably guessed that ARKit 3.5 exists to support its lidar.
Since 2017, Apple has been promoting AR (augmented reality), but only keynote demos such as Minecraft left much of an impression, and few developers followed up. Magic Leap, which once developed AR glasses, is reportedly on the verge of being sold.
AR still seems far from large-scale practical use, but the lidar on the new iPad Pro may change that.
About ARKit 3.5
In today’s ARKit 3.5 update, Apple added a new “Scene Geometry” API that lets developers use the lidar scanner on the iPad Pro to build a 3D map of a room, identifying floors, walls, ceilings, tables, chairs, and sofas.
It can accurately measure the length, width, and height of objects within 5 meters, perceive the orientation of each surface, and even work out occlusion relationships between objects, so that digital content embedded in the camera view blends better with the real scene.
This brings a qualitative leap to the Measure app that ships with iOS.
Previous iPads and iPhones used the camera’s depth estimation to sense distance, so measurements did not contain real 3D information.
Anyone who has used Measure knows the problem: when measuring the length of a table, for example, it is hard to place the start and end points exactly on the table’s edge, which compromises accuracy.
Thanks to lidar, the Measure app on the iPad Pro behaves very differently from the one on other Apple devices.
When the iPad Pro reaches the end of an object being measured, such as a table, it automatically recognizes and snaps to the edge.
△ Image from Weibus’s video
And if you move closer, more detail appears on screen, such as a ruler view that shows up automatically.
Digital reviewer Weibus gives a detailed look at the iPad Pro’s lidar in his review video:
Although Apple didn’t mention it specifically, lidar should also improve the camera: in portrait mode, for instance, background blur should become more accurate.
Lidar on the iPad
Lidar is already widely used in self-driving cars for 3D imaging of the surroundings. Apple’s secret self-driving effort, Project Titan, has used lidar as well.
A research paper Apple published in 2017 details a 3D object recognition system for autonomous vehicles. The system feeds lidar depth maps into a neural network, greatly improving the vehicle’s ability to recognize its environment.
Apple’s self-driving car plan has since stalled, but its lidar work is now finding a home in mobile devices.
So what’s so special about the lidar on the iPad Pro?
The lidar on the iPad Pro is essentially a time-of-flight (ToF) sensor. ToF sensors are already used by major phone manufacturers for camera focusing.
But the ToF Apple uses differs from other manufacturers’: it is a direct ToF (dToF) sensor that genuinely calculates distance from the time of flight.
Other manufacturers instead judge distance from the phase difference between the emitted light and the reflected light.
It calculates distance from the round-trip time of the reflected photons. This requires operating at nanosecond (billionth-of-a-second) speeds, and it yields higher accuracy and lower power consumption. It is this sensor that makes these AR applications possible.
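The two approaches can be sketched with textbook physics: direct ToF turns a round-trip time into a distance, while phase-based (indirect) ToF recovers the same distance from the phase shift of a modulated signal. This is a minimal illustration of the principle only; the figures are generic physics, not Apple’s or any vendor’s actual sensor specifications.

```python
import math

C = 299_792_458  # speed of light, m/s

def dtof_distance(round_trip_s: float) -> float:
    """Direct ToF: the light covers the path twice, so halve it."""
    return C * round_trip_s / 2

def itof_distance(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """Indirect ToF: distance from the phase shift of a modulated wave."""
    return C * phase_shift_rad / (4 * math.pi * mod_freq_hz)

# dToF: a target 5 m away (the iPad Pro's stated range) returns the
# pulse in roughly 33 nanoseconds -- hence the nanosecond-level timing.
t = 2 * 5.0 / C
print(f"{t * 1e9:.2f} ns")  # ~33.36 ns

# iToF: with an assumed 20 MHz modulation frequency, the same 5 m
# target shifts the returning wave's phase by about 4.19 radians.
phase = 4 * math.pi * 20e6 * 5.0 / C
print(f"{itof_distance(phase, 20e6):.2f} m")
```

The tiny round-trip time is exactly why dToF hardware needs nanosecond-scale timing, while iToF trades that timing requirement for phase measurement (at the cost of range ambiguity once the phase wraps past 2π).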
Taking photos with an iPad looks rather clumsy, but this year’s iPhone may also be equipped with lidar. And if Apple’s AR glasses plan comes to fruition, those glasses will surely benefit from this sensor as well.
Encouragingly, the overseas outlet AppleInsider reports that iOS 14 code indicates this year’s iPhone will add lidar. The triple-camera arrangement on the back of the iPhone, however, doesn’t seem to leave enough room for it. Will the iPhone’s camera layout have to change?
Reference links
Apple updates ARKit 3.5: https://venturebeat.com/2020/03/24/apple-releases-arkit-3-5-adding-scene-geometry-api-and-lidar-support/
Weibus’s review of the new iPad Pro (thanks to Weibus for permission to repost): https://mp.weixin.qq.com/s/_23UXcD4KogAZBlEAJbV-g
The author is a contracted writer for NetEase News · NetEase’s “Everyone’s Attitude” program.