This article is from the WeChat public account Intelligent Things (ID: zhidxcom). Authors: Lee Green Water, Cold Dawn

Not long ago, Apple unveiled its new spring products directly on its official website. Without the long-rumored iPhone 9, the lineup consists mostly of routine upgrades to the iPad Pro, Mac mini, and MacBook Air families.

But one product deserves our attention: the 2020 iPad Pro, not because it adds a “dual camera”, but because of the “LiDAR” component it introduces for the first time. Apple’s official website devotes considerable space to this part and presents it as an important step toward revolutionary AR applications. New technologies that Apple has adopted aggressively in the past have repeatedly changed the entire industry: Touch ID, the Face ID structured-light module, the SiP packaging of AirPods, and so on. For this reason, it is worth taking a deep dive, from the technology itself to its applications and supply chain, into the mystery behind this LiDAR.

Apple’s official website introduction of the iPad Pro lidar component

Over the past two years, “LiDAR” has mostly been mentioned in the field of autonomous driving, so when news broke that Apple had put LiDAR into the iPad, many people wondered: isn’t LiDAR that “big thing” mostly used on self-driving cars and robot vacuums? Is Apple hijacking the concept?

In fact, it is not the same thing. In conversations with many industry practitioners, I learned that behind the iPad Pro’s LiDAR is still a ToF (Time of Flight) 3D vision solution, something Android makers such as Huawei, Samsung, OPPO, and vivo have already played with. Huawei’s smartphones, for example, have used ToF cameras to enable touch-free air gestures. However, the ToF camera technology in the iPad Pro is very different from what these manufacturers use.

Note that the ToF technology in the Apple iPad Pro carries the prefix “d”: dToF. With dToF (direct time-of-flight measurement), the iPad Pro is expected to lift the accuracy and smoothness of AR applications to a new level while greatly reducing power consumption. Some say Apple may be the first consumer hardware company to commercialize dToF technology.

Using the iPad Pro’s LiDAR for augmented reality

It is understood that the dToF LiDAR on the iPad Pro is just the beginning; the iPhone may adopt it soon, quite possibly this year. As early as two years ago, Apple reportedly asked engineers to study a rear three-dimensional depth camera.

Since first launching ARKit in 2017, Apple has been exploring AR applications on mobile devices. The addition of dToF LiDAR is another major hardware innovation around AR, following the structured-light camera module. How exactly is Apple’s approach different, and what technologies and supply chains are involved? Through conversations with many practitioners, let’s see what is really going on inside the iPad Pro’s LiDAR, and what AR ambitions Apple has behind it.

Is the LiDAR on the iPad Pro an old selling point or a real change?

When it comes to LiDAR, many people think of self-driving cars. LiDAR builds a close-to-real 3D map of the surroundings so that cars can “see” other vehicles, trees, and the road.

Mark, co-founder and COO of RoboSense, a company in the autonomous driving field, tells Zhiwu that the two are different things. Automotive LiDAR is mainly about guaranteeing safety: it has a long detection range and demanding performance requirements. Compared with automotive LiDAR units the size of a men’s wallet, the LiDAR on the iPad Pro only needs to sense a few meters, is held to different detection standards, and costs far less.

According to Mark’s speculation, the LiDAR on the back of the new iPad Pro will most likely use a Flash design. Solid-state LiDAR has roughly settled into three technical routes: MEMS (micro-electromechanical systems), OPA (optical phased array), and Flash. Broadly speaking, MEMS and OPA are scanning approaches, while Flash is non-scanning. Put simply, a Flash LiDAR emits a wide “wall” of light all at once and obtains depth information through the time-of-flight method; in a sense it is similar to a night-vision camera, except that it actively emits its own light source, as sketched below.
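To make the principle concrete, here is a minimal, purely illustrative sketch (not Apple’s implementation) of how a flash ToF sensor could turn per-pixel round-trip times into a depth map; the function name and sample values are made up for illustration.

```python
# Illustrative only: turning per-pixel round-trip times from a flash ToF
# sensor into a depth map, using distance = speed_of_light * time / 2.
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def depth_map_from_round_trip_times(t_round_trip_s: np.ndarray) -> np.ndarray:
    """Each pixel records how long its light pulse took to reach the scene
    and bounce back; the one-way distance is half of that round trip."""
    return C * t_round_trip_s / 2.0

# Example: a 4x4 patch where every pixel sees a ~20 ns round trip (~3 m away)
times = np.full((4, 4), 20e-9)
print(depth_map_from_round_trip_times(times))  # ≈ 2.998 m at every pixel
```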

Since there is no mechanical “scanning” action, why does Apple’s official website talk about a “LiDAR Scanner”? Unlike the other two technical routes, Flash does not rely on moving mechanical parts; electronic components control the laser emission with digital signals. The “scanning” in Apple’s promotional pages therefore more likely describes the perception process figuratively rather than a mechanical operation.

So what is the difference between Apple’s LiDAR and the ToF depth cameras launched by Huawei, OPPO, vivo, Samsung, Xiaomi, and others? In fact, LiDAR refers to the sensor hardware, while ToF refers to the laser-ranging method; the two terms point to the same thing here.

ToF module for OPPO R17 Pro

ToF (Time of Flight) is an umbrella term for techniques that calculate distance from how long particles of light, gas, liquid, and so on take to travel at a known speed; the idea has a history of more than 100 years. As early as 1887, Michelson proposed using light waves as a ruler for measuring length, and in the 1960s the invention of the laser made “measuring distance with light” a reality.

Some applications of ToF camera modules in domestic smartphones

3D modeling with Samsung Note10 series phones

In addition, Apple has only said that the measurement range is five meters, which is not very far. How good is recognition within those 5 meters, and how well does it resist interference? We don’t know yet; once the product ships, users will get a first-hand feel for it.

From this we can see both similarities and differences among the dToF solution in the Apple iPad Pro, the iToF solutions used in earlier Android phones, and automotive LiDAR:

1. All three use optical sensing and ToF distance calculation methods.

2. Compared with automotive LiDAR, the iPad Pro’s dToF solution is a fixed, flat 3D sensor, whereas automotive LiDAR is usually a single- or multi-line rotating unit. The Flash method’s short sensing range, a problem in automotive applications, is perfectly adequate for the scenarios Apple has in mind.

3. Compared with the iToF solutions on Android, the iPad Pro’s dToF solution emits a different kind of pulse signal, so the sensor components and algorithms also differ.

Now we will further dissect the differences between dToF and iToF.

What is the technical secret of this evolved ToF camera?

Although both are ToF measurement methods, Apple this time uses dToF (direct time of flight), which directly measures the flight time of light, unlike the iToF (indirect time of flight) method used in the Huawei Mate 30, vivo NEX, and other models.

The main differences between dToF and iToF


Schematic diagram of iToF and dToF measurement methods

As the chart shows, dToF outperforms iToF considerably, but it is also more complicated to manufacture.

Thanks to the dToF method, the LiDAR on the back of the iPad Pro can overcome the low time resolution that direct ranging otherwise struggles with, and its accuracy will not fall off sharply with distance. In other words, the iPad Pro’s ToF camera is not just a slight improvement; it is an evolved version of the ToF camera, so the accuracy and smoothness of the AR experience should improve markedly while power consumption drops significantly.
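For intuition behind the chart, here is a simplified, hedged sketch of the two measurement ideas: dToF times a short pulse directly, while iToF infers distance from the phase shift of a continuously modulated signal. The modulation frequency and sample values below are illustrative assumptions, not Apple’s or any vendor’s actual parameters.

```python
# Illustrative contrast (simplified, not any vendor's algorithm) between
# direct and indirect time-of-flight ranging.
import math

C = 299_792_458.0  # speed of light in m/s

def dtof_distance(pulse_round_trip_s: float) -> float:
    """dToF: time a short laser pulse directly and halve the round trip."""
    return C * pulse_round_trip_s / 2.0

def itof_distance(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """iToF: infer distance from the phase shift of a continuously
    modulated signal; unambiguous only within c / (2 * f_mod)."""
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

print(dtof_distance(20e-9))          # ~3.0 m from a 20 ns round trip
print(itof_distance(math.pi, 20e6))  # ~3.75 m at a 20 MHz modulation
```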

AR game experience with more realistic and smooth visual effects

Meanwhile, dToF relies mostly on arrays of SPADs (single-photon avalanche diodes) and related components. The SPAD manufacturing process is very complicated, few manufacturers can handle it, and integration is difficult. This is a major reason no manufacturer had previously squeezed a dToF solution into phone- and tablet-class products; where dToF has appeared elsewhere, the module is much larger than the one on the iPad Pro.

Here is an explanation from an industry insider, Zhang Eric, of SPAD, the key component of dToF:

Avalanche diodes are the key to dToF. In the figure below, the first device from the left is the PD (photodiode) used for ordinary imaging, the second is the APD (avalanche photodiode), and the third is the SPAD (single-photon avalanche diode). The key idea behind APD and SPAD is that by raising the bias voltage, an avalanche effect occurs when a photon reaches the photodiode, generating a large number of electrons. This is bad for imaging, but good for detecting pulses. The SPAD is more sensitive than the APD: only a few photons are needed to trigger an avalanche.

The difficulty of miniaturizing SPADs is one of the technical bottlenecks that has kept dToF out of small consumer electronics such as phones. Zhang Eric notes that this is not the first use of dToF in consumer devices: early dToF sensors were used for laser autofocus on the back of phones and for ranging in Apple’s Face ID. However, because SPADs are hard to miniaturize, it has been difficult to raise the sensor’s resolution. The iPad Pro’s LiDAR makes a breakthrough here, with a higher resolution than previous dToF devices. He estimates that Apple’s LiDAR resolution should be in the QVGA-to-VGA range (320 × 240 to 640 × 480 pixels).
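As a rough illustration of how a SPAD-based dToF pixel commonly extracts depth (a generic approach, not necessarily Apple’s pipeline), photon arrival times collected over many laser pulses are accumulated into a histogram, and the peak bin gives the round-trip time. The bin width, photon counts, and noise level below are assumptions chosen only for the example.

```python
# A common dToF processing idea (not necessarily Apple's pipeline):
# accumulate SPAD photon arrival times over many pulses into a histogram
# and take the peak bin as the round-trip time.
import numpy as np

C = 299_792_458.0      # speed of light in m/s
BIN_WIDTH_S = 250e-12  # assumed histogram bin width (250 ps)

def depth_from_spad_timestamps(arrival_times_s: np.ndarray,
                               num_bins: int = 200) -> float:
    """Histogram photon arrival times (signal + ambient noise) and convert
    the peak bin back to distance via d = c * t / 2."""
    hist, edges = np.histogram(arrival_times_s,
                               bins=num_bins,
                               range=(0.0, num_bins * BIN_WIDTH_S))
    peak_time = edges[np.argmax(hist)] + BIN_WIDTH_S / 2.0
    return C * peak_time / 2.0

# Simulated example: target ~3 m away (20 ns round trip) plus ambient photons
rng = np.random.default_rng(0)
signal = rng.normal(20e-9, 0.2e-9, size=2000)           # photons from the target
noise = rng.uniform(0.0, 200 * BIN_WIDTH_S, size=500)   # stray ambient photons
print(depth_from_spad_timestamps(np.concatenate([signal, noise])))  # ≈ 3 m
```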

Functional characteristics of the SPAD (single-photon avalanche diode)

Since miniaturizing the key parts is so difficult, how does Apple do it? Some analysts believe Apple bonds the light-sensing avalanche-diode array and the digital circuitry together at the wafer level. This looks likely to be another example of Apple pushing its supply chain’s technology forward.

Who will be the winner behind the iPad Pro lidar?

Who are the supply-chain players behind the iPad Pro’s LiDAR? Could it be Lumentum, which supplies Apple’s Face ID technology? Face ID also uses 3D vision to capture facial depth information, but it relies on structured light, which together with ToF and RGB binocular vision makes up three distinct technical directions. So it differs from the LiDAR/ToF solution on the iPad Pro.

Meanwhile, according to a Fast Company report last week, Lumentum said it had not discussed with Apple the use of its 3D vision technology in unreleased devices.

Did STMicroelectronics (ST), which once provided sensors for the iPhone 7 Plus, take part? The company was the first in the world to offer SPAD-based technology, and it has not responded.

It is also understood that Apple obtained a “light detection and ranging sensor” patent in 2019. TriLumina may also have participated in the LiDAR design of the iPad Pro.

A senior professional in the LiDAR supply chain analyzed that the supplier of the SPAD sensor in the iPad Pro’s rear LiDAR is most likely one of a handful of companies: Sony, STMicroelectronics (ST), LG Innotek, and a few others. It is speculated that the new iPad Pro probably relies on Sony’s back-illuminated stacked sensor technology and LG’s camera-module factory in Daegu, South Korea, to squeeze the LiDAR into such a small space.

Are there any domestic (Chinese) players? The professional said domestic sensors still have a long way to go, though domestic modules and lenses will gradually move in after a period of preparation. Some insiders also revealed that earlier supply-chain rumors had suggested Apple would not launch dToF until 2021, and that it may take domestic phone makers another two years to reach this level.

Many domestic manufacturers are among the ToF solution providers for the Huawei Mate 30 Pro

Behind the marketing selling point: building Apple’s AR application ecosystem

If the iPad Pro’s LiDAR is essentially an evolved ToF camera, then we have to return to an old question.

Why is Apple the first to take the plunge, and is the application ecosystem ready?

Judging from existing products on the market, rear depth sensors are applied in three main directions:

The first is camera optimization, improving bokeh and beautification effects. The second is the AR experience: person recognition (for example, in live video), MR games, body measurement, AR rulers, Emoji expressions, 3D modeling, AR navigation, and so on. The third is more complex functions such as face unlock and gesture control.

Using AR for navigation

Overall, these applications are not “hard” enough as needs. And in the relatively hard-core applications such as face unlock and gesture recognition, the ToF LiDAR solution can easily be replaced by millimeter-wave radar, structured-light 3D vision, and other solutions.

This “application fatigue” problem goes straight to the “entry point” and “landing point” of ToF 3D vision technology. If demand for 3D vision applications cannot become a “hard” need, then whether it is called LiDAR or a ToF camera, it remains a marketing selling point that simply gets rebranded and refreshed every year.

But is Apple brewing some other surprise? Probably.

Why did Apple choose to debut its latest ToF camera on the iPad? Some say the module may not yet be small enough to fit into a phone.

But last week Fast Company reported that at least one upcoming iPhone will carry a ToF 3D depth camera on its back. iPhone engineers are said to have spent at least two years working on rear three-dimensional depth cameras, and lighter, more portable AR glasses are reportedly on the way.

Beyond the hardware, Apple has also been advancing and enabling AR application development. As early as three years ago, Apple launched the AR development platform ARKit, though it has kept a relatively low profile about it. Apple’s AR layout has been in the making for a long time, pushing both software and hardware, with the look of playing a long game for a big catch.

The real intent lies elsewhere. Apple is experimenting with new sensor hardware, but what it is really after is lifting the AR application and software ecosystem to a new level. In the three years Apple has been promoting AR, no “killer” application has appeared; the push on LiDAR sensors is meant to solve exactly this problem.

In its demos of the iPad Pro’s LiDAR sensor, Apple has gone out of its way to show off the power of AR applications. For example, the Apple Arcade game Hot Lava can use the data to model a living room faster and more accurately and turn it into game surfaces; a CAD application can scan a room into a 3D model and show what it looks like; another demo can accurately measure the range of motion of an arm.

Playing realistic online games with AR

Using AR to simulate the ecosystem

Using iPad Pro for more learning and work functions

According to 9to5Mac’s Benjamin Mayo, Apple is currently developing an AR application for iOS 14: users can point the iPhone at merchandise in an Apple Store or Starbucks and see more information about it overlaid on the screen as an AR layer.

As the saying goes, a summer insect cannot be told about ice: Apple’s vision goes far beyond a marketing selling point and aims at a new wave of consumer AR. When that day comes, will the phone become an AR device, and will Apple become an AR company?

Conclusion: behind one camera lie Apple’s AR ambitions

Apple putting LiDAR in the iPad Pro looks unremarkable at first glance; it is more like an evolved ToF camera. Although major Android manufacturers already ran with this selling point last year, Apple’s arrival may ignite this technology’s applications all over again.

Back in 2016, when Apple introduced AirPods, the industry paid little attention. Yet their growth over the following years directly created a TWS earphone market with annual shipments of more than 100 million units, and Apple built a new category whose revenue surpasses that of Mac computers, even riding the TWS tailwind to a higher stock price. If the dToF technology that the iPad Pro has kicked off can be widely adopted, it will certainly bring new opportunities to the upstream supply chain.

At the application layer, it may spawn a true AR killer app. After all, Apple has been laying the groundwork in the AR direction for three or four years. From the first release of ARKit in 2017 to developers building AR applications such as the IKEA furniture shopping app, Apple has been the most active explorer in the AR field.

Stepping back further, Apple’s deployment of LiDAR on the iPad also hints at more far-reaching plans. With iPhones carrying rear ToF cameras on the way, can AR glasses and lightweight MR devices be far behind? Combined with the two-pronged push on Apple’s AR application ecosystem, will Apple’s AR edifice keep rising higher? There is still much room for imagination.
