Apple has introduced the iPhone 12 Pro, and with it a notable new piece of hardware: a LiDAR sensor. Apple uses it to improve the augmented reality (AR) capabilities of iOS and to help the camera focus in low light.
Google has a long history with AR, too. You might remember Project Tango, Google's earlier effort at phone-based AR. Tango required special sensors, and it was eventually shut down. Then came ARCore, which delivers all the AR you could want without any special hardware.
Some Android phones use a time-of-flight (ToF) sensor to improve focus in all conditions. A ToF sensor measures how long it takes emitted light to bounce back, then converts that time into a distance. Software and machine learning have largely replaced ToF sensors, but they are still used in some situations.
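The time-to-distance conversion mentioned above is simple physics. Here is a minimal sketch of it; the function name and values are illustrative, not any real sensor's API:

```python
# Sketch of the time-of-flight principle: the sensor measures the
# round-trip time of a light pulse, and distance follows from the
# speed of light.

SPEED_OF_LIGHT_M_PER_S = 299_792_458


def tof_distance_m(round_trip_time_s: float) -> float:
    """Convert a measured round-trip time into a one-way distance.

    The pulse travels to the object and back, so the distance is
    half of (speed of light * elapsed time).
    """
    return SPEED_OF_LIGHT_M_PER_S * round_trip_time_s / 2


# A pulse that returns after roughly 6.67 nanoseconds bounced off
# an object about one metre away.
print(tof_distance_m(6.671e-9))
```

The times involved are tiny, which is why ToF hardware needs very precise clocks rather than clever software alone.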
Apple has moved depth sensing back to dedicated hardware, taking on work that could otherwise be handled in software. Should Google do the same?
What does LiDAR stand for?
LiDAR stands for Light Detection and Ranging. It is a way to create three-dimensional maps of a space.
A LiDAR system pairs a laser with a receiver: the laser emits light, and the receiver measures the reflection when it bounces back. The technology has been around for a while. You may have seen it in a robot vacuum mapping your floor, and the National Oceanic and Atmospheric Administration uses it to create maps of the Earth. It is no surprise that Apple is using it too.
The 2020 iPad Pro includes a LiDAR sensor alongside its cameras. It builds a depth image of what the camera sees, which lets apps place animations and static images on the screen in the right spot, whether you want to see them on your desk or on top of a picture frame.
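To place virtual objects "in the right spot," an AR system turns each depth reading into a 3D point in camera space. Here is a minimal sketch using a simple pinhole camera model; the intrinsics (fx, fy, cx, cy) are assumed example values, not figures from any real device:

```python
# Sketch: map a pixel with a known depth (as a LiDAR or ToF sensor
# provides) to a 3-D point in camera coordinates, using the pinhole
# camera model. Intrinsics here are illustrative.

def unproject(u: float, v: float, depth_m: float,
              fx: float, fy: float, cx: float, cy: float):
    """Map pixel (u, v) at a given depth to camera-space (x, y, z).

    (cx, cy) is the optical centre; fx and fy are focal lengths
    in pixels.
    """
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)


# A pixel at the optical centre maps straight along the camera axis.
print(unproject(320, 240, 1.5, fx=500.0, fy=500.0, cx=320.0, cy=240.0))
```

With a cloud of such points, the system can detect surfaces like your desk and anchor virtual content to them.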
You don't strictly need a dedicated sensor for this kind of AR, but the engineers who designed the new iPad decided it should have one, and I agree with them: a LiDAR sensor makes the experience better.
Project Tango and ARCore
In 2014, Google's ATAP team started Project Tango, a side project that was every bit as ambitious as it sounds. The first consumer devices with the technology were made by Lenovo and ASUS, and with a Tango device like the Lenovo Phab 2 Pro, the AR worked just as well in your hands as it did in the lab.
In 2017, Google shut down Project Tango and introduced ARCore alongside the Pixel 2. ARCore powers AR in many apps on your phone, as well as the stickers and animations in the Google camera app.
Google did the reverse of Apple here, and it worked out well: it started with dedicated hardware, then found a way to deliver the same experiences without expensive sensors. Tango is gone, ARCore replaced it, and it works very well.
Android phones have better cameras
Most phone makers pack their devices with sensors, but few use a dedicated one to calculate the distance between the camera and nearby objects.
Instead, improvements in on-device machine learning have given some of the best Android phones cameras that rival or beat what Apple offers.
Apple moves deliberately. Using LiDAR to improve its AR platform, or folding it into the camera, wasn't something the company dreamed up over tacos last week. After a lot of time and money, Apple decided that LiDAR could help move the iPhone camera from "one of the best" to "the best."
Do the sensors need to come back to Android?
I don't know. That's the million-dollar question, right? Apple shipped standard cameras on its phones and iPads for years; now those devices carry a LiDAR sensor package. Does that mean AR will stop working on older iPhones and iPads? Maybe not, but it's possible.
Google may be able to build something bigger and better with LiDAR, but I suspect it will invent something new instead. If you asked me what the future holds for a tablet with a LiDAR sensor package, I'd say Google would tie the depth data to other information. That's where LiDAR shines.
Augmented reality (AR) is fun to use, but it's also genuinely useful. Google used Tango to build an indoor mapping system that gave audible cues to people with low vision so they knew where to step safely. With a precise location system, the map only needs to be "drawn" once, and real-time checks need only confirm where you are.
Most people won't use a tablet that way indoors, but the idea that the world can be mapped in the cloud with a LiDAR sensor could lead to other applications. Imagine seeing someone collect a Pokémon on your screen while they're playing Pokémon Go on theirs, or an app that acts as a virtual tour guide because it knows where you are and what you're looking at.
The Floor is Lava seems like a cool game, and with LiDAR it could have been a little better. And while the iPhone camera was already excellent before LiDAR, adding it will make photos better in low light. Apple could have achieved something similar with machine learning, just as Google, Huawei, and Samsung have.
Google always wants to make Android better, and it does so through software. Google doesn't seem to think adding LiDAR would improve Android, and phone makers can't build meaningful support for it without Google's help. LiDAR could make your phone smarter and more aware of where it is and what you're doing, but I don't think that's the direction Google will take.
Google has always been about making phones that showcase the newest experiences, and the best way to do that is through software. It has done a great job so far, and I think it will keep doing it. If Google ships a new sensor, it will be a proof of concept, like Face Unlock was on the latest Pixel.