As we have covered previously, one of the standout features of the new iPad Pro series is its LiDAR sensor. While we opined that LiDAR has the potential to offer greater accuracy than contemporary 3D ToF cameras, iFixit has now clarified how much depth-sensing detail the iPad Pro actually offers.
As the video below demonstrates, LiDAR operates much like the TrueDepth camera that Apple uses for Face ID. Put simply, both sensors bounce infrared dots off objects to calculate depth information. However, the LiDAR sensor in the iPad Pro projects far fewer dots than the TrueDepth camera does. Fewer data points mean a less detailed depth map, so the iPad Pro cannot match recent iPhones in this regard.
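For the curious, the principle behind LiDAR depth sensing is straightforward: the sensor times how long an emitted light pulse takes to bounce back, and halves the round-trip distance. The sketch below illustrates that arithmetic; the 33-nanosecond figure is our own illustrative choice, picked because it lands near the roughly 5-metre range Apple quotes for the iPad Pro's scanner.

```python
# Illustration of the time-of-flight principle behind LiDAR:
# depth is recovered from the round-trip travel time of a light pulse.
C = 299_792_458.0  # speed of light in metres per second

def tof_depth(round_trip_seconds: float) -> float:
    """Distance to the reflecting surface, in metres.

    The pulse travels to the object and back, so the one-way
    distance is half the round-trip path.
    """
    return C * round_trip_seconds / 2.0

# A pulse returning after ~33 nanoseconds implies an object
# roughly 5 metres away.
print(round(tof_depth(33e-9), 2))  # → 4.95
```

Timing pulses this way is why a sparser dot grid still yields usable depth: each dot is an independent, directly measured distance rather than a value inferred from a dense structured-light pattern.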
With that said, the LiDAR and TrueDepth sensors have different uses. LiDAR does not need to be as accurate as the TrueDepth camera because the iPad Pro uses it for AR applications, not for identifying faces. So while LiDAR may well offer greater accuracy than conventional 3D ToF cameras, it will not take the iPad Pro's AR capabilities beyond those of recent iPhones, especially if you are a fan of Animoji or Memoji.