Apple has released an updated version of ARKit, its augmented reality (AR) development tool, adding support for the new lidar sensor found in the upcoming 2020 iPad Pro.
ARKit 3.5 features a new Scene Geometry API that draws on lidar, which measures distance using pulsed laser light, to create a 3D map of a space that differentiates between floors, walls, ceilings, windows, doors, and seats.
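For developers, a rough sketch of how this is enabled might look like the Swift snippet below; the helper function names and the surrounding session setup are illustrative assumptions, while the configuration options themselves belong to ARKit's public API.

```swift
import ARKit

// Illustrative sketch: ask ARKit to reconstruct the scene as a classified mesh.
// Only lidar-equipped devices report support for this option.
func startSceneReconstruction(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
        configuration.sceneReconstruction = .meshWithClassification
    }
    session.run(configuration)
}

// Reconstructed surfaces arrive as ARMeshAnchors; their geometry carries
// per-face classifications such as floor, wall, ceiling, window, door, and seat.
func logMeshAnchors(_ anchors: [ARAnchor]) {
    for case let meshAnchor as ARMeshAnchor in anchors {
        print("Mesh anchor covering \(meshAnchor.geometry.faces.count) faces")
    }
}
```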
Within a range of about five meters, the lidar scanner can measure the length, width, and depth of objects, producing an accurate, to-scale map of the space once the device's camera has been swept over it. That map can then be used for object occlusion, in which virtual objects placed in the scene are correctly hidden behind real-world surfaces.
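Rendering-side occlusion is handled separately; a hedged RealityKit example (assuming an ARView is already on screen) would simply opt the view into scene understanding:

```swift
import RealityKit

// Illustrative sketch: let the reconstructed scene mesh hide virtual content
// that sits behind real-world surfaces.
func enableOcclusion(in arView: ARView) {
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
}
```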
The lidar sensor and the ARKit update also allow for improved motion capture and people occlusion, with better estimates of height and depth, since lidar provides true three-axis measurements rather than the 2D camera data that previous iPhones and iPads relied on.
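As a rough, non-authoritative sketch, people occlusion and motion capture are requested through existing ARKit configuration options; the surrounding app setup is assumed.

```swift
import ARKit

// Illustrative sketch: people occlusion via frame semantics on world tracking.
func enablePeopleOcclusion(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }
    session.run(configuration)
}

// Illustrative sketch: motion capture uses the body-tracking configuration.
func enableMotionCapture(on session: ARSession) {
    if ARBodyTrackingConfiguration.isSupported {
        session.run(ARBodyTrackingConfiguration())
    }
}
```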
ARKit 3.5 builds on the AR scene composition tools introduced in ARKit 3, released at Apple's Worldwide Developers Conference (WWDC) in 2019, and is available now to registered developers as part of Xcode 11.4.
Source: VentureBeat