Epic Games, the maker of Unreal Engine, has released a new iOS app that makes high-quality 3D facial motion capture far more accessible. Dubbed Live Link Face, the app utilizes the iPhone’s TrueDepth camera (the same hardware that powers Face ID) and ARKit to capture facial data in real time. That information can be used to animate a digital avatar during a live stream, or saved for later use in professional post-production software.
“The Live Link Face app harnesses the facial capture quality of iPhone ARKit and turns it into a streamlined production tool,” said Verizon Director of Animation Technology Addy Ghani. “This solution is perfect for a creator at home or a professional studio team like ours.”
The amount of data the app captures depends on how the iPhone is positioned. If it is placed on a desk, it will record facial expressions along with head and neck movements. If it is mounted to the user’s head, it will pick up facial expressions only. Live Link Face uses the iPhone’s motion coprocessor to determine which configuration is in use and whether any adjustment is needed.
The data includes timestamps so that it can be synced with other recordings when imported into an editing tool. Developers will need production software that is compatible with Unreal Engine to take advantage of the saved data. Unreal Engine is primarily used in game development, although it has also been used for digital effects in live-action productions like The Mandalorian.
Live Link Face is available for free and does not feature any in-app purchases. Apple, meanwhile, introduced LiDAR support with the latest version of ARKit, which was released in March. Face ID has been a standard feature of the iPhone for the past few years, and is expected to once again make an appearance in the upcoming iPhone 12.
Source: Apple Insider
(Originally posted on FindBiometrics)