Unreal’s new iPhone app does live motion capture with Face ID sensors


Unreal Engine developer Epic Games has released Live Link Face, an iPhone app that uses the front-facing 3D sensors in the phone to do live motion capture for facial animations in 3D projects like video games, animations, or films.

The app uses tools from Apple’s ARKit framework and the iPhone’s TrueDepth sensor array to stream live motion capture from an actor looking at the phone to 3D characters in Unreal Engine running on a nearby workstation. It captures facial expressions as well as head and neck rotation.

Live Link Face can stream to multiple machines at once, and “robust timecode support and precise frame accuracy enable seamless synchronization with other stage components like cameras and body motion capture,” according to Epic’s blog post announcing the app. Users get a CSV of raw blendshape data and an MOV from the phone’s front-facing video camera, with timecodes.
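Epic doesn't document the exact layout of that CSV in the announcement, but since ARKit exposes facial capture as named blendshape coefficients (values like "jawOpen" scaled 0–1) sampled per timecoded frame, a plausible export would be one row per frame. The sketch below is a minimal, hypothetical example of loading such a file; the column names and layout are assumptions, not the app's confirmed format.

```python
import csv
import io

# Hypothetical sample of a per-frame blendshape CSV. ARKit does define
# blendshapes such as jawOpen and eyeBlinkLeft, but the actual column
# layout Live Link Face exports may differ.
SAMPLE = """Timecode,jawOpen,eyeBlinkLeft,eyeBlinkRight
00:00:01:00,0.12,0.00,0.01
00:00:01:01,0.35,0.02,0.02
00:00:01:02,0.71,0.85,0.83
"""

def load_frames(text):
    """Parse rows into (timecode, {blendshape: weight}) tuples."""
    reader = csv.DictReader(io.StringIO(text))
    frames = []
    for row in reader:
        timecode = row.pop("Timecode")
        frames.append((timecode, {k: float(v) for k, v in row.items()}))
    return frames

frames = load_frames(SAMPLE)

# Example use: find the frame where the jaw is most open.
peak = max(frames, key=lambda f: f[1]["jawOpen"])
print(peak[0], peak[1]["jawOpen"])  # -> 00:00:01:02 0.71
```

Because each row carries a timecode, data like this can be lined up after the fact with the MOV reference video or with body capture recorded on the same stage clock, which is the synchronization Epic's post describes.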

To many, the iPhone’s TrueDepth sensor array seemed like a solution in search of a problem—Touch ID worked just fine for most people’s purposes in earlier iPhones, and there weren’t many clear applications of the technology besides Face ID. Sure, there were Animojis, and a lot of people enjoy them—but most wouldn’t say they single-handedly justify the technology.

But here’s a clear selling point, albeit for a narrow audience: indie game developers and filmmakers can use apps like this on the iPhone for motion capture, bringing virtual characters to life with the facial expressions and other motions of real human actors. That’s something that traditionally involved expensively equipped studios that are out of reach for all but the big players.

The motion capture the iPhone’s sensors can manage is not as accurate as what’s used by triple-A game development studios or major motion pictures, of course, so those high-end studios won’t be dropping their kitted-out mo-cap studios for iPhones right now. But now, individual creators and smaller teams can do something that was cost-prohibitive before.

Still, that’s not a consumer use case, per se. So far, these sensors have mainly been used by Apple and third-party app developers to make AR apps and photo filters more accurate.

Apple recently added a lidar sensor to the iPad Pro, and rumors abound on the Internet that it will make its way to top iPhone models, too. While the iPad Pro’s rear sensors use a different technology than the front-facing TrueDepth array, they can generate a similar result. Apple has already demonstrated the iPad Pro’s rear sensors recording motion capture from a person—albeit the person’s whole body, not their face.

For those not familiar, Unreal Engine began life as a graphics engine used by triple-A video game studios for titles like Gears of War and Mass Effect, and it has evolved over time to be used by indies and in other situations like filmmaking, architecture, and design. It competes with another popular engine called Unity, as well as in-house tools developed by various studios.

Listing image by Epic Games