Unreal’s new iPhone app does live motion capture with Face ID sensors

A workstation running Unreal Engine with an iPhone for motion capture. [credit: Epic Games]

Unreal Engine developer Epic Games has released Live Link Face, an iPhone app that uses the phone's front-facing 3D sensors to do live motion capture for facial animation in 3D projects such as video games, animation, or film.

The app uses tools from Apple's ARKit framework and the iPhone's TrueDepth sensor array to stream live motion capture from an actor looking at the phone to 3D characters in Unreal Engine running on a nearby workstation. It captures facial expressions as well as head and neck rotation.
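Under the hood, ARKit exposes that TrueDepth data as per-frame blendshape coefficients and a head transform on an ARFaceAnchor. Epic's own implementation isn't public, but a minimal Swift sketch of reading those values (with the actual streaming step left as a comment) looks roughly like this:

import ARKit

// Minimal sketch: read ARKit face-tracking data the way an app like
// Live Link Face could. Epic's implementation is not public; sending
// the data to the workstation is only indicated by a comment.
final class FaceCaptureDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // Blendshape coefficients in the range 0.0-1.0,
            // e.g. .jawOpen, .eyeBlinkLeft, .browInnerUp
            let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue ?? 0
            // Head pose (position + rotation) as a 4x4 transform
            let headTransform = faceAnchor.transform
            // ...encode and stream these values to Unreal Engine here
            _ = (jawOpen, headTransform)
        }
    }
}

// Face tracking requires a TrueDepth-equipped device
let session = ARSession()
let delegate = FaceCaptureDelegate()
session.delegate = delegate
if ARFaceTrackingConfiguration.isSupported {
    session.run(ARFaceTrackingConfiguration())
}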

Live Link Face can stream to multiple machines at once, and "robust timecode support and precise frame accuracy enable seamless synchronization with other stage components like cameras and body motion capture," according to Epic's blog post announcing the app. Users get a CSV of raw blendshape data and an MOV from the phone's front-facing video camera, with timecodes.
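The exact column layout of that CSV export isn't described in the excerpt, but assuming a header row with a timecode column followed by one column per blendshape channel, a small Swift sketch for inspecting a recorded take might look like this (the file name and layout are assumptions, not Epic's spec):

import Foundation

// Sketch only: assumes a header of "Timecode, jawOpen, eyeBlinkLeft, ..."
// followed by one row per captured frame. The real export may differ.
let path = "take_001.csv"   // hypothetical file name
guard let text = try? String(contentsOfFile: path, encoding: .utf8) else {
    fatalError("could not read \(path)")
}
var rows = text.split(separator: "\n")
    .map { $0.split(separator: ",").map { $0.trimmingCharacters(in: .whitespaces) } }
let header = rows.removeFirst()
for row in rows {
    let timecode = row[0]
    // Pair each remaining column with its channel name from the header
    let channels = zip(header.dropFirst(), row.dropFirst()).compactMap { name, value in
        Float(value).map { (name, $0) }
    }
    print(timecode, channels.prefix(3))   // peek at a few channels per frame
}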

