iPhone X Facial Expression Capture Test PART 2 - Data into Maya

Test #2: a new test where I record the facial animation data from the iPhone X and import it into Maya to animate the same character from our game Bebylon. This is shown in the middle of the vid.
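For anyone curious how the import step can work, here's a minimal sketch in Maya Python. It assumes the recorded take is exported as a CSV with one row per frame and one column per ARKit coefficient (ARKit's names, like jawOpen or eyeBlinkLeft, are real; the CSV layout, the node name, and the file name are my assumptions), and that the scene frame rate matches the capture rate:

import csv
import maya.cmds as cmds

BLENDSHAPE_NODE = "beby_blendShape"  # assumed name of the character's blendShape node

def import_arkit_csv(path):
    """Key one animation curve per ARKit coefficient from a recorded take."""
    with open(path) as f:
        reader = csv.DictReader(f)
        for frame, row in enumerate(reader):
            for shape, weight in row.items():
                attr = "%s.%s" % (BLENDSHAPE_NODE, shape)
                # only key coefficients the rig actually has a target for
                if cmds.objExists(attr):
                    cmds.setKeyframe(attr, time=frame, value=float(weight))

import_arkit_csv("capture_take01.csv")  # hypothetical recorded take

Since ARKit already reports weights in the 0-1 range, they map straight onto blendShape target weights with no conversion.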

A couple of other additions: eye tracking, better blendshapes (they can still get much better), and a few other WIP Bebylon characters added just for fun!

THOUGHTS

I'm pretty confident this beby character could be improved dramatically by dialing in the blendshape sculpts (fixing most of the mouth issues) and adding proper wrinkle maps that deform the skin as the face animates. Using the captured data to drive secondary blendshapes would also help the expressions feel more dynamic and alive.
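One cheap way to drive secondaries is a driven key: a corrective shape fires automatically as a captured primary coefficient ramps up. A minimal sketch, again in Maya Python; the node name, the secondary target name, and the falloff value are hypothetical:

import maya.cmds as cmds

BS = "beby_blendShape"  # assumed blendShape node from the import step

def drive_secondary(primary, secondary, falloff=1.0):
    """Make a secondary target follow a captured primary coefficient."""
    driver = "%s.%s" % (BS, primary)
    driven = "%s.%s" % (BS, secondary)
    # secondary stays off at rest...
    cmds.setDrivenKeyframe(driven, currentDriver=driver, driverValue=0.0, value=0.0)
    # ...and reaches full weight as the primary hits its falloff point
    cmds.setDrivenKeyframe(driven, currentDriver=driver, driverValue=falloff, value=1.0)

# e.g. a hypothetical squint wrinkle following ARKit's cheekSquintLeft
drive_secondary("cheekSquintLeft", "squintWrinkle_L", falloff=0.8)

The nice part is the secondaries ride on top of the raw capture, so re-recording a take doesn't mean re-animating the correctives.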

We’re still deep in the uncanny valley here, but hey, for damn cheap you can capture some pretty decent data to use as a base. Not only is it cheap, it's mobile, and it's super fast and easy to capture, convert, and import. With a Shure MV88 mic attached to your iPhone X, you'd have a pretty killer all-in-one mobile facial expression capture system!

The downside is that there's a limit to the expressivity you can capture; if you need to go beyond that, you'll have to animate it by hand.

NEXT TESTS

The next test is donning our Xsens motion capture suit and strapping this iPhone X to a helmet to see if I can capture full body and face at the same time. I still want to improve the blendshapes and see just how good I can get this character to look, though I doubt I'll have the time.

#animation #unrealengine #motioncapture #iphone #xsens #bebylon #gamedev #arkit #ue4 #apple #realtime

Recommended Reading >> bit.ly/32kRpzw
