iPhone X Facial Expression Capture test PART 1

Test #1. A rough facial capture test using the iPhone X's front-facing depth camera. I took the WIP Beby character/rig I've been working on for our VR game Bebylon and drove some of its blendshapes using the iPhone's tracking data.

Sorry for the long-ass video; apparently I couldn't stop.

- Cory

MORE INFO:

Apple bought Faceshift a while back (facial capture software that tracked using depth data) and essentially made it mobile on the iPhone X. Through their ARKit API you can access the 52 facially tracked motion groups, which can drive your own character's blendshapes at 60 fps!
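
For anyone curious what the hookup looks like, here's a minimal sketch of the idea in Swift using ARKit's SceneKit path (my actual test was done in Unity, and the character node and morph target names here are placeholders, not my setup): every frame ARKit hands you an ARFaceAnchor whose blendShapes dictionary holds the coefficients, and you copy each one onto the matching blendshape on your own rig.

```swift
import UIKit
import SceneKit
import ARKit

class FaceCaptureViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!

    // Placeholder: a node whose geometry has morph targets named like ARKit's
    // blendshape locations ("jawOpen", "mouthSmileLeft", ...).
    var characterNode: SCNNode!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Face tracking needs the TrueDepth (front) camera, i.e. iPhone X and up.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.delegate = self
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // Fires on every face anchor update, up to 60 fps on the iPhone X.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }
        // blendShapes holds one 0.0...1.0 coefficient per tracked motion group.
        for (location, weight) in faceAnchor.blendShapes {
            characterNode.morpher?.setWeight(CGFloat(weight.doubleValue),
                                             forTargetNamed: location.rawValue)
        }
    }
}
```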

I'm interested in whether it can be used for cheap & fast facial motion capture for our VR game, which is overrun by wild and crazy immortal babies who all want to express themselves.

This is a quick first-pass test, and there's a bit more to be done before I hit the quality ceiling of the captured data. Of course, if it were meant to be an AR gag, a lot more could be done to improve the visual rendering and lighting quality.

Still to try:

- Getting the eyes tracked.

- Re-sculpting some of the blendshapes from the Beby rig to be better suited for this setup.

- Running it without lighting: tracking performance is much smoother and more precise, and that's how you'd run it if you were really capturing data this way.

- Dial in the data ratios coming from the iPhone to better match each muscle group's range. *Character-specific visual tuning (see the first sketch after this list).

- Figure out why blinking causes the whole head to move.

- Get Ikrima to write this in native C++ (which will never happen unless we figure out how to freeze time)

- Add features to record the data (see the second sketch after this list).

- Make a harness to mount the iPhone into a mocap helmet to record face and body at the same time. *More of this to come, as I just busted out our Xsens suit!!

- Get it working in UE4 once they add ARKit front-camera access. *I did this test in Unity; thanks to Jimmy and their ARKit team for already having front-camera access.
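
A rough idea of what that per-muscle-group tuning could look like, sketched in Swift: the ShapeTuning struct, the table, and every number in it are made-up placeholders for illustration, not values from my test. The idea is just to remap each raw coefficient into the range that reads well on a stylized character.

```swift
import ARKit

// Hypothetical per-shape tuning: remap ARKit's raw 0...1 coefficient into the
// range that actually reads well on the character.
struct ShapeTuning {
    var inputMin: Float = 0.0    // raw value where the shape should start engaging
    var inputMax: Float = 1.0    // raw value where the shape should be fully on
    var outputScale: Float = 1.0 // how hard to drive the character's blendshape
}

// Example table; the locations are real ARKit keys, the numbers are made up.
let tuning: [ARFaceAnchor.BlendShapeLocation: ShapeTuning] = [
    .jawOpen:        ShapeTuning(inputMin: 0.05, inputMax: 0.80, outputScale: 1.0),
    .mouthSmileLeft: ShapeTuning(inputMin: 0.10, inputMax: 0.60, outputScale: 1.2),
    .eyeBlinkLeft:   ShapeTuning(inputMin: 0.00, inputMax: 0.50, outputScale: 1.0),
]

// Remap one raw coefficient before handing it to the rig.
func tunedWeight(_ raw: Float, for location: ARFaceAnchor.BlendShapeLocation) -> Float {
    guard let t = tuning[location] else { return raw }
    let normalized = (raw - t.inputMin) / max(t.inputMax - t.inputMin, 0.0001)
    return min(max(normalized, 0), 1) * t.outputScale
}
```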

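And a sketch of the recording side, also in Swift: buffering each frame's coefficients with a timestamp and dumping the take to CSV is just one assumed format so it can be cleaned up and retargeted offline. Nothing here is provided by ARKit beyond the blendShapes dictionary itself.

```swift
import Foundation
import ARKit

// Minimal sketch: buffer each frame's coefficients, then dump the take to CSV.
final class BlendShapeRecorder {
    private var frames: [(time: TimeInterval, shapes: [String: Float])] = []

    // Call this from the face-anchor update callback.
    func record(_ anchor: ARFaceAnchor, at time: TimeInterval) {
        var shapes: [String: Float] = [:]
        for (location, weight) in anchor.blendShapes {
            shapes[location.rawValue] = weight.floatValue
        }
        frames.append((time, shapes))
    }

    // One row per frame, one column per blendshape.
    func writeCSV(to url: URL) throws {
        guard let first = frames.first else { return }
        let columns = first.shapes.keys.sorted()
        var csv = "time," + columns.joined(separator: ",") + "\n"
        for frame in frames {
            let values = columns.map { String(frame.shapes[$0] ?? 0) }
            csv += "\(frame.time)," + values.joined(separator: ",") + "\n"
        }
        try csv.write(to: url, atomically: true, encoding: .utf8)
    }
}
```
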
Last note* The baby's blendshapes & rig were created by Eisko and their Rigs on Demand service. I couldn't recommend it more. It's a really solid Maya-based rig, and the blendshapes are top notch, based on scanned data. The beauty is I uploaded my baby's head model on a Thursday night and woke up to a finished rig! I love the times we live in! *More of the Eisko rig to come.

#animation #motioncapture #iphone #xsens #bebylon #gamedev #arkit #apple #realtime

Recommended Reading >> bit.ly/32kRpzw
