BEBYFACE PART 5 - Testing Realtime Animation Production!

This episode is all about an exciting announcement surrounding Bebylon and Project Bebyface! It also doubles as the latest Bebyface test, focused on realtime animation production using the iPhone X & Xsens suit while capturing, editing & rendering all within Unreal Engine!! It makes production ridiculously fast & fun!

My goal was to spend 4 days sketching out a narrative-type piece from start to finish, testing the realtime pipeline in a more production-like scenario. Basically, I wanted to see what unexpected pitfalls I would run into.

iPhone X (face), Xsens (body), Unreal Engine for everything else (realtime capture, editing, cameras, lighting, rendering, etc.)
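For context on what the iPhone X piece of that setup provides, here's a minimal Swift sketch. It is not the author's actual capture path (the video drives everything through Unreal's ARKit integration); it just shows the per-frame blendshape data ARKit exposes from the TrueDepth camera. The class name `FaceCaptureSession` is illustrative, while the ARKit calls (`ARFaceTrackingConfiguration`, `ARFaceAnchor.blendShapes`) are the standard API.

```swift
import ARKit

// Minimal sketch: the kind of per-frame face data the iPhone X exposes via ARKit.
// (Assumption: the real pipeline consumes this inside Unreal rather than a custom app.)
final class FaceCaptureSession: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return } // requires TrueDepth camera
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // ARKit reports dozens of blendshape coefficients in the 0...1 range each frame,
            // e.g. jawOpen and eyeBlinkLeft, plus the head transform (face.transform).
            let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
            let blinkLeft = face.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
            // A real pipeline would pack all coefficients + the head transform
            // and stream them to the engine every frame to drive the face rig.
            print("jawOpen: \(jawOpen), eyeBlinkLeft: \(blinkLeft)")
        }
    }
}
```

Each frame yields roughly fifty coefficients plus a head transform, which is why the capture maps so directly onto a blendshape-driven face rig inside the engine.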

You'll have to forgive my ghetto acting and the long test. It could use some serious editing, but testing time ran out ;P

Takeaways...

- It's cool how much content a single person can make & how fast.

- The iPhone X face capture continues to hold up pretty well (high five to the Faceshift guys!)

- The face capture expressivity can still be improved quite a bit

- Realtime rendering is mega awesome and a huge net win, despite a few issues (at least with our project)

- Still lots of pain points in the workflow, but they should get worked out over time.

- Lots of fun still to come!

Related News...

Joe Graf from Epic has been making some really nice improvements to the ARKit integration coming out in 4.22, which I'm excited about.

Context:

These are a series of tests I'm doing to find a fast pipeline for performance capture (face and body) that we can use for our upcoming game Bebylon: Battle Royale. #animation #unrealengine #3d #ue4 #iphone #bebylon #gamedev #arkit #apple

Recommended Reading >> bit.ly/32kRpzw
