As indie game developers, we cannot afford a full-scale facial mocap team, but we still need to produce passable facial animation. I found that KineFX is great for bone animation, but I haven't found useful resources for character blend shapes. I don't know if anything has changed since 18.5.
Q:
The first question is about the methodology for driving the facial animation: should we use a complicated tree-like head-bone group (one that drives every muscle on the face with a bone) to drive the face mesh, or should we just use a morpher? This may matter if we want to bring it into Unreal. In 18.5, we still cannot use blendshapes to directly output morphers into Unreal.
The second question is about morphing. Is there a cheap method of capturing facial rig data? I mean, is there something like camera-based face recognition that drives face-mesh morphs? And in that case, how do we control the mouth/tongue/teeth?
Low-cost facial mocap pipeline?
goose7
- Member
- 75 posts
- Joined: Sept. 2018
- Ids
- Member
- 7 posts
- Joined: Feb. 2017
You can't export the blended mesh because the Character Blendshapes node removes the relevant data. Instead, you need to feed the animated skeleton with the blendshape channels right into the FBX Character Output. The character's mesh needs to have the blendshapes as a hidden group of primitives and that goes into the first input (mesh) of the FBX Character Output. Houdini 19 offers a more streamlined workflow for this.
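A minimal Python sketch of that wiring, assuming a SOP-level setup; the node type and parameter names ("rop_fbxcharacteroutput", "sopoutput") and the node paths are assumptions to verify against your Houdini build:

```python
# Minimal sketch, not a verified setup: wire the blendshape-carrying mesh and
# the animated skeleton into an FBX Character Output node and write the file.
import hou

geo = hou.node("/obj/character")  # hypothetical geometry container

# Mesh whose blendshape targets live in a hidden primitive group.
mesh = geo.node("skin_with_blendshape_prims")
# Animated skeleton carrying the blendshape channel values.
skel = geo.node("animated_skeleton")

# Node and parm names below are assumptions; check the operator list
# and parameter names in your Houdini version.
out = geo.createNode("rop_fbxcharacteroutput", "fbx_export")
out.setInput(0, mesh)   # first input: mesh, including blendshape primitives
out.setInput(1, skel)   # animated skeleton with blendshape channels
out.parm("sopoutput").set("$HIP/export/character.fbx")
out.parm("execute").pressButton()   # write the FBX
```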
Question 1:
Unreal supports both joints and blendshapes, so it is up to you to decide what combination you want to use. Joints typically require a bit more setup to create facial animation than blendshapes, so you should ask your riggers/animators what they prefer.
Question 2:
You can buy one of the latest iPhones and use Epic's free Live Link Face app, which streams basic facial animation into Unreal. The app uses Apple's predefined ARKit facial blendshapes, so the blendshape animation can be applied to any compatible character (such as one from Reallusion's Character Creator).
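Because the stream uses Apple's fixed ARKit blendshape names, a character whose morph targets are named differently just needs a small remap table. A hedged sketch (the ARKit names on the left are real; the character-side names on the right are made up for illustration):

```python
# Sketch: remap Apple ARKit blendshape names (left) to a character's own
# morph-target names (right, hypothetical) before applying streamed weights.
ARKIT_TO_CHARACTER = {
    "jawOpen":         "Mouth_Open",
    "eyeBlinkLeft":    "Blink_L",
    "eyeBlinkRight":   "Blink_R",
    "mouthSmileLeft":  "Smile_L",
    "mouthSmileRight": "Smile_R",
    "browInnerUp":     "Brows_Up_Inner",
}

def remap_weights(arkit_weights):
    """Translate a {arkit_name: 0..1 weight} frame into character morph names."""
    return {
        ARKIT_TO_CHARACTER[name]: weight
        for name, weight in arkit_weights.items()
        if name in ARKIT_TO_CHARACTER  # drop channels the character lacks
    }

# Example frame from the face-capture stream:
frame = {"jawOpen": 0.35, "eyeBlinkLeft": 1.0, "cheekPuff": 0.1}
print(remap_weights(frame))  # {'Mouth_Open': 0.35, 'Blink_L': 1.0}
```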
Other than that, you could opt for Faceware's quite expensive Studio software, which, combined with their hardware setup, does essentially the same thing as the Live Link Face app and can likewise be streamed into Unreal.
The lips are tracked by the software. The teeth shouldn't move because they are attached to bones. The tongue is usually only tracked when it is pushed out. If you want accurate tongue movement, an animator will probably have to manually add it, unless you decide to use lipsync software.
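Since the teeth follow the jaw bone rather than a blendshape, one common trick is to derive the jaw rotation from the streamed jawOpen weight so the teeth move with the mouth. A minimal sketch; the joint path and the 22-degree limit are assumptions to tune to your rig:

```python
# Sketch: drive a jaw joint's pitch from the ARKit "jawOpen" weight so the
# bone-attached teeth follow the mouth animation.
def jaw_pitch_degrees(jaw_open_weight, max_open_degrees=22.0):
    """Map a 0..1 blendshape weight to a jaw rotation in degrees."""
    w = max(0.0, min(1.0, jaw_open_weight))  # clamp stray tracker values
    return w * max_open_degrees

# e.g. apply per frame to a KineFX jaw control's rotation channel
# (hypothetical parm path):
# hou.parm("/obj/rig/jaw_ctrl/rx").set(jaw_pitch_degrees(weights["jawOpen"]))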
Technical Animator (using KineFX) at Enliven | Social Enterprise
idsboonstra.artstation.com