LiveLinkFace to APEX
Hi, how do I add a LiveLinkFace mocap-stream animation to APEX? I'm recording my face and sending it to Houdini over a Wi-Fi connection. How do I get the blendshape data onto an APEX rig?
Help me, please.
- michail kuznetsov
- Eche
I have a similar question: how do I drive blendshapes with data in APEX?
I want to animate facial blendshapes with data coming from LiveLink Face .csv files.
I have this working fine in KineFX: a character with facial blendshapes, a Python node that reads the CSV, generates detail attributes for each channel, then drives the facial blendshape detail channels with this data. This then goes to a Character Blend Shapes SOP to apply the animation.
So, I understand how to do skeleton motion capture retargeting in KineFX and in APEX, and I understand how to do facial blendshape retargeting in KineFX, but not in APEX. Essentially I'm looking for the equivalent of the APEX motion capture retargeting workflow of driving the skeleton with data, but driving blendshape channels with data.
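For anyone trying the same thing, the KineFX half is roughly a Python SOP like the sketch below. This is only a minimal illustration, assuming the usual LiveLink Face CSV layout (Timecode and BlendshapeCount columns followed by one column per ARKit blendshape, one row per frame); the csv_file parameter is a placeholder File parameter you would add to the node yourself:

```python
# Python SOP: read a LiveLink Face CSV and publish one detail (global)
# float attribute per blendshape channel for the current frame.
# Assumption: first column is Timecode, second is BlendshapeCount,
# remaining columns are ARKit blendshape weights, one row per frame.
import csv
import hou

node = hou.pwd()
geo = node.geometry()

csv_path = node.evalParm("csv_file")   # placeholder: a File parameter added to this node
frame = int(hou.frame()) - 1           # row index; offset/clamp to taste

with open(csv_path, newline="") as f:
    rows = list(csv.reader(f))
header, data = rows[0], rows[1:]
row = data[min(max(frame, 0), len(data) - 1)]

# Skip Timecode and BlendshapeCount, keep the per-shape weights.
for name, value in zip(header[2:], row[2:]):
    if not geo.findGlobalAttrib(name):
        geo.addAttrib(hou.attribType.Global, name, 0.0)
    geo.setGlobalAttribValue(name, float(value))
```

The detail attribute names are taken straight from the CSV header, so they are assumed to match the channel names the downstream Character Blend Shapes SOP expects.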
- PHENOMDESIGN
Here is an APEX blendshape content demo from the SideFX Content Library. It might help in this situation:
https://www.sidefx.com/contentlibrary/apex-pillow/
PHENOM(enological) DESIGN;
Experimental phenomenology (study of experience) is a category of philosophy evidencing intentional variations of subjective human experiencing where both the independent and dependent variable are phenomenological. Lundh 2020
- edward
Eche:
I have this working fine in KineFX: a character with facial blendshapes, a Python node that reads the CSV, generates detail attributes for each channel, then drives the facial blendshape detail channels with this data. This then goes to a Character Blend Shapes SOP to apply the animation.
To add to this, https://www.sidefx.com/contentlibrary/kinefxapex-workflow/ has an example of how to take animation from the MotionClip world into channel primitives to drive APEX rigs.
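If you need a stopgap while digging through that example, one alternative (not the channel-primitive route the content library example uses) is to bake the CSV columns into keyframed spare parameters on a controller node and channel-reference those from whatever drives the rig's blendshape channels. The sketch below assumes the usual LiveLink Face CSV layout and uses placeholder node and file paths:

```python
# Sketch: bake LiveLink Face CSV columns into keyframed spare float
# parameters on a controller node, so they can be channel-referenced
# from whatever ultimately drives the rig's blendshape channels.
# Assumptions: CSV layout is Timecode, BlendshapeCount, then one ARKit
# column per shape, one row per frame; node path and file path are placeholders.
import csv
import hou

ctrl = hou.node("/obj/face_channels")        # placeholder controller node
csv_path = "$HIP/capture/livelinkface.csv"   # placeholder capture file
fps = hou.fps()

with open(hou.expandString(csv_path), newline="") as f:
    rows = list(csv.reader(f))
header, data = rows[0], rows[1:]

for col, name in enumerate(header[2:], start=2):
    # Add one spare float parameter per blendshape channel if it is missing.
    if ctrl.parm(name) is None:
        ctrl.addSpareParmTuple(hou.FloatParmTemplate(name, name, 1))
    parm = ctrl.parm(name)
    parm.deleteAllKeyframes()
    for i, row in enumerate(data):
        key = hou.Keyframe()
        key.setTime(i / fps)
        key.setValue(float(row[col]))
        parm.setKeyframe(key)
```

Keyframing everything up front is heavier than evaluating channel primitives at cook time, but it keeps the data in ordinary Houdini channels that are easy to inspect and retime.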