Ids Boonstra
Ids
About Me
Technical Animator at Enliven
Expertise
Developer
Industry:
Gamedev
Houdini Engine
Availability
Not Specified
Recent Forum Posts
Matching IK/FK on KineFx Feb. 22, 2022 17:15
return _hou.HDAModule___getattr__(self, name)
AttributeError: 'module' object has no attribute 'matchIK'
First off, this error indicates that your button can't find the specified function.
In the callback script of the button, you can use hou.phm() to access the HDA's Python module, so the call becomes:
hou.phm().matchIK()
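For reference, the matchIK function would live in the HDA's PythonModule (Type Properties > Scripts tab). A minimal sketch, assuming the function receives the kwargs dict that Houdini passes to button callbacks (the function body here is purely illustrative):

```python
# Hypothetical contents of the HDA's PythonModule (Scripts tab).
# hou.phm() resolves to this module, which is why the callback
# hou.phm().matchIK() only works if a function named matchIK exists here.

def matchIK(kwargs):
    """Illustrative stub: look up a node inside the HDA and act on it."""
    hda = kwargs["node"]                   # the HDA instance that owns the button
    compute = hda.node("computerigpose1")  # path relative to the HDA itself
    if compute is None:
        raise ValueError("computerigpose1 not found inside the HDA")
    return compute
```

In the button's callback field, kwargs is available automatically, so hou.phm().matchIK(kwargs) would pass it through.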
Second, you don't need to build the full path with f"{hou.pwd().path()}". The HDA is treated as the parent node of all nodes inside it, so you can use a simple relative path with hou.node():
compute = hou.node("computerigpose1")
compute = hou.node("./computerigpose1")
node = hou.node("subnet1/node1")
Low cost facial mocap pipeline? Jan. 11, 2022 4:03
You can't export the blended mesh because the Character Blendshapes node removes the relevant data. Instead, you need to feed the animated skeleton with the blendshape channels right into the FBX Character Output. The character's mesh needs to have the blendshapes as a hidden group of primitives, and that goes into the first input (mesh) of the FBX Character Output. Houdini 19 offers a more streamlined workflow for this.
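As a rough Python sketch of that wiring (the node type name "rop_fbxcharacteroutput" and the input indices are assumptions; verify them against your Houdini build):

```python
# Hypothetical sketch: feed the mesh (with blendshapes as a hidden
# primitive group) and the animated skeleton into an FBX Character Output,
# instead of exporting the already-blended mesh.

def wire_fbx_character_output(parent, mesh_sop, skeleton_sop):
    """Create and wire an FBX Character Output node (illustrative)."""
    out = parent.createNode("rop_fbxcharacteroutput")  # node type name assumed
    out.setInput(0, mesh_sop)      # first input: mesh carrying the blendshapes
    out.setInput(1, skeleton_sop)  # input index for the skeleton assumed
    return out
```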
Question 1:
Unreal supports both joints and blendshapes, so it is up to you to decide what combination you want to use. Joints typically require a bit more setup to create facial animation than blendshapes, so you should ask your riggers/animators what they prefer.
Question 2:
You can buy one of the latest iPhones to use Epic's free Unreal Live Face app, which allows you to stream basic facial animation into Unreal. This app uses Apple's predefined facial blendshapes, so the blendshape animations can be applied to any compatible character (such as from Reallusion's Character Creator).
Other than that, you could opt for the quite expensive Faceware software called Studio, which basically does the same thing as Unreal's Live Face App, in combination with their hardware setup. This can be streamed into Unreal as well.
The lips are tracked by the software. The teeth shouldn't move because they are attached to bones. The tongue is usually only tracked when it is pushed out. If you want accurate tongue movement, an animator will probably have to manually add it, unless you decide to use lipsync software.
Integrating Pycharm with houdini tutorial? Oct. 18, 2021 4:05
Here is how I got the PyCharm integration working, including autocomplete:
- Go to your Python Interpreter settings (in the bottom right, or File>Settings>Project>Python Interpreter).
- Click the gear icon in the upper right corner to show all your interpreters.
- In the top left corner of the new window, click the "+" button to add a new interpreter.
- Pick the System Interpreter and then browse to the path of your Houdini installation that has the Python executable that you want to use (2.7 vs 3.7 for example). The path probably looks something like this: C:\Program Files\Side Effects Software\Houdini 18.x.xxx\python27\python2.7.exe.
- After you've pressed OK in the window, you will go back to the window with the list of interpreters. Select your new interpreter and click the 5th button from the top left to "Show paths for the selected interpreter".
- By default it should have some paths to libraries already. However, the specific hython libs are probably not in the list, so you need to add that by pressing the "+" in the top left corner. The path should look something like this: C:\Program Files\Side Effects Software\Houdini 18.x.xxx\houdini\python2.7libs (or 3.7libs if you want to use 3.7). These libs will give you the autocomplete functionality for the HOM classes.
- Last but not least, for each Houdini Python project you need to set the interpreter to this new interpreter. And in order to get the autocomplete working properly, you need to import the hou module at the start of each file where you want to use it. It is as simple as import hou, and PyCharm should recognize the module from the interpreter.
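The pythonX.Ylibs path from the steps above can also be assembled programmatically, which is handy if you switch Houdini versions often. A minimal sketch (the install root is a placeholder, not a real path; %-formatting keeps it Python 2.7 compatible):

```python
import os

def houdini_libs_path(install_root, py_version="2.7"):
    """Build the hou-module search path for a given Houdini install,
    e.g. <install_root>/houdini/python2.7libs
    """
    return os.path.join(install_root, "houdini", "python%slibs" % py_version)
```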
From there on, you can use the hou module the same way you would inside Houdini, for example:
import hou
hou.node("../..")
hou.pwd().geometry()
I hope this helps! I couldn't attach pictures because that would be quite a few pictures, so I hope that my instructions are clear enough to follow. If you need any further help, feel free to reach out to me.