Nick Petit

npetit

My Talks

Houdini HIVE
New RBD Workflow Tools in Houdini 18

Recent Forum Posts

Solaris RBD production workflow May 14, 2024, 5:02 p.m.

As long as the name attribute is derived from an existing prim on the stage, that prim will be replaced by whatever geo you give it (or, when choosing the animated xforms option, its geometry attribs will be muted and child mesh prims will be created underneath it). So, as you have discovered, you can happily abuse that to replace any geo on the stage with whatever you want.
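To make that concrete, here's a minimal Python SOP sketch of the idea; the /World/geo/crate prim path is just a hypothetical stand-in for a prim that already exists on your stage:

```python
# Python SOP: point the "name" primitive attribute at an existing stage prim.
# The prim path below is hypothetical - use a path that exists on your stage.
import hou

node = hou.pwd()
geo = node.geometry()

target_prim_path = "/World/geo/crate"  # assumption: an existing prim on the stage

# Create the string primitive attribute if it doesn't exist yet.
name_attrib = geo.findPrimAttrib("name")
if name_attrib is None:
    name_attrib = geo.addAttrib(hou.attribType.Prim, "name", "")

# Every primitive carrying this name will be imported over the target prim.
for prim in geo.prims():
    prim.setAttribValue(name_attrib, target_prim_path)
```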

As for pushing the sim back onto existing pre-fractured geo, it depends on how that pre-fractured geo lives on the stage; here are a couple of options.

Solaris RBD production workflow May 14, 2024, 1:34 a.m.

It really depends on what it is you're trying to do. The RBD Destruction LOP is designed as a template for ingesting data from the stage and destroying it, with special care taken to place everything back in the same hierarchy to minimise primvar and shader work after the sim - if you keep everything in the same hierarchy, primvars and shader bindings are inherited.
You don't have to do the sim in there, however - object merge the geo coming out of the first input in the embedded editable subnet, fracture it in regular SOPs, sim it, etc. You can then object merge that simmed geo back into the editable subnet and pipe it into the appropriate output node. You may want to object merge your simmed points too if you plan on using the procedural.
The reason you may want to import the geo from the RBD Destruction LOP is that it prepares the name attribute for you, making it easier to track where each piece of geo originally came from. From there, any fracturing method that simply appends to the existing name attrib will work fine.
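As a rough illustration of "appending to the existing name attrib", here's a small Python SOP sketch for the case where you're rolling your own fracture and need to do the appending manually. It assumes you've run a Connectivity SOP (Primitive mode) after fracturing so there's a class attribute to build the suffix from, and the _piece<N> naming scheme is just an example:

```python
# Python SOP placed after fracturing plus a Connectivity SOP (Primitive mode):
# append the connectivity class to the name the RBD Destruction LOP prepared,
# so each piece keeps its source prim path as a prefix.
import hou

node = hou.pwd()
geo = node.geometry()

name_attrib = geo.findPrimAttrib("name")
class_attrib = geo.findPrimAttrib("class")
if name_attrib is None or class_attrib is None:
    raise hou.NodeError("Expected 'name' and 'class' primitive attributes.")

for prim in geo.prims():
    original = prim.attribValue(name_attrib)
    piece_id = prim.attribValue(class_attrib)
    # The "_piece<N>" suffix scheme is an assumption; anything that only
    # appends to the existing value preserves the mapping to the source prim.
    prim.setAttribValue(name_attrib, "{}_piece{}".format(original, piece_id))
```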


On the other hand, if you're adding new destruction elements to the stage, things can be a lot simpler since you don't have to be so careful with prim paths and their final destination.


A few things to keep in mind, however, performance-wise:
  1. Point Instancers bring little to no benefit when adding fractured pieces. You end up creating just as many prototype prims as you have pieces, and you may run into precision loss issues since PI transforms are authored as floats and not doubles (see the precision check sketch after this list). PIs only really come in handy when instancing debris, for example.

  2. Bringing in fractured pieces as unique prims - this has the advantage that the transforms can be authored cheaply as transforms on the geometry's (parent) prim, saving on disk space. However, having many thousands of prims on the stage in the same hierarchy will hurt the stage's performance. (transform blur)

  3. Bringing in the fractured geometry as single prims with deforming points can minimise the impact on the stage; however, disk space consumption will be negatively impacted. (deformation blur)

  4. Bringing in the fractured geometry as single prims and using USDSkel to deform that geometry is one way of minimising disk space and impact on the stage. However, these can be tricky to create: the bindings aren't trivial, and vertex normals aren't currently supported.

  5. Bringing in the fractured geometry as single prims and using a procedural to deform the pieces at render time is the most efficient way of handling large quantities of fractured RBDs, especially if reading the point caches off disk as bgeo caches.
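On the float precision point in 1), here's a standalone pxr Python sketch (hypothetical prim paths) that prints the value types USD uses: PointInstancer positions and orientations are single/half precision, while a transform op on a regular Xform prim is a double-precision matrix:

```python
# Compare the precision USD offers for the two approaches.
from pxr import Usd, UsdGeom

stage = Usd.Stage.CreateInMemory()

# Point Instancer route: positions are float3, orientations are half-float quaternions.
instancer = UsdGeom.PointInstancer.Define(stage, "/World/pieces_instancer")
print(instancer.CreatePositionsAttr().GetTypeName())     # point3f[]
print(instancer.CreateOrientationsAttr().GetTypeName())  # quath[]

# Per-piece prim route: the transform op is a full double-precision matrix.
piece = UsdGeom.Xform.Define(stage, "/World/piece_0")
xform_op = piece.AddTransformOp()
print(xform_op.GetAttr().GetTypeName())                  # matrix4d
```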

For 1) there are several ways of doing this, including using the Transform by SOP Points LOP, or importing the pieces as static geo first and merging in their transforms.
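For the "import as static geo and merge in the transforms" route, the end result on the stage looks roughly like this pxr Python sketch: a time-sampled, double-precision transform op authored on each piece prim (the prim path, frame range and matrices here are made up):

```python
# Author a time-sampled transform on a piece prim - this is roughly what
# merging RBD transforms onto statically imported pieces produces on the stage.
from pxr import Usd, UsdGeom, Gf

stage = Usd.Stage.CreateInMemory()
piece = UsdGeom.Xform.Define(stage, "/World/destruction/piece_0")
xform_op = piece.AddTransformOp()

for frame in range(1001, 1011):
    # In production these matrices would come from the simmed RBD points.
    matrix = Gf.Matrix4d(1.0).SetTranslate(Gf.Vec3d(0.0, 0.1 * (frame - 1001), 0.0))
    xform_op.Set(matrix, Usd.TimeCode(frame))

stage.GetRootLayer().Export("pieces_transforms.usda")
```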

Here's an example hip file with a few examples of how to bring RBDs into LOPs.

In the next version of Houdini, the RBD Procedural LOP has been broken out into its own node, so you can use it without having to go through the RBD Destruction LOP.

Motion vector AOV in Karma April 26, 2024, 10:54 p.m.

On the Karma Render Settings LOP node, if you turn off Rendering > Camera Effects > Disable Image Blur, do you get the expected motion blur?
Are you rendering with XPU or CPU? Only CPU supports motion vectors at the moment.

Here's a hipfile that shows it working.