Houdini 20.5 Copernicus

Hatching

Describes how to use hatching in your Copernicus network.

Overview

You can use Copernicus nodes together with Content Library HDAs to add hatching to your scene. Hatching adds a drawing effect and changes the light gradient and density of your shading.

See the Content Library to access and download the hatching HDAs. The project file includes the following HDAs:

  • Hatching

    The Hatching HDA merges hatching layers to apply a crosshatching effect.

  • Hatch Tile

    With the Hatch Tile HDA, you can apply horizontal, vertical, and cross hatching. You can then control the density, scale, and length of the hatches. The Hex Tile COP in the HDA node’s subnetwork controls the Hatches Tiling and Tiling Blend parameters. You can also point the Hatching Shape parameter to a SOP, which applies a geometry shape to your Hatch Tile (see the sketch after this list).

  • Tangent

    The Tangent HDA computes gradients that describe how the light changes and distributes across the scene.
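
For example, you can point the Hatch Tile HDA’s Hatching Shape parameter at a SOP from Python as well as in the parameter editor. The following is a minimal sketch; the node path and the internal parameter name hatchingshape are assumptions, so check the HDA’s parameter interface for the real names.

    import hou

    # Minimal sketch: point a Hatch Tile HDA at a SOP shape.
    # The node path and the "hatchingshape" parameter name are assumptions.
    hatch = hou.node("/obj/copnet1/hatch_tile1")
    hatch.parm("hatchingshape").set("/obj/hatch_shape_geo/OUT_shape")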

The following are examples of different hatching types you can apply:

  • Crosshatching

  • Circulism

  • Contouring

  • Scribbling

  • Stippling

  • Blending

See the Content Library for more complex hatching workflows, such as using contour lines and color.

Adding a Hatch Tile

Follow these steps to add a Hatch Tile HDA node in your scene.

  1. Create a File COP in your COP network to import an image or video.

  2. Add a Hatch Tile HDA node and configure the hatching type.

  3. Wire the File COP’s C output into the Hatch Tile node’s camera_ref input.

  4. (Optional) Add more Hatch Tile nodes and wire the File COP’s C output into each node’s camera_ref input.
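
If you prefer to build this network with Python, the following sketch mirrors the steps above using Houdini’s hou module. The network path /obj/copnet1, the internal operator names (file, hatch_tile), and the filename parameter name are assumptions; check the actual names in your install. The camera_ref input and C output names come from the steps.

    import hou

    # Minimal sketch, assuming a Copernicus network already exists at /obj/copnet1
    # and that the Content Library HDA's internal operator name is "hatch_tile".
    cop = hou.node("/obj/copnet1")

    # Step 1: File COP that imports the image or video ("filename" parm name assumed).
    file_cop = cop.createNode("file", "source_image")
    file_cop.parm("filename").set("$HIP/render/scene.exr")

    # Steps 2-3: Hatch Tile HDA, with the File COP's C output driving camera_ref.
    hatch = cop.createNode("hatch_tile", "hatch_tile1")
    hatch.setNamedInput("camera_ref", file_cop, "C")

    # Step 4 (optional): additional Hatch Tile nodes share the same camera_ref source.
    hatch2 = cop.createNode("hatch_tile", "hatch_tile2")
    hatch2.setNamedInput("camera_ref", file_cop, "C")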

Blending Hatch Tiles

Follow these steps to blend multiple Hatch Tile HDA nodes.

  1. After you add your Hatch Tiles, add a Constant COP, Ramp COP, and Sequence Blend COP.

  2. Wire the Constant COP into each Hatch Tile node’s direction input.

  3. Wire the File COP’s C output into the Ramp COP’s size_ref input.

  4. Wire the Ramp COP into the Sequence Blend COP’s blend input.

  5. Wire the Hatch Tile nodes into the Sequence Blend COP’s image inputs. There should be an image input for each Hatch Tile node.
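
The blend wiring can also be scripted. The following sketch continues the sketch from the previous section and assumes the Constant, Ramp, and Sequence Blend COPs use the internal type names constant, ramp, and seqblend, and that the Sequence Blend’s image inputs are its ordered multi-inputs; verify these against the Tab menu. The direction, size_ref, and blend input names come from the steps.

    import hou

    # Minimal sketch, continuing from the Hatch Tile setup above.
    cop = hou.node("/obj/copnet1")
    file_cop = cop.node("source_image")
    hatch_tiles = [cop.node("hatch_tile1"), cop.node("hatch_tile2")]

    # Step 1: Constant, Ramp, and Sequence Blend (type names are assumptions).
    constant = cop.createNode("constant", "hatch_direction")
    ramp = cop.createNode("ramp", "blend_ramp")
    seq = cop.createNode("seqblend", "blend_hatches")

    # Steps 2 and 5: the Constant drives each Hatch Tile's direction input, and
    # each Hatch Tile feeds one of the Sequence Blend's image inputs.
    for tile in hatch_tiles:
        tile.setNamedInput("direction", constant, 0)
        seq.setNextInput(tile)

    # Steps 3-4: the File COP sizes the Ramp, and the Ramp drives the blend.
    ramp.setNamedInput("size_ref", file_cop, "C")
    seq.setNamedInput("blend", ramp, 0)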

Applying uniform direction hatching

Follow these steps to blend your hatches by the direct diffuse lighting and apply uniform direction hatching, which means all the hatches go in the same direction.

  1. Create a File COP in your COP network to import an image or video.

  2. Add as many Hatch Tile HDA nodes as you want to blend, and configure the hatching type for each.

  3. Add a Mono COP.

  4. Wire the File COP’s C output into each Hatch Tile node’s camera_ref input and the Mono COP’s source input.

  5. Add a Channel Join COP and set Signature to UV.

  6. Wire the Mono COP into the Channel Join COP’s red and green inputs.

  7. Add a Constant COP. This COP’s values determine the hatching direction.

  8. Wire the Channel Join COP into the Constant COP’s source input.

  9. Wire the Constant COP into each Hatch Tile node’s direction input.

  10. Add a Sequence Blend COP.

  11. Wire the Hatch Tile nodes into the Sequence Blend COP’s image inputs. There should be an image input for each Hatch Tile node.

  12. Add another Mono COP and rename it to Direct Diffuse.

  13. Wire the File COP’s directdiffuse AOV output into the Direct Diffuse node.

  14. Add a Remap COP.

  15. Wire the Direct Diffuse node into the Remap COP’s source input.

  16. Wire the Remap COP into the Sequence Blend COP’s blend input. Turn on the Sequence Blend COP’s display flag to see the blended hatches.
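
The distinctive wiring in this workflow is the Channel Join that turns the mono image into a UV-signature direction, and the Remap that drives the blend by direct diffuse. The following sketch assumes the same /obj/copnet1 network and Hatch Tile node names as the earlier sketches, the internal type names mono, channeljoin, remap, and seqblend, a signature parameter with a uv token on the Channel Join, and a setDisplayFlag method on Copernicus nodes; verify all of these against the real nodes.

    import hou

    cop = hou.node("/obj/copnet1")
    file_cop = cop.node("source_image")
    hatch_tiles = [cop.node("hatch_tile1"), cop.node("hatch_tile2")]

    # Steps 3-6: mono image joined into a UV signature ("signature" parm name
    # and "uv" token are assumptions).
    mono = cop.createNode("mono", "mono1")
    mono.setNamedInput("source", file_cop, "C")
    chjoin = cop.createNode("channeljoin", "uv_join")
    chjoin.parm("signature").set("uv")
    chjoin.setNamedInput("red", mono, 0)
    chjoin.setNamedInput("green", mono, 0)

    # Steps 7-9: the Constant defines the uniform hatching direction.
    direction = cop.createNode("constant", "hatch_direction")
    direction.setNamedInput("source", chjoin, 0)
    for tile in hatch_tiles:
        tile.setNamedInput("direction", direction, 0)

    # Steps 10-11: the Sequence Blend collects the Hatch Tiles.
    seq = cop.createNode("seqblend", "blend_hatches")
    for tile in hatch_tiles:
        seq.setNextInput(tile)

    # Steps 12-16: the remapped direct diffuse AOV drives the blend.
    diffuse = cop.createNode("mono", "direct_diffuse")
    diffuse.setNamedInput("source", file_cop, "directdiffuse")
    remap = cop.createNode("remap", "diffuse_remap")
    remap.setNamedInput("source", diffuse, 0)
    seq.setNamedInput("blend", remap, 0)
    seq.setDisplayFlag(True)  # view the blended hatches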

Applying non-uniform direction hatching

Follow these steps to add non-uniform direction hatching, which lets you control how the hatches follow the lighting gradient. This workflow reduces the resolution of the image, computes the gradients, and then scales the gradients back up to the image resolution.

  1. Create a File COP in your COP network to import an image or video.

  2. Add a Hatch Tile HDA node and configure the hatching type.

  3. Wire the File COP’s C output into the Hatch Tile node’s camera_ref input.

  4. Add a Null COP.

  5. Wire the File COP’s directdiffuse AOV output into the Null COP.

  6. Add a Denoise AI COP. This reduces the differences between neighboring pixels (noise).

  7. Wire the Null COP into the Denoise AI COP’s source input.

  8. Add a Mono COP and rename it to Direct Diffuse.

  9. Wire the Denoise AI COP into the Direct Diffuse node.

  10. Add a Tangent HDA node. Set the Pixel Scale parameter to 2 to further reduce the noise.

    Note

    If you set the Tangent node’s Pixel Scale parameter to 1, the viewport displays the original image.

  11. Wire the Direct Diffuse node into the Tangent node.

  12. Wire the Tangent node into the Hatch Tile node’s direction input. Turn on the Hatch Tile node’s display flag to see the hatches distribute around the lighting.
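
The key chain in this workflow is directdiffuse → Denoise AI → Mono → Tangent → direction. The following sketch again assumes /obj/copnet1 and the node names from the first sketch, plus the internal type names null, denoiseai, and tangent and a pixelscale parameter on the Tangent HDA; verify all of these in your install.

    import hou

    cop = hou.node("/obj/copnet1")
    file_cop = cop.node("source_image")
    hatch = cop.node("hatch_tile1")

    # Steps 4-5: Null carrying the directdiffuse AOV ("input" name assumed).
    aov_null = cop.createNode("null", "diffuse_aov")
    aov_null.setNamedInput("input", file_cop, "directdiffuse")

    # Steps 6-9: denoise, then convert to mono.
    denoise = cop.createNode("denoiseai", "denoise_diffuse")
    denoise.setNamedInput("source", aov_null, 0)
    diffuse = cop.createNode("mono", "direct_diffuse")
    diffuse.setNamedInput("source", denoise, 0)

    # Steps 10-12: the Tangent computes the lighting gradient and drives direction.
    tangent = cop.createNode("tangent", "lighting_gradient")
    tangent.parm("pixelscale").set(2)  # Pixel Scale 2 further reduces the noise
    tangent.setNamedInput("input", diffuse, 0)
    hatch.setNamedInput("direction", tangent, 0)
    hatch.setDisplayFlag(True)  # view the hatches following the lighting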

Combining direction hatching

Follow these steps to combine uniform and non-uniform direction hatching. For example, you can add uniform direction hatching to the background and non-uniform direction hatching to characters in your scene.

  1. Create a File COP in your COP network to import an image or video.

  2. Add as many Hatch Tile HDA nodes as you want to blend, and configure the hatching type for each.

  3. Add a Mono COP and rename it to Direct Diffuse.

  4. Wire the File COP’s directdiffuse AOV output into the Direct Diffuse node.

  5. Wire the Direct Diffuse node into each Hatch Tile node’s camera_ref input.

  6. Add a Sequence Blend COP.

  7. Wire the Hatch Tile nodes into the Sequence Blend COP’s image inputs. There should be an image input for each Hatch Tile node.

  8. Add a Remap COP.

  9. Wire the Direct Diffuse node into the Remap COP’s source input.

  10. Wire the Remap COP into the Sequence Blend COP’s blend input.

  11. Add another Mono COP and rename it to Direct Diffuse.

  12. Wire the File COP’s directdiffuse AOV output into the Direct Diffuse node.

  13. Add a Tangent HDA node. Set the Pixel Scale parameter to 2 to further reduce the noise.

  14. Wire the Direct Diffuse node into the Tangent node.

  15. Add a Null COP and rename it to LIGHTING_DIRECTION.

  16. Wire the Tangent node into the LIGHTING_DIRECTION Null COP.

  17. Add a Constant COP and two Ramp COPs.

  18. Wire the Constant COP into each Ramp COP’s size_ref input.

  19. Add two more Tangent HDA nodes.

  20. Wire each Ramp COP into one Tangent node. The Ramp COPs should wire into different Tangent nodes.

  21. Add a Blend COP.

  22. Wire one Tangent node into the Blend COP’s bg input, and the other Tangent node into the fg input.

  23. Add another Null COP and rename it to STATIC_DIRECTION.

  24. Wire the Blend COP into the STATIC_DIRECTION Null COP.

  25. Build a mask from the File COP’s CryptoMaterial AOV output that removes the objects to which you’ll apply non-uniform direction hatching.

  26. Add another Blend COP.

  27. Wire the LIGHTING_DIRECTION Null COP into the Blend COP’s bg input.

  28. Wire the STATIC_DIRECTION Null COP into the Blend COP’s fg input.

  29. Wire your mask into the Blend COP’s mask input.

  30. Wire the Blend COP into each Hatch Tile node’s direction input. Turn on the Sequence Blend COP’s display flag to see the output with uniform and non-uniform direction hatching applied.
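
The part that differs from the previous two workflows is the final Blend COP, which uses the Cryptomatte-based mask to choose between the LIGHTING_DIRECTION and STATIC_DIRECTION branches per object. The following sketch assumes those Null COPs, the mask node, the Hatch Tile nodes, and the Sequence Blend already exist under /obj/copnet1 with the names shown, and that the Blend COP’s internal type name is blend; all of these are assumptions.

    import hou

    cop = hou.node("/obj/copnet1")
    hatch_tiles = [cop.node("hatch_tile1"), cop.node("hatch_tile2")]
    lighting_dir = cop.node("LIGHTING_DIRECTION")  # non-uniform (Tangent) branch
    static_dir = cop.node("STATIC_DIRECTION")      # uniform (Constant/Ramp) branch
    mask_cop = cop.node("crypto_mask")             # assumed name of the mask node
    seq = cop.node("blend_hatches")                # Sequence Blend from earlier

    # Steps 26-29: mask-driven blend between the two direction sources.
    combine = cop.createNode("blend", "combine_directions")
    combine.setNamedInput("bg", lighting_dir, 0)
    combine.setNamedInput("fg", static_dir, 0)
    combine.setNamedInput("mask", mask_cop, 0)

    # Step 30: the blended direction drives every Hatch Tile.
    for tile in hatch_tiles:
        tile.setNamedInput("direction", combine, 0)
    seq.setDisplayFlag(True)  # view the combined result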

Notes

  • Since the Hatch Tile node uses a Hex Tile COP in the subnetwork, you must balance the scale of the hatches (Hatches Length) and the tiles (Tile Size). For example, an unbalanced scale may display seams between hex tiles. Use Weight Exp to balance the hatches and tiles.

  • The directdiffuse AOV is channel data taken from the render that represents direct lighting. You can use this data to compute tangent vectors that control the direction of hatching. You can also use it to create more complex lighting gradients, such as adding reflections. If you drive your Hatch Tiles with a directdiffuse AOV, make sure the direct diffuse doesn’t contain texture detail, so the textures don’t alter the hatches.

  • In animation, direction hatching can produce a noisy swimming effect. It’s recommended that you use non-uniform direction hatching only on objects that require a lighting gradient, and geometry-based or uniform direction hatching for other areas of your scene.
