How to use ONNX Inference

Describes how to apply inference using a model in the ONNX Inference node.

Overview

The ONNX Inference COP performs inference with a pre-trained model: it evaluates the node’s inputs and generates the node’s outputs from the model’s results. The model itself also has inputs and outputs, known as tensors (multi-dimensional data). The following workflows outline how to use the ONNX Inference node to apply inference with one or more tensors.

Note

Model refers to the ONNX file and its inputs and outputs.

For more details about the node and its parameters, see ONNX Inference.
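You can inspect a model’s tensor names, shapes, and data types outside Houdini before wiring anything up. The sketch below is a minimal example using the onnxruntime Python package (an assumption; it is a separate install, not part of Houdini) on the mosaic-9.onnx file used in the examples below. Setup Shapes from Model reads this same information to populate the node.

    # Inspect an ONNX model's input and output tensors (run outside Houdini).
    # Assumes onnxruntime is installed (pip install onnxruntime) and that
    # mosaic-9.onnx is in the current directory.
    import onnxruntime as ort

    session = ort.InferenceSession("mosaic-9.onnx", providers=["CPUExecutionProvider"])

    for tensor in session.get_inputs():
        print("input :", tensor.name, tensor.shape, tensor.type)
    for tensor in session.get_outputs():
        print("output:", tensor.name, tensor.shape, tensor.type)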

How to apply inference with one tensor

  1. Download an ONNX model and save it in your $HIP directory. This example uses the Mosaic model.

  2. Create an ONNX Inference COP in your Copernicus network.

  3. Set the Model File parameter to the path of your ONNX model file (mosaic-9.onnx in this example).

  4. Click Setup Shapes from Model. This populates most of the node’s parameters, such as the Name, Data, and Tensor Shapes parameters on the Model tab.

  5. Add a File COP in the network.

  6. In the File COP, set Channel Type to the input type that the model expects (RGB in this example).

  7. Wire the File COP into the ONNX Inference COP.

    Note

    A warning appears if the input image’s size doesn’t match the model’s expected input size. You can resolve this issue in the ONNX Inference COP. Turn on Resample Size in the Input & Output tab and set it to the model’s expected size for an input image (224, 224 in this example).

  8. (Optional) If your model output is too bright, turn on the output’s Brightness Multiplier in the Input & Output tab and set it to 1/255. (The sketch after this procedure shows the equivalent scaling outside Houdini.)
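To sanity-check the model outside Houdini, the sketch below runs the same single-tensor workflow with onnxruntime and NumPy (both assumptions, installed separately). It assumes the Mosaic model from the ONNX model zoo, which takes one float32 tensor of shape (1, 3, 224, 224) with pixel values on a roughly 0-255 scale; that scale is why a Brightness Multiplier of 1/255 brings the result back into display range.

    # Standalone sketch of the single-tensor workflow (run outside Houdini).
    # Assumes mosaic-9.onnx from the ONNX model zoo, onnxruntime, and NumPy.
    import numpy as np
    import onnxruntime as ort

    session = ort.InferenceSession("mosaic-9.onnx", providers=["CPUExecutionProvider"])
    input_name = session.get_inputs()[0].name  # query the tensor name rather than hardcoding it

    # Stand-in for the File COP: a random RGB image at the model's expected 224 x 224 size.
    image = (np.random.rand(224, 224, 3) * 255.0).astype(np.float32)

    # HWC -> NCHW, matching the (1, 3, 224, 224) shape the model reports.
    tensor = image.transpose(2, 0, 1)[np.newaxis, ...]

    # The Mosaic model has a single output, so the result list unpacks to one array.
    (stylized,) = session.run(None, {input_name: tensor})

    # Equivalent of the optional Brightness Multiplier of 1/255: scale the
    # roughly 0-255 output back into a 0-1 display range.
    result = np.clip(stylized[0].transpose(1, 2, 0) / 255.0, 0.0, 1.0)
    print(result.shape, result.min(), result.max())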

How to apply inference with multiple tensors

  1. Download an ONNX model and save it in your $HIP directory. This example uses the Mosaic model.

  2. Create an ONNX Inference COP in your Copernicus network.

  3. Set the Model File parameter to the path of your ONNX model file (mosaic-9.onnx in this example).

  4. Click Setup Shapes from Model. This populates most of the node’s parameters, such as the Name, Data, and Tensor Shapes parameters on the Model tab.

  5. Add a File COP in the network.

  6. In the File COP, set Channel Type to the input type that the model expects (RGB in this example).

  7. Add a Channel Split COP in the network.

  8. Wire the File COP into the Channel Split COP.

  9. In the ONNX Inference COP’s Input & Output tab, make sure there’s an input for each channel using the Number of Inputs parameter (three in this example).

  10. Set the Type for each input in the Input & Output tab to the data type the channels require (Mono in this example).

  11. Set the Name parameters for each input. In this example, set the following:

    • first input to inputR

    • second input to inputG

    • third input to inputB

  12. Wire the Channel Split COP’s outputs into the ONNX Inference COP’s inputs. In this example:

    • red into inputR

    • green into inputG

    • blue into inputB

    Note

    A warning appears if the input image’s size doesn’t match the model’s expected input size. You can resolve this issue in the ONNX Inference COP. Turn on Resample Size for each input in the Input & Output tab and set them to the model’s expected size for an input image (224, 224 in this example).

  13. In the ONNX Inference COP’s Model tab, set Data to the space-separated names of all the inputs from the Input & Output tab (inputR inputG inputB in this example). The sketch after this procedure illustrates how these planes map onto the model’s tensor.

  14. (Optional) If your model output is too bright, turn on the output’s Brightness Multiplier in the Input & Output tab and set it to 1/255.
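In this workflow the three node inputs all feed the model’s single image tensor; listing them in the Data parameter tells the node which planes to assemble. The sketch below illustrates the equivalent operation outside Houdini with onnxruntime and NumPy (both assumptions), stacking three mono channels into one (1, 3, 224, 224) tensor. The stacking order shown is an assumption that mirrors the order listed in Data.

    # Rough standalone illustration of the split-channel workflow (run outside Houdini).
    # Assumes mosaic-9.onnx, onnxruntime, and NumPy; the stacking order mirrors
    # the "inputR inputG inputB" order set in the Data parameter.
    import numpy as np
    import onnxruntime as ort

    session = ort.InferenceSession("mosaic-9.onnx", providers=["CPUExecutionProvider"])
    input_name = session.get_inputs()[0].name

    # Stand-ins for the Channel Split COP's mono planes wired into inputR, inputG, and inputB.
    red = (np.random.rand(224, 224) * 255.0).astype(np.float32)
    green = (np.random.rand(224, 224) * 255.0).astype(np.float32)
    blue = (np.random.rand(224, 224) * 255.0).astype(np.float32)

    # Stack the mono planes, in the order listed in Data, into one (1, 3, 224, 224) tensor.
    tensor = np.stack([red, green, blue], axis=0)[np.newaxis, ...]

    (stylized,) = session.run(None, {input_name: tensor})
    print(stylized.shape)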
