Houdini Engine 7.0
Fundamentals

Environment Variables

Houdini Engine looks for the following environment variables when initializing:

  • HAPI_LICENSE_MODE:
    • Unset or set to default will have Houdini Engine look first for a Houdini Engine license and, if one is not found, try to acquire an interactive Houdini or Houdini FX license. This means that if licenses are shared on a common license server, a single individual (the same machine) can acquire both a Houdini FX license and a Houdini Engine license, even though Houdini Engine should technically be able to use that same Houdini FX license.
    • Set to engine_only will have Houdini Engine only accept Houdini Engine licenses and will fail the license check if only interactive Houdini or Houdini FX licenses are found.
    • Set to interactive_only will have Houdini Engine only accept interactive Houdini or Houdini FX licenses and will fail the license check if only Houdini Engine licenses are found.
    • Set to houdini_escape_only will have Houdini Engine only accept Houdini Escape licenses and will fail the license check even if other licenses, like Houdini FX or Houdini Engine, are found.
    • Set to houdini_fx_only will have Houdini Engine only accept Houdini FX licenses and will fail the license check even if other licenses, like Houdini Escape or Houdini Engine, are found.
  • HFS: (Houdini File System) Set this to the path to the Houdini install folder. This variable is used to locate the HAPI dlls when calling HAPI_Initialize() or HAPI_CreateInProcessSession().
  • HAPI_CLIENT_NAME: Set to the name of the client that invoked Houdini Engine: maya, 3dsmax, unreal, or unity. It is not set in Houdini itself. Your HDA can use this variable to check whether it has been loaded by one of these clients.

Strings

When Houdini Engine wants to return a string to the host application it doesn't do it directly. Instead, it gives the calling function a HAPI_StringHandle to the string it wants to return. The host application can then call HAPI_GetStringBufLength() to get the size of the buffer required to hold the string. After it has allocated the appropriate size buffer, the string can be retrieved with a call to HAPI_GetString().
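A minimal sketch of that round trip, wrapped in a hypothetical helper (retrieve_string is not part of the API; it assumes a valid session pointer and a handle returned by some earlier call):

#include <HAPI/HAPI.h> // header location may vary by install
#include <string>
#include <vector>

static std::string retrieve_string(
    const HAPI_Session * session, HAPI_StringHandle string_handle )
{
    // Ask how large a buffer the string needs (includes the null terminator).
    int buffer_length = 0;
    HAPI_GetStringBufLength( session, string_handle, &buffer_length );

    // Allocate the buffer and fetch the actual characters.
    std::vector< char > buffer( buffer_length );
    HAPI_GetString( session, string_handle, buffer.data(), buffer_length );

    return std::string( buffer.data() );
}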

Return Codes and Error Strings

Every function in Houdini Engine returns a value of HAPI_Result. Any return value other than HAPI_RESULT_SUCCESS indicates some form of failure. You can get the specifics of the problem by first calling HAPI_GetStatusStringBufLength() and then HAPI_GetStatusString().

The status_type argument for both functions should be set to HAPI_STATUS_CALL_RESULT. These functions then give us a string with further information about the problem.

The sample function below will retrieve the status of most calls into the API, with the exception of the asynchronous calls like HAPI_CookNode() and HAPI_CreateNode():

static std::string get_last_error()
{
    // Query the buffer size needed for the status string.
    int buffer_length;
    HAPI_GetStatusStringBufLength(
        hapiTestSession,
        HAPI_STATUS_CALL_RESULT, HAPI_STATUSVERBOSITY_ERRORS,
        &buffer_length );

    // Allocate the buffer and retrieve the string itself.
    char * buf = new char[ buffer_length ];
    HAPI_GetStatusString(
        hapiTestSession, HAPI_STATUS_CALL_RESULT, buf, buffer_length );

    std::string result( buf );
    delete[] buf;
    return result;
}

It is advisable to wrap the function call in a macro such as the following:

#define ENSURE_SUCCESS( result ) \
if ( (result) != HAPI_RESULT_SUCCESS ) \
{ \
cout << "failure at " << __FILE__ << ":" << __LINE__ << endl; \
cout << get_last_error() << endl; \
exit( 1 ); \
}
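A call can then be checked inline. For example (session here is a placeholder for a valid HAPI_Session pointer your application already holds):

float scene_time = 0.0f;
ENSURE_SUCCESS( HAPI_GetTime( session, &scene_time ) );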

Cooks and other asynchronous calls use a different set of results, separate from the regular API call results. To get the results of an asynchronous function you need to call HAPI_GetStatusStringBufLength() with status_type set to HAPI_STATUS_COOK_RESULT and an appropriate HAPI_StatusVerbosity.

The sample function below will retrieve the status of an asynchronous call:

static std::string get_last_cook_error()
{
    // Query the buffer size needed for the cook status string.
    int buffer_length;
    HAPI_GetStatusStringBufLength(
        hapiTestSession,
        HAPI_STATUS_COOK_RESULT, HAPI_STATUSVERBOSITY_ERRORS,
        &buffer_length );

    // Allocate the buffer and retrieve the string itself.
    char * buf = new char[ buffer_length ];
    HAPI_GetStatusString(
        hapiTestSession, HAPI_STATUS_COOK_RESULT, buf, buffer_length );

    std::string result( buf );
    delete[] buf;
    return result;
}

Note that in threaded mode it will NOT be sufficient to check the return status of asynchronous functions like HAPI_CookNode() and HAPI_CreateNode(), as these will return immediately with HAPI_RESULT_SUCCESS. Instead, you must rely on the result of HAPI_GetStatus(), with HAPI_STATUS_COOK_STATE as the status_type. See section on Cooking.
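A minimal polling sketch, assuming a valid session pointer and that a cook has already been started:

int cook_status = 0;
HAPI_Result status_result = HAPI_RESULT_SUCCESS;
do
{
    // Anything above HAPI_STATE_MAX_READY_STATE means the cook is still
    // in progress; the "ready" states sit at or below it.
    status_result = HAPI_GetStatus(
        session, HAPI_STATUS_COOK_STATE, &cook_status );
}
while ( status_result == HAPI_RESULT_SUCCESS
    && cook_status > HAPI_STATE_MAX_READY_STATE );

if ( cook_status == HAPI_STATE_READY_WITH_COOK_ERRORS )
    std::cout << get_last_cook_error() << std::endl;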

Nodes will be parsed for node errors, warnings, and messages ONLY during the call to HAPI_GetStatusStringBufLength() and ONLY for the previously cooked node(s). It's up to you to decide when to ask for cook results and with what verbosity. The general rule of thumb is to ask when the cook has failed (HAPI_STATE_READY_WITH_COOK_ERRORS is returned as the READY cook state) or when no geometry, or invalid geometry, has been retrieved. Note that you can get output from Python code within your node using this mechanism: if the Python code raises warnings or errors, the text of the warning/error can be retrieved this way. To raise errors, you can simply throw or call hou.NodeError(); to raise warnings, you must call hou.NodeWarning().

Initialization and Cleanup

The first thing that needs to be done when using HAPI is to call HAPI_Initialize(). This function resides within libHAPI.dll (as do all other HAPI functions). Along with the rest of the Houdini dlls, libHAPI.dll is located inside:

C:\Program Files\Side Effects Software\Houdini <version>\bin

This bin folder needs to be on your PATH in order for all the dlls to load properly. Your application only needs to link against libHAPI.dll. For the Unity implementation, we determine where Houdini was installed using version information and the registry, and we augment the process environment just before making the call to:

HAPI_DECL HAPI_Initialize( const HAPI_Session * session,
                           const HAPI_CookOptions * cook_options,
                           HAPI_Bool use_cooking_thread,
                           int cooking_thread_stack_size,
                           const char * houdini_environment_files,
                           const char * otl_search_path,
                           const char * dso_search_path,
                           const char * image_dso_search_path,
                           const char * audio_dso_search_path );

You can choose any method that is appropriate for your application.

This is the first time you need to decide what type of session you wish to use. If you just want the default in-process session, pass NULL for the session argument. Otherwise, you must first create a session using one of the session creation APIs. See Sessions for more on sessions.

The cook_options will set the global HAPI_CookOptions for all subsequent cooks. You can also override these options for individual cooks when cooking explicitly, but some cooks rely on these global options. For example, if you create a node with the option to automatically cook it right after instantiation, that cook will use these global options. One option of particular importance is the HAPI_CookOptions::maxVerticesPerPrimitive setting. For example, if this were set to 3, then all output will be triangles. When set to 4, the output can be both triangles and quads. To comply with the settings, HAPI will convert more complex geometry into less complex forms that satisfy the constraints, through convexing. To avoid convexing altogether, set HAPI_CookOptions::maxVerticesPerPrimitive to -1.
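A minimal initialization sketch under these assumptions (session may be NULL for the default in-process session; the argument values shown are illustrative, not the only valid choices):

// Start from the default cook options, but disable convexing entirely.
HAPI_CookOptions cook_options = HAPI_CookOptions_Create();
cook_options.maxVerticesPerPrimitive = -1;

HAPI_Result result = HAPI_Initialize(
    session,    // NULL for the default in-process session
    &cook_options,
    true,       // use_cooking_thread
    -1,         // cooking_thread_stack_size: Houdini's default size
    nullptr,    // houdini_environment_files
    nullptr,    // otl_search_path
    nullptr,    // dso_search_path
    nullptr,    // image_dso_search_path
    nullptr );  // audio_dso_search_path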

The use_cooking_thread argument indicates whether HAPI should use a separate thread for cooking nodes. This is generally a good idea, as it frees up the calling thread for other tasks, such as displaying the current status of the evaluation to users.

Assuming you choose to use a separate cooking thread, cooking_thread_stack_size lets you choose the stack size of the evaluation thread in bytes. Generally, a larger value is advised, as some nodes are quite complex and require deep stack space to evaluate properly. For reference, Houdini internally reserves 64MB of stack space. Values smaller than this generally work, but if pushed to extremes (1-2MB) stack overflow issues can easily result. A value of -1 for this argument uses the Houdini default stack size. This is the recommended setting if your application can afford the space.

If you want absolute control over the environment in which HAPI runs, you can override some or all of it using the houdini_environment_files argument. This is a list of paths, separated by a ";" on Windows and a ":" on Linux and Mac, to .env files that follow the same syntax as the houdini.env file in Houdini's user prefs folder. These will be applied after the default houdini.env file and will overwrite the process' environment variable values. You can use this to enforce a stricter environment when running Houdini Engine. For more info, see: http://www.sidefx.com/docs/houdini20.5/basics/config_env
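For instance, on Windows the argument might look like this (the file paths are hypothetical):

// Two override files applied in order; on Linux and Mac the
// separator would be ":" instead of ";".
const char * houdini_environment_files =
    "C:/hapi/studio_overrides.env;C:/hapi/project_overrides.env";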

Some assets require sub-assets to work. The otl_search_path argument refers to the directory where additional OTL files would be searched for. For our Unity plugin, this is set to something similar to:

C:\Users\Public\Documents\Unity Projects\HAPI_unity_<version>\Assets\OTLs\Scanned

The dso_search_path argument basically sets the HOUDINI_DSO_PATH environment variable. It is there so Houdini Engine can take advantage of custom plugins that you may write. If you do not need to take advantage of this feature, you may simply pass NULL for this argument. You can read more about this variable here: http://www.sidefx.com/docs/hdk20.5/_h_d_k__intro__creating_plugins.html

The image_dso_search_path and audio_dso_search_path parameters are similar to dso_search_path but they set the dedicated HOUDINI_IMAGE_DSO_PATH and HOUDINI_AUDIO_DSO_PATH variables respectively. Image and audio plugins are searched for in separate locations from regular node plugins.

Once you make the call to HAPI_Initialize(), all required Houdini dlls will be loaded, and a Houdini scene will be setup underneath the covers. Because Houdini utilizes many dlls, and some of these dlls are public ones that your application may also be using, clashes could result. In general this one call is probably the most difficult part of the entire integration process, where we need to identify potentially clashing dlls and resolve them.

Another note is that HAPI_Initialize() will also be the point at which pythonrc.py, 123.py[cmd], and 456.py[cmd] scripts get run. All of these scripts can live in the Houdini20.5 folder in your home directory. In there, pythonrc.py should be in the pythonX.Ylibs folder matching the Python version your Houdini build ships with, while the 123 and 456 scripts should be in scripts. These are scripts you can use to define your environment variables, import helper modules, write to files, among other things. For Houdini Engine, it is recommended you use pythonrc.py. More information can be found here: http://www.sidefx.com/docs/houdini20.5/hom/independent

The mirror call to HAPI_Initialize() is HAPI_Cleanup(), which will cleanup the scene underneath. Note, HAPI_Cleanup() will not release any licenses acquired by the process. You'll need to restart the process to release the licenses.

Saving a HIP File

For debugging purposes, it can be very handy to save and load hip files (hip files are Houdini scene files). You can use HAPI_SaveHIPFile() to save the current underlying Houdini scene from Houdini Engine. You can then open this scene file in an interactive session of Houdini to see what Houdini Engine created from your API calls.

For further debugging help, you can use the lock_nodes argument to lock all SOP nodes before saving the scene file. This way, when you load the scene file you can see exactly the state of each SOP at the time it was saved, instead of relying on the re-cook to accurately reproduce the state. Locking all nodes like this does, however, take considerably more space and time.
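A minimal sketch, assuming a valid session pointer (the output path is just an example):

// Save the current Houdini Engine scene, locking all SOP nodes so their
// cooked state is baked into the file.
HAPI_Result result = HAPI_SaveHIPFile(
    session, "C:/temp/debug_scene.hip", true /* lock_nodes */ );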

Note that the way the loaded OTLs are tied to the saved Houdini scene differs depending on which APIs were used to load the OTLs. This is decided when loading an OTL and not when saving the scene using HAPI_SaveHIPFile(). See Asset Library Files for details on how to load asset libraries but basically:

  • If you called HAPI_LoadAssetLibraryFromFile(), the saved HIP file will only have an absolute path reference to the loaded OTL, meaning that if the OTL is moved or renamed the HIP file won't load properly. It also means that if you change the OTL using the saved HIP scene, you will be changing the same OTL file that was used with Houdini Engine.
  • Alternatively, if you called HAPI_LoadAssetLibraryFromMemory(), the saved HIP file will contain the loaded OTL as part of its Embedded OTLs. This means that you can safely move or rename the original OTL file and the HIP will continue to work, but if you make changes to the OTL while using the saved HIP, the changes won't be saved to the original OTL.

    If you do want to use the original OTL definition in a HIP file that has it embedded you can install your OTL using File > Install Digital Asset Library..., making sure the Give preference to assets in this library checkbox is checked. This will automatically replace any instances of the embedded version with the version you just installed.

You can see how OTLs are referenced by the saved HIP file in Houdini by going to Windows > Operator Type Manager, selecting the Operators tab, and expanding all the way to Operator Type Libraries > Current HIP File.

You can see where a specific asset is coming from in Houdini by going to Windows > Operator Type Manager, selecting the Configuration tab, and setting the Operator Type Bar to Display Menu of All Definitions. Then, selecting any asset node will display an Asset Name and Path section above all of its parameters. If you have multiple versions of the OTL loaded at the same time (one being embedded and one being referenced by path, for example) you can switch between them using these controls.

Getting and Setting Server-Side Environment Variables

You can get or set any arbitrary environment variable on the server-side process. For in-process sessions this will be the current process; otherwise this will be the HARS process, wherever that may live. You can do this via these APIs: HAPI_GetServerEnvInt(), HAPI_GetServerEnvString(), HAPI_SetServerEnvInt(), and HAPI_SetServerEnvString().

Here's an example of a simple set-and-get:

//
// Set some variables.
//
// Set the int value.
const char * int_var_name = "TEST_ENV_INT";
const int int_value = 42;
HAPI_Result result = HAPI_SetServerEnvInt( hapiTestSession, int_var_name, int_value );
HAPI_TEST_ASSERT( result == HAPI_RESULT_SUCCESS );
// Set the string value.
const char * str_var_name = "TEST_ENV_STR";
const char * str_value = "bar";
result = HAPI_SetServerEnvString(
    hapiTestSession, str_var_name, str_value );
HAPI_TEST_ASSERT( result == HAPI_RESULT_SUCCESS );
//
// Get them back.
//
// Get the int value.
int env_int = 0;
result = HAPI_GetServerEnvInt( hapiTestSession, "TEST_ENV_INT", &env_int );
HAPI_TEST_ASSERT( result == HAPI_RESULT_SUCCESS );
HAPI_TEST_ASSERT( env_int == int_value );
// Get the string handle.
HAPI_StringHandle env_str_SH = 0;
hapiTestSession, "TEST_ENV_STR", &env_str_SH );
HAPI_TEST_ASSERT( result == HAPI_RESULT_SUCCESS );
// Get the string value.
int env_str_buffer_len = 0;
char env_str_buffer[ 1000 ];
result = HAPI_GetStringBufLength(
    hapiTestSession, env_str_SH, &env_str_buffer_len );
HAPI_TEST_ASSERT( result == HAPI_RESULT_SUCCESS );
result = HAPI_GetString(
hapiTestSession, env_str_SH, env_str_buffer, env_str_buffer_len );
HAPI_TEST_ASSERT( result == HAPI_RESULT_SUCCESS );
const std::string env_str = env_str_buffer;
HAPI_TEST_ASSERT( env_str == str_value );

Check for Specific Errors

You can check for specific errors by error code across an entire node tree. These checks can be expensive, so a separate function was created so the caller can determine when the cost is worth the benefit.

The function is HAPI_CheckForSpecificErrors(). It takes a bitfield (HAPI_ErrorCodeBits) of HAPI_ErrorCode values that specifies which errors to look for and then returns a HAPI_ErrorCodeBits bitfield that specifies which errors were found. It does this by recursively looking at each node's error messages for all the nodes under the given HAPI_NodeId.

Here's an example that checks for HAPI_ERRORCODE_ASSET_DEF_NOT_FOUND warnings:

// Load the library from file.
HAPI_AssetLibraryId library_id = -1;
HAPI_Result result = HAPI_LoadAssetLibraryFromFile(
    hapiTestSession,
    "HAPI_Test_Fundamentals_Warnings_AssetNotFound.otl",
    false, &library_id );
HAPI_TEST_ASSERT( result == HAPI_RESULT_SUCCESS );
// Instantiate the asset.
HAPI_NodeId node_id = -1;
result = HAPI_CreateNode(
hapiTestSession, -1,
"Object/HAPI_Test_Fundamentals_Warnings_AssetNotFound",
nullptr, true, &node_id );
HAPI_TEST_ASSERT( result == HAPI_RESULT_SUCCESS );
// Get specific error.
HAPI_ErrorCodeBits errors_to_look_for =
    HAPI_ERRORCODE_ASSET_DEF_NOT_FOUND;
HAPI_ErrorCodeBits errors_found = 0;
result = HAPI_CheckForSpecificErrors(
    hapiTestSession, node_id,
    errors_to_look_for, &errors_found );
HAPI_TEST_ASSERT( result == HAPI_RESULT_SUCCESS );
HAPI_TEST_ASSERT( errors_found & HAPI_ERRORCODE_ASSET_DEF_NOT_FOUND );