morganv123
March 9, 2021 11:41:35
Hello,
Hello,
I've been working on photo mosaics. My HIP file below reads in a file (made with Python in another HIP). Each prim references a texture file, so in a big loop I load each texture into "Attribute from Map", promote Cd to a detail attribute with the 'average' setting, and write the result out to disk. Essentially, each image on my disk now has an average color associated with it.
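For reference, the per-image statistic being computed here (Cd promoted to a detail attribute with the 'average' setting) is just the mean of all sampled pixel colors. A minimal pure-Python sketch, assuming the pixels have already been decoded into (r, g, b) tuples (the image decoding itself is omitted):

```python
def average_color(pixels):
    """Mean RGB over an iterable of (r, g, b) tuples, i.e. the value
    that promoting Cd to a detail attribute with 'average' produces."""
    r = g = b = 0.0
    n = 0
    for pr, pg, pb in pixels:
        r += pr
        g += pg
        b += pb
        n += 1
    if n == 0:
        raise ValueError("no pixels")
    return (r / n, g / n, b / n)
```

Streaming the pixels this way means each image can be discarded as soon as its average is stored, which is the behavior the loop needs to avoid accumulating textures in memory.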
It works, but there seems to be a memory leak: my Houdini footprint goes from 500 MB all the way to 10 GB, and it incrementally slows down. On a dataset of 6,000 images it just about works, but even after caching to disk the memory stays bloated until I restart, after which I can work from the cache with no problem.
The trouble is I want to do this with 200,000 images, and Houdini will just crash after about 20,000 or so.
It seems to be storing every texture in memory (and then some) on each iteration of the loop until memory runs out.
I've run this on 18.5.462 and the latest build (18.5.511). Any help would be greatly appreciated!
morganv123
March 9, 2021 21:20:15
I managed to get warmer, but still no fix:
Invoking the "Reload Texture" button on the Attribute from Map node clears out the texture cache, and the memory is released immediately. It makes sense that Houdini is caching every texture loaded into that node forever.
I tried `texcache -a off` in the Hscript Textport. It did something: it reduced the load by 3 GB of RAM, but memory still creeps up to 7 GB, so something else is going on.
After turning off texcache, the Cache Manager window doesn't show anything being cached anywhere. I'm clueless as to where these extra GBs are getting allocated.
mark
If you're able to convert your images to either .rat or MIP Mapped .exr files (make sure the files are tiled and MIP Mapped, not just standard .exr files), you might find better performance if you're using VEX to read the textures. You can use imaketx to convert the images.
Tiled, MIP Mapped .exr and .rat image files are built to handle large numbers of texture maps in a single session.
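The batch conversion could be scripted; here is a sketch, assuming imaketx (which ships with Houdini in $HFS/bin) is on the PATH. The command runner is injectable only so the generated commands can be inspected without Houdini installed:

```python
import os
import subprocess

def convert_to_rat(paths, run=subprocess.run):
    """Build and run one `imaketx src dst` command per image,
    writing a .rat file next to each source file."""
    commands = []
    for src in paths:
        dst = os.path.splitext(src)[0] + ".rat"
        cmd = ["imaketx", src, dst]
        commands.append(cmd)
        run(cmd)
    return commands
```

This leaves imaketx's tiling and MIP-level settings at their defaults; check its command-line help for the exact options in your build.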
morganv123
March 10, 2021 10:26:33
Thank you, Mark. In the meantime I inserted a Python SOP that calls `texcache -c` and `sopcache -c` every loop cycle, which slows the process down considerably, but the memory is now holding steady. 7 hours of cooking to go...
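That cache-clearing step can live in a small Python SOP; a sketch (hou.hscript is Houdini's bridge to Hscript commands; the hscript parameter is injectable only so the snippet can be exercised outside a Houdini session):

```python
def clear_houdini_caches(hscript=None):
    """Flush the texture and SOP caches, as done once per loop cycle.
    Inside Houdini, call with no arguments and hou.hscript is used."""
    if hscript is None:
        import hou  # only available inside a running Houdini session
        hscript = hou.hscript
    for cmd in ("texcache -c", "sopcache -c"):
        hscript(cmd)
```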
mark
If you're able to convert your images to either .rat or MIP Mapped .exr files (make sure the files are tiled and MIP Mapped, not just standard .exr files), you might find better performance if you're using VEX to read the textures. You can use imaketx to convert the images.
Tiled, MIP Mapped .exr and .rat image files are built to handle large numbers of texture maps in a single session.
On a side note, would viewport performance benefit from .rat images instead of regular PNGs?
I need to display thousands of textured cards in the viewport, with alpha too, and I've noticed low FPS when they overlap a lot.
(I'm using the Principled Shader to load the textures.)
morganv123
March 11, 2021 11:34:41
I finally rebuilt the system using TOPs to run through each image as a work item. The Attribute from Map SOP still causes serious CPU load, and it's exponentially slower the longer it runs. I'm having to do batches of 10,000 images at a time, restarting Houdini after each run to clear the memory (or processes?). It definitely feels like a bug, and there is no caching going on anywhere that I can see.
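The restart-per-batch workflow can be driven from outside Houdini; a sketch assuming hython (Houdini's command-line Python interpreter) and a hypothetical worker script process_batch.py that cooks one batch and exits, so each batch's memory is reclaimed by the OS when the process ends:

```python
import subprocess

def batches(items, size):
    """Split a list into consecutive chunks of at most `size` items."""
    return [items[i:i + size] for i in range(0, len(items), size)]

def run_in_batches(image_paths, size=10000, run=subprocess.run):
    """Launch one short-lived hython process per batch of images.
    process_batch.py is a hypothetical worker script."""
    for batch in batches(image_paths, size):
        run(["hython", "process_batch.py"] + batch)
```

For very large image sets the file list would be passed via a manifest file rather than argv, but the chunk-and-respawn structure is the same.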
morganv123
April 7, 2025 09:53:26
Dear support team,
I'm bumping this topic. I'm doing a lot of work with video, converting upwards of 3,000 frames of video to SOP geometry with the Attribute From Map node. As you can see from this memory graph, memory usage climbs incrementally until Houdini finally crashes.
The larger the resolution of the image files loaded into the node, the faster the memory gets eaten up.