Karma XPU barely using GPU

Member
580 posts
Joined: Aug. 2014
While testing the new Karma XPU and MaterialX SSS improvements, I've noticed that Karma XPU in 19.5 barely uses my GPU. According to nvidia-smi, Karma's GPU usage barely sticks out from the background noise, and the overall load never exceeds the twenties (in percent). This is quite different from 19.0, where XPU used almost 100% of the GPU's processing power.
At the same time, Karma fully saturates all of my CPU cores.

Has anyone noticed a similar problem?

Some specs: Linux kernel 5.19.0, RTX 3070, nvidia-driver 470.129.06, CUDA 11.4.3
Member
2629 posts
Joined: June 2008
I see CUDA under full load, but 3D at around 21%. This render uses the XPU Pyro material node. Can you post your problem scene? Are you using all mtlx materials?
Edited by Enivob - July 24, 2022 11:07:24

Attachments:
Untitled-1.jpg (118.3 KB)

Using Houdini Indie 20.0
Windows 11 64GB Ryzen 16 core.
nVidia 3050RTX 8GB RAM.
Member
8045 posts
Joined: Sept. 2011
ajz3d
According to nvidia-smi, Karma's GPU usage barely sticks out from the background noise, and the overall load never exceeds the twenties (in percent). [...]
At the same time, Karma fully saturates all of my CPU cores.

What is the power usage? GPU usage alone doesn't always mean much.

Try using GPU-Z or MSI Afterburner/EVGA X1 to get a more detailed look at power consumption.

Here's a screenshot of Karma XPU doing a 4K full-frame render of a single object with an SSS material:

Attachments:
gpuz_karma_xpu.png (81.7 KB)

Member
580 posts
Joined: Aug. 2014
Hi guys,

jsmack
What is the power usage?
GPU power usage while rendering peaks at about 28 watts. It's so low that the video card's fans don't even spin up (they stay at 0% throughout the entire render). Unfortunately, nvidia-smi doesn't provide information about CUDA load, and I don't know of any GNU/Linux software that does.
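The closest I can get from nvidia-smi itself is to sample power draw and overall utilization once a second, along these lines (the field names are listed by nvidia-smi --help-query-gpu):
nvidia-smi --query-gpu=timestamp,power.draw,utilization.gpu,memory.used --format=csv -l 1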

jsmack
Try using GPU-Z or MSI Afterburner/EVGA X1 to get a more detailed look at power consumption.
This software is exclusive to Windows, if I'm not mistaken.

Enivob
Can you post your problem scene? Are you using all mtlx materials?
My test scene is simple, as I just wanted to test out XPU SSS: one object with a single MaterialX standard surface material assigned, a dome light, and two area lights. I triple-checked that Karma Render Settings is set to XPU and that it has the display flag on.

Geometry of the model used in the scene, as well as its textures, can be downloaded from here: https://ir-ltd.net/infinite-3d-head-scan-released/
I believe it's too large to upload here, so I'm posting the link.
Edited by ajz3d - July 24, 2022 14:50:19

Attachments:
mx_sss.hiplc.zip (71.5 KB)

Member
2629 posts
Joined: June 2008
After recreating your scene, Karma still produces the same result: near 100% CUDA load during the render. If you think about it, you want low power consumption; that means a smaller electric bill.

Attachments:
Untitled-1.jpg (170.3 KB)

Using Houdini Indie 20.0
Windows 11 64GB Ryzen 16 core.
nVidia 3050RTX 8GB RAM.
Member
8045 posts
Joined: Sept. 2011
ajz3d
GPU power usage while rendering peaks at about 28 watts. It's so low that the video card's fans don't even spin up. [...]
My test scene is simple, as I just wanted to test out XPU SSS: one object with a single MaterialX standard surface material assigned, a dome light, and two area lights.

It's probably not using the GPU then. Does it say "failed" under OptiX? I find you need way more GPU memory than you'd think with XPU; 8 GB on a 3070 might not be enough for even a single model with textures.
Member
8045 posts
Joined: Sept. 2011
ajz3d
Some specs: Linux kernel 5.19.0, RTX 3070, nvidia-driver 470.129.06, CUDA 11.4.3

Just saw the driver version here. The docs specify a minimum driver version of 511.09 for Linux.

Edit:
Where did you get the SSSMap.png? It was not in the archive I downloaded.

Using the main color map for SSS and your scene's settings, I'm seeing the render complete before the GPU can even start. Cranking up the samples and resolution and disabling the denoiser, I see the GPU contribute at full power. I think OptiX just takes too long to spool up for such a simple render.
Edited by jsmack - July 24, 2022 17:48:40
Member
238 posts
Joined: Nov. 2013
Just a side note. Under Linux I am using
nvidia-smi dmon
It gives a more accurate and de-bloated overview.
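You can also restrict it to just the counters you care about, e.g.
nvidia-smi dmon -s pum
where -s picks the metric groups (p = power/temperature, u = utilization, m = framebuffer memory; see nvidia-smi dmon -h).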
http://www.sekowfx.com
Member
580 posts
Joined: Aug. 2014
Enivob
If you think about it, you want low power consumption; that means a smaller electric bill.
Ah yes, that's a valid upside. But to be honest, I'd prefer short bursts of maximum power.

jsmack
Does it say "failed" under OptiX?
As a matter of fact, it does. I launched Houdini from the terminal, and it complains:
KarmaXPU: Failed to initialize Optix 70400 [Unsupported ABI version] (maybe old driver? requires 495.46+)
And that's not good, because on Debian (even on Testing, which I'm using) there's no prospect of upgrading this driver to the required version in the near future. 510.x has already been stuck in Debian experimental since early June, and before that there was another 5xx version which wasn't even pushed to Sid. The next version to hit Testing is most likely 470.129.06, which still doesn't match Houdini's requirements.
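For the record, checking the installed driver against that 495.46 floor is a one-liner:
nvidia-smi --query-gpu=driver_version --format=csv,noheader
which in my case only confirms the too-old 470.129.06.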

I'd like to avoid installing the driver directly from upstream, as there's a high probability of it messing up my system and creating a FrankenDebian (https://wiki.debian.org/DontBreakDebian).

Anyway, H19 had identical nvidia driver requirements, and yet GPU utilization was always at 100%, even when rendering simple USD scenes.

For example, I just made a really trivial scene (gpu_test_h19) consisting of spheres copied to points, one MaterialX shader and a dome light. I rendered it with H19.0 and H19.5. Then, I recreated the scene using operators from H19.5 (gpu_test_h19_5) and rendered it with this new Houdini version. The results are as follows:
- gpu_test_h19 renders at 100%/100% CPU/GPU load with H19.0 (it reaches full load almost immediately). When rendered with H19.5, GPU remains practically unused.
- gpu_test_h19_5 on H19.5 renders at 100% CPU and almost no GPU load.

The difference in rendering speed between gpu_test_h19 on H19.0 and gpu_test_h19/gpu_test_h19_5 on H19.5 is night and day.
I wonder which aspects of Karma XPU changed so much in H19.5 that they now require a newer ABI version.

You will find both scenes in the attachment.

jsmack
I find you need way more GPU memory than you think with XPU. 8gb on a 3070 might not be enough for even a single model with textures.
I'm not so sure about that; I never experienced it. On H19.0 I rendered quite a few scenes, some very simple, some quite heavy on textures and geometry, and the GPU was always at 100% (175+ watts) almost right off the bat, no matter what. I also never saw the XPU renderer run out of VRAM. It topped it out, sure, but the GPU load stayed at maximum, so it always had something to compute.

jsmack
Where did you get the SSSMap.png? It was not in the archive I downloaded.
I don't remember. I downloaded this model years ago. If this map doesn't come with the archive, then I must have derived it from existing textures.

sekow
I am using
nvidia-smi dmon
I didn't know about this. Thanks.
Much cleaner output than watch -n 0.5 nvidia-smi, so definitely very useful.
Edited by ajz3d - July 25, 2022 06:34:27

Attachments:
gpu_test.zip (96.0 KB)

Member
580 posts
Joined: Aug. 2014
Okay, so I believe my nvidia-driver version is indeed the culprit. First that failure message in H19.5, and now I notice that OptiX doesn't kick in at all, so Karma XPU renders only on Embree. Damn.

Attachments:
no_optix.png (21.6 KB)

Member
8045 posts
Joined: Sept. 2011
ajz3d
I wonder which aspects of Karma XPU changed so much in H19.5 that they now require a newer ABI version.

It's possible that updating the OptiX library used by Houdini also bumps the minimum ABI.

ajz3d
And that's not good, because on Debian (even on Testing, which I'm using) there's no prospect of upgrading this driver to the required version in the near future. [...]

I thought NVIDIA drivers were totally outside the open-source stream and had to be installed directly from NVIDIA.
Member
1718 posts
Joined: March 2009
jsmack
I thought NVIDIA drivers were totally outside the open-source stream and had to be installed directly from NVIDIA.

Most distros package them these days (practically all the mainstream ones).
Martin Winkler
money man at Alarmstart Germany
Member
8045 posts
Joined: Sept. 2011
protozoan
jsmack
I thought NVIDIA drivers were totally outside the open-source stream and had to be installed directly from NVIDIA.

Most distros package them these days (practically all the mainstream ones).

So does Windows Update, but you should never use that one.
Member
580 posts
Joined: Aug. 2014
It's completely different on Linux. Installing drivers or firmware blobs directly from upstream is usually a recipe for trouble. The main rule for a stable system is to always use the packaged versions provided by maintainers in your GNU/Linux distribution's official repository.
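On Debian, for example, the packaged route is roughly this (a sketch for a stable release with backports; the suite name depends on your release, and non-free must be enabled in sources.list):
sudo apt update
sudo apt install -t bullseye-backports nvidia-driver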

jsmack
It's possible that updating the OptiX library used by Houdini also bumps the minimum ABI.
Actually, this gives me an idea. We can override the OptiX path with HOUDINI_NVIDIA_OPTIX_DSO_PATH, so perhaps an older OptiX version will do the trick. I'll check it out.
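Something along these lines, with a hypothetical path:
export HOUDINI_NVIDIA_OPTIX_DSO_PATH=/path/to/older/optix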
Never mind, I think this envar is for the denoiser only.
Edited by ajz3d - July 25, 2022 18:58:43
Member
8045 posts
Joined: Sept. 2011
Is Houdini going to stay viable on Linux? I see posts every day about trouble running Houdini on Linux distros newer than 4 or 5 years old, and that newer Linux distros don't even ship with the libraries Houdini needs anymore. At work it seems like we're just going to be time-capsuled on CentOS 7 forever.
Member
1718 posts
Joined: March 2009
jsmack
Is Houdini going to stay viable on Linux? I see posts every day about trouble running Houdini on Linux distros newer than 4 or 5 years old.

Interestingly, I find that (Blender aside, those guys are a special case) Houdini handles it best of all the packages. I've made a sport out of running H on bleeding-edge rolling-release distros for over a decade, and occasionally you get hiccups like the glibc 2.35 thing a while back, but those are typically dealt with quickly.

I admit I have been a thorn in jeff's/chrism's/mark's sides about this (and the others'), but they really are dealing with it.

You can run 19.0 and 19.5 today on fully up-to-date rolling distros like Arch, and it will work. No special requirements, no tinkering. The fact that some people are missing some libs is a different issue entirely, mostly to do with distros getting more minimal out of the box.
Martin Winkler
money man at Alarmstart Germany
Member
580 posts
Joined: Aug. 2014
jsmack
Is Houdini going to stay viable on Linux? I see posts every day about trouble running Houdini on Linux distros newer than 4 or 5 years old, and that newer Linux distros don't even ship with the libraries Houdini needs anymore. At work it seems like we're just going to be time-capsuled on CentOS 7 forever.
I think that's a slight overdramatization. I can recall only one major problem GNU/Linux users had over the past year (of those reported here on the forum). It had to do with an upgrade of the glibc version in some distros (Ubuntu, Fedora) that Houdini wasn't ready for. It has since been patched, and it didn't affect me in any way, because my OS ships an older version of that library.

My experience with GNU/Linux and Houdini so far is that it's a rock-stable combination that very rarely causes trouble. In fact, I had far more issues while briefly running Houdini on Windows (H13, H14) than I ever had on Linux (H15-H19.5).

Regarding the current problem, I believe it's my distro that is to blame, rather than SideFX being too hasty in bumping OptiX requirements, because on Debian Testing (Bookworm) there haven't been any major version upgrades of nvidia-driver for almost a year (they bumped 460.91 to 470.57 in late August '21, if I'm not mistaken), while other distros, even those based on Debian itself, have cheerfully moved to 510.xx.

If this slow trend doesn't change, I might be forced to migrate to some other distribution. That would be difficult for me, not only because moving a whole working environment is time-consuming and requires careful planning, but also because I highly value the DFSG, reproducible builds, a fantastic package manager, and the clear division between free, contrib and non-free software that Debian provides. It's also one of the oldest distros out there, so it has a large developer, user and software base, which makes most problems that come up trivial to fix.

EDIT: Added Debian version.
Edited by ajz3d - July 28, 2022 05:16:08
Member
8045 posts
Joined: Sept. 2011
ajz3d
I think that's a slight overdramatization. I can recall only one major problem GNU/Linux users had over the past year (of those reported here on the forum). [...]

Probably; I'm just going off a gut feeling from forum posts. I think it's mostly Ubuntu-related posts about missing libraries or possibly an incompatible Qt version.
Member
157 posts
Joined: July 2005
I'm having the same problem with Houdini 19.5.303: Karma XPU is not using the GPUs on my Debian 11 system.

I've been installing the (never current) Debian NVIDIA drivers (usually from backports) for years, and running Houdini/Redshift (since RS version 2.6.30) without any problems.

Because I'd like to make Karma XPU my main renderer, I took the plunge (June 23rd) and followed the instructions to install the 515.48.07 nvidia driver from here...

https://www.linuxcapable.com/install-510-nvidia-drivers-on-debian-11-bullseye

It was successful, and everything has continued to run fine since.

In tests with Houdini 19.0.???, Karma XPU was running both my cards (RTX 2070 Super & GTX 1080) at around 100%.

In tests with Houdini 19.5.303, Karma XPU doesn't appear to use the cards at all, and I'm also getting the "Unable to create optix context for device..." errors.

(Not sure why OptiX wasn't included with that 515.48.07 install.)

However, when I render with Redshift 3.0 or 3.5 versions, and Houdini 19.0 or 19.5 versions, the cards are both running at 100%.

Yes, I get a "The denoiser will be disabled" error from Redshift, but it still renders with the GPUs near 100%.

Anyway, the main point is: Redshift runs fine on my Debian 11 system with Houdini 19.5; Karma XPU does not.

P.S. Monitored GPU usage with "nvidia-smi dmon", "psensor", & "NVIDIA Settings". All were in agreement.
Edited by fgillis - July 28, 2022 03:22:29

Attachments:
nvidia_setting.png (163.1 KB)

Floyd Gillis
Member
123 posts
Joined: Sept. 2018
fgillis
However, when I render with Redshift 3.0 or 3.5 versions, and Houdini 19.0 or 19.5 versions, the cards are both running at 100%.

But are you using Hardware Acceleration / Hardware Ray Tracing in Redshift? I believe that's what they call using OptiX for ray tracing. It would also explain why the denoiser doesn't work, as it also uses OptiX.

It will be interesting to see whether Intel Embree brings GPU rendering to all GPUs at some point. I read a while back that they at least want it running on their Arc GPUs.