Is there a full GPU renderer available for Houdini?

User Avatar
Member
644 posts
Joined: June 2006
Offline
I just wanted to play with SOHO, but sadly it's only accessible with a full licence (it makes sense as a guard against misuse, but an export limitation of, say, 16k polys would be nicer :-).

So for now, no playing with SOHO :-/
User Avatar
Member
606 posts
Joined: May 2007
Offline
from this thread [sidefx.com]
jeff
And don't forget to pad the budget a bit to pay for the next bit of kit to take advantage of the various renderers and their vastly improving architectures and capabilities. Take that for what you will.

Someone with an active imagination might get all sorts of ideas from that…
User Avatar
Member
4189 posts
Joined: June 2012
Offline
I'm convinced that GPUs will become like the FPU: integrated into the CPU.

No one buys an FPU these days.
User Avatar
Member
67 posts
Joined: May 2014
Offline
Octane has just been announced:

http://render.otoy.com/newsblog/ [render.otoy.com]
User Avatar
Member
655 posts
Joined: Feb. 2006
Offline
Such great news!!! Octane render is so good and fun to use!! :-D
User Avatar
Member
83 posts
Joined: Jan. 2007
Offline
Hello folks!!!

I am Juanjo Gonzalez, the developer of the Octane for Houdini plugin. I hope to have a full-featured Octane plugin in the shortest time possible. I am also the developer of the Lightwave plugin, so at least some of the research about how Octane works as an external render engine is already done.

Currently the plugin is only a prototype, a simple ROP node, but everything is working fine so far. I am working only with the C++ HDK API, not with the SOHO Python architecture, because Octane only has a C++ SDK, and because I prefer having direct access to the Houdini scene data and node graph from C++ code. I am not sure whether I will run into restrictions without SOHO, but at least the prototype works fine using the HDK, and the loading times are really fast.

I will try to have a beta version as soon as possible, perhaps later this year or early next year, but it is too early to have a fixed roadmap. The project is in its early stages, and I also need to learn a lot about Houdini to be sure the plugin ends up full-featured and well integrated.

Next month I will open a new thread in this forum to post all the news about the plugin development.

Best regards,
-Juanjo
Computer Graphics Software Developer
User Avatar
Member
67 posts
Joined: May 2014
Offline
Incredible, Juanjo!

If you need any help, please let us know
User Avatar
Member
16 posts
Joined: Nov. 2012
Offline
I have the Lightwave version, and Juanjo has done an incredible job. Looking forward to it running in Houdini!!
User Avatar
Member
167 posts
Joined: Aug. 2015
Offline
I accidentally ran into this topic and figured it deserves an update: Redshift is now available for Houdini as well.
www.redshift3d.com
User Avatar
Member
4 posts
Joined: Aug. 2019
Offline
I found this to be a very interesting read. I arrived here because I found myself asking why my GPU is not being utilised to render my pyro scene in H17.5 with Mantra.

I wonder, over 6 years on since this discussion started, what is the state of play concerning GPU support in Mantra nowadays? Is CPU rendering still the most consistent and reliable, albeit slower?

Please be gentle. I've only been using Houdini for a couple of weeks.
User Avatar
Member
1803 posts
Joined: May 2006
Offline
https://www.youtube.com/watch?v=emcT5qXdUsc&t=46m20s [www.youtube.com]

Karma is the Mantra replacement due in H18, GPU support is planned.
http://www.tokeru.com/cgwiki [www.tokeru.com]
https://www.patreon.com/mattestela [www.patreon.com]
User Avatar
Member
4 posts
Joined: Aug. 2019
Offline
That's great. A good time to get on board!

I'll look forward to Karma when it lands.
User Avatar
Member
899 posts
Joined: Feb. 2016
Offline
mestela
https://www.youtube.com/watch?v=emcT5qXdUsc&t=46m20s [www.youtube.com]

Karma is the Mantra replacement due in H18, GPU support is planned.


I have asked a few times already, but still haven't found out whether:
1) it's meant to be XPU: rendering the frame using both CPU and GPU power at the same time.
2) multiple GPUs will be supported.

I'm planning to buy a new PC in the near future, and the answers to 1) and 2) would help guide my choice.
User Avatar
Member
4 posts
Joined: Aug. 2019
Offline
Good question. It would certainly influence my upgrade decisions as well.
User Avatar
Member
833 posts
Joined: Jan. 2018
Offline
Andr
I have asked a few times already, but still haven't found out whether:
1) it's meant to be XPU: rendering the frame using both CPU and GPU power at the same time.
2) multiple GPUs will be supported.

I don't have time to dig around, but this was addressed during the SIGGRAPH presentation and on the forums. Watch the SIGGRAPH presentation and stick around for the Q&A afterwards.

But to offer answers based on what has been officially said:

1. CPU-only right now, but eventually (H18.5? H19?) they will start tapping into the GPU as well.

2. See answer #1, but I would think yes once that functionality arrives in the future.
>>Kays
For my Houdini tutorials and more visit:
https://www.youtube.com/c/RightBrainedTutorials [www.youtube.com]