what GPU to buy?

Member
5 posts
Joined: May 2015
Offline
Hello everyone!

I just started learning Houdini and it looks very promising. After 3ds Max and Maya I really enjoy Houdini. Unfortunately I have not tried rigging and complex animation yet, but it is next on my list. If I enjoy rigging and animation in Houdini, I will purchase it for my small business. At the moment I'm playing with small particle simulations and 20-50k polygon characters, and my GTX 770 is doing a good job.

I'm preparing photo-realistic characters (around 100k polygons) for my project, plus complex 2M+ polygon scenes with particle simulations, and I'm not sure that the 2GB GTX 770 will be able to handle this.

Please advise which GPU I should get. My budget is around $1,500.
Which will perform better with Houdini: a Quadro K4200/K5200 or a GTX Titan Black/X?

I have heard that the Titan X is a dual-GPU card and that Houdini won't benefit from that. Please advise.
Member
2625 posts
Joined: June 2008
Offline
Here is my experience (and possible misconceptions) with Houdini and the GPU.

There is no GPU-accelerated rendering via Mantra (yet), so you get no rendering benefit from having a GPU in your system.

For simulations, I see FLIP accelerated by about 30%.
Sometimes when I enable OpenCL for Pyro, Houdini simply crashes.

I don't think there is any particle acceleration via the GPU.

For $1,500 you might just want to buy more CPU cores.
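For anyone comparing results like these, it's worth first checking which compute devices Houdini actually sees; Houdini ships a small command-line utility for this. A sketch, assuming you run it from a Houdini command-line shell so that `$HFS/bin` is on the PATH:

```shell
# Print the graphics/compute devices Houdini detects, including the
# OpenCL platform and device its solvers would use
hgpuinfo
```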
Using Houdini Indie 20.0
Windows 11, 64GB RAM, 16-core Ryzen.
Nvidia RTX 3050, 8GB VRAM.
Member
5 posts
Joined: May 2015
Offline
Enivob
Here is my experience (and possible misconceptions) with Houdini and the GPU.

There is no GPU-accelerated rendering via Mantra (yet), so you get no rendering benefit from having a GPU in your system.

For simulations, I see FLIP accelerated by about 30%.
Sometimes when I enable OpenCL for Pyro, Houdini simply crashes.

I don't think there is any particle acceleration via the GPU.

For $1,500 you might just want to buy more CPU cores.

Good Morning! Thank you for your reply.

I'm not talking only about rendering itself. A GPU is in the system requirements, and my GPU is listed in the About section as the renderer, so I believe it is doing something.

I have seen a video on Vimeo with two simulation tests, one with OpenCL and one without. The OpenCL simulation was faster. I think it was a Titan GPU, if I'm not mistaken.

I need to know which GPU will be better for the viewport and particle simulations: Quadro or Titan.
Edited by - May 19, 2015 10:56:41
Member
655 posts
Joined: February 2006
Offline
My experience with the Nvidia GTX 780 was awesome; with the K4000, K5000 and K4200, not so much.

My feeling is that you want gaming cards with lots of memory (that is the key) and you will be happy.

At Glassworks we use a GPU renderer and it loves the Titans, so I would say it is a good bet.

Can anyone confirm?

Hope it helps.
Member
334 posts
Joined: July 2007
Offline
jordibares
My experience with the Nvidia GTX780 was awesome

Same here. Very good performance at a good price!
www.gimpville.no
Member
5 posts
Joined: May 2015
Offline
Thank you all,

Going to order a GTX Titan Black; it has twice the memory of the 780 and some other benefits. It should cover my GPU needs for a couple of years.
Member
1755 posts
Joined: March 2014
Offline
You'd be better off with a Titan X - 12GB of VRAM, 3072 CUDA cores.
Staff
5202 posts
Joined: July 2005
Offline
The Titan X is a dual GPU card though, so it's better to think of it as having 6GB x 2 since the memory is not shared between GPUs. It would be decent for having OpenGL viewport rendering on one of the GPUs, and OpenCL sims on the other.
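If you do run two GPUs, Houdini's OpenCL work can be pointed at a specific card with environment variables while the viewport stays on the display GPU. A minimal sketch; the `HOUDINI_OCL_*` names come from the Houdini documentation, and the index assumes the second card enumerates as device 1:

```shell
# Keep the OpenGL viewport on the card driving the display, and send
# OpenCL simulation work to the other GPU (0-based device index)
export HOUDINI_OCL_DEVICETYPE=GPU
export HOUDINI_OCL_DEVICENUMBER=1
houdini
```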
Member
4189 posts
Joined: June 2012
Offline
Andrej Klimov
Thank you all,

Going to order a GTX Titan Black; it has twice the memory of the 780 and some other benefits. It should cover my GPU needs for a couple of years.

Could be a good plan. Nvidia Pascal will be here in ~2 years, and with claims of being 10x faster we will all have to upgrade!
Member
18 posts
Joined: October 2010
Offline
twod
The Titan X is a dual GPU card though, so it's better to think of it as having 6GB x 2 since the memory is not shared between GPUs. It would be decent for having OpenGL viewport rendering on one of the GPUs, and OpenCL sims on the other.

Are you sure? I read everywhere that it was a single-GPU card.
Staff
5202 posts
Joined: July 2005
Offline
Shasha
twod
The Titan X is a dual GPU card though, so it's better to think of it as having 6GB x 2 since the memory is not shared between GPUs. It would be decent for having OpenGL viewport rendering on one of the GPUs, and OpenCL sims on the other.

Are you sure? I read everywhere that it was a single-GPU card.

You're right - I was thinking of the Titan Z, which is the dual-GPU card (good marketing, there :? ).
Member
18 posts
Joined: October 2010
Offline
twod
Shasha
twod
The Titan X is a dual GPU card though, so it's better to think of it as having 6GB x 2 since the memory is not shared between GPUs. It would be decent for having OpenGL viewport rendering on one of the GPUs, and OpenCL sims on the other.

Are you sure? I read everywhere that it was a single-GPU card.

You're right - I was thinking of the Titan Z, which is the dual-GPU card (good marketing, there :? ).

Ah, I see. I've had my eye on the Titan X for a while now, but I dunno if it's really worth it.

Anyone tried it with Houdini?
Member
83 posts
Joined: January 2007
Offline
If at some point you are planning to work with a GPU renderer like Octane, the Titan X is the way to go. It has 12 GB of RAM and about 3,000 CUDA cores, so you can render really huge scenes with it, especially with Octane 3, which is going to remove the current geometry limits.

For smaller scenes the GTX 980 is also a great GPU, or even the Titan Z, but the 12GB of the Titan X is a game changer for rendering.

-Juanjo
Computer Graphics Software Developer
Member
5 posts
Joined: May 2015
Offline
Thank you all!

I looked away for just one night and missed so much information. Yes, you're right, the Titan X does look more tempting, and it is cheaper. Wow, Nvidia Pascal is out in 2016 and will be awesome. I will probably go for a GTX 980 or Titan this year and then invest in Pascal.

When I read descriptions like “you will be able to render big scenes” I always have the following question: how big, in polygons? It is always nice to know the hardware limitations.
Edited by - May 20, 2015 06:04:36
Member
83 posts
Joined: January 2007
Offline
Andrej Klimov
When I read descriptions like “you will be able to render big scenes” I always have the following question: how big, in polygons? It is always nice to know the hardware limitations.

With Octane 2.2 you can render scenes with about 20 million unique polygons (without instancing), and perhaps the same number of instances. Octane 3.0 is going to remove these limitations. The texture maps are not limited at all, even if they don't fit in the GPU RAM, because Octane has an out-of-core texture rendering feature that holds the textures in CPU RAM if there is not enough free GPU RAM available.

The GPU RAM is also important for storing the frame buffer and the render passes, which is why the Titan X with 12 GB is the way to go.

-Juanjo
Computer Graphics Software Developer
Member
5 posts
Joined: May 2015
Offline
juanjgon
Andrej Klimov
When I read descriptions like “you will be able to render big scenes” I always have the following question: how big, in polygons? It is always nice to know the hardware limitations.

With Octane 2.2 you can render scenes with about 20 million unique polygons (without instancing), and perhaps the same number of instances. Octane 3.0 is going to remove these limitations. The texture maps are not limited at all, even if they don't fit in the GPU RAM, because Octane has an out-of-core texture rendering feature that holds the textures in CPU RAM if there is not enough free GPU RAM available.

The GPU RAM is also important for storing the frame buffer and the render passes, which is why the Titan X with 12 GB is the way to go.

-Juanjo

I just started my 3D journey, so I have never heard of the Octane renderer. I will put it on my homework list to read about and test.

At the moment I use KeyShot with the ZBrush-to-KeyShot bridge, which is easy to use and has amazing render quality.

I will probably buy Houdini Indie, and it does not support any third-party renderers. Can I still import the whole scene into Octane?
Member
83 posts
Joined: January 2007
Offline
Indie is not going to support the Houdini Octane plugin, sorry.

-Juanjo
Computer Graphics Software Developer
Member
66 posts
Joined: September 2008
Offline
For the time being, CUDA (Nvidia-based) cards tend to work better across different platforms and programs, as the ecosystem is more mature and has a lot of financial backing.

OpenCL (Intel/AMD/ATI) is like the Blender of GPU rendering: it works great when and where it works, but it might not be the best investment for a CGI artist these days.
But this is an evolution, so they will even out somehow over time.

Personally I've run Blender, Maya, Houdini and XSI on both “Pro Series” Quadro (Nvidia) and FireGL (AMD) cards, and to be quite honest the major price difference does not give you bang for the buck. So go for a modern GeForce GTX, possibly set up as an SLI pair, and you should be good to go with the most demanding third-party renderers that use GPUs. Nvidia also has pretty decent OpenCL support and excellent OpenGL support, and it's OpenGL that matters in Houdini, combined with your good old quad Xeon CPU… :-)
“If your life is not NOW, you're already dead…”
From Chrizto's book of truths
Member
66 posts
Joined: September 2008
Offline
juanjgon
Indie is not going to support the Houdini Octane plugin, sorry.

-Juanjo

But standalone RenderMan works as usual in Indie, doesn't it?
Also, there are like two new projects a month claiming to be the latest and greatest GPU-based renderer.

Maxwell and Houdini have always worked for me.

I think SideFX will lose A LOT of customers if Indie customers are forced into using only Mantra and not standalone or plugin third-party renderers…

(Mantra is a great renderer, but it's as old as most people here's moms (not as old as my mom, though)…
:roll:
“If your life is not NOW, you're already dead…”
From Chrizto's book of truths
Member
379 posts
Joined: December 2006
Offline
No, it will not lose users at all. Really, I fail to see why that might happen.

And saying Mantra is old… that is plain ignorance. Your knowledge might just not be good enough.
Edited by - May 24, 2015 09:52:45