multi CPU sim vs. single GPU

Member
8 posts
Joined: May 2013
Offline
Hello all, I'm going to be setting up some new hardware on a budget soon and I am hoping someone more knowledgeable may help me out with these questions:

I have enough money to either get two CPU/RAM-based machines, with CPUs equivalent to a Core i7 4770 and 16 GB of RAM each (but no GPUs other than integrated graphics), or one machine with a similar CPU, 32 GB of RAM, and a FirePro W5100 4GB 128-bit.

I'm pretty sure the first setup would be much better for batch rendering. When it comes to simulations, however, I've read that in some scenarios a GPU can run up to 20x faster.

Does that hold true for Houdini? If so, the GPU machine is the clear winner. Can anyone enlighten me?

Another concern I have is that my processor and/or GPU would bottleneck before coming close to using 32 GB of RAM. I know RAM is very useful in this field, so I don't know if this is even a concern, but how would one determine how much to buy based on their other hardware?
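For what it's worth, the only sizing method I've come up with on my own is rough back-of-envelope math like the sketch below (the field count, bytes per voxel, and overhead factor are pure guesses on my part, not measurements from Houdini); is that even a sensible way to think about it?

```python
# Rough guess at working-set RAM for a dense cubic sim grid.
# num_fields, bytes_per_voxel and overhead are assumptions for illustration.
def sim_ram_estimate_gb(resolution, num_fields=5, bytes_per_voxel=4, overhead=2.0):
    """resolution      -- voxels per axis (e.g. 400 for a 400^3 grid)
    num_fields      -- density, temperature, velocity components, etc.
    bytes_per_voxel -- 4 for 32-bit float fields
    overhead        -- fudge factor for solver intermediates"""
    voxels = resolution ** 3
    return voxels * num_fields * bytes_per_voxel * overhead / 1024 ** 3

for res in (200, 400, 600):
    print(f"{res}^3 grid: ~{sim_ram_estimate_gb(res):.1f} GB")   # ~0.3, ~2.4, ~8.0
```

By that (very rough) math, resolution matters far more than anything else, which is why I'm unsure whether the CPU/GPU would give out before 32 GB ever gets used.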

Thanks!
Member
184 posts
Joined: June 2010
Offline
I have a dual Xeon workstation and cook fluid sims with a 6GB Titan Black (as a second GPU). The Titan is about 10x faster than the CPUs, which from my research is roughly the same as a (much more expensive) Tesla; note, though, that not all sims support GPU cooking at the moment.
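For what it's worth, on the solvers that do support it, GPU cooking is usually just the OpenCL toggle on the solver. Something like this from the Python shell works for me, though the node path below is made up and you should double-check the parameter name on your own build and solver:

```python
# Minimal sketch: flip the OpenCL toggle on a solver DOP from Python.
# The node path and the "opencl" parameter name are assumptions; verify
# both against your own scene and Houdini version.
import hou

solver = hou.node("/obj/pyro_sim/pyrosolver1")   # hypothetical path
ocl = solver.parm("opencl") if solver else None
if ocl is not None:
    ocl.set(1)   # cook this solver on the GPU via OpenCL
else:
    print("No 'opencl' parm found; this solver may be CPU-only.")
```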

As for RAM bottlenecks, I'm not sure, or haven't noticed any, but it's a different world when you aren't relying on disk caching (I have 128 GB).

Obviously sim resolution and setup matter. The 6 GB on my GPU appears to be partly wasted with the nVidia drivers, as they're still stuck (or were, anyway; I haven't checked in a while) in a 32-bit address space. ATI cards probably aren't, but I have no idea how their compute performance compares to nVidia's.

I've no experience with network batching sims.

Note that you'll still want to cook your final sims on the CPU a lot of the time when you're after high-quality production results (beyond GPU RAM limits), but GPU cooking is extremely useful when fleshing things out. Get the best of both if you plan to do a lot of sim work and can afford to, and remember it's always easier to upgrade GPUs over time.
Member
1391 posts
Joined: Dec. 2010
Offline
I don't have any experience using the GPU for large simulations, but I know RAM is very useful for them.
I have 48 GB of RAM on my system and it's very useful for Pyro, particle, and FLIP simulations. I've run some big simulations on it quickly and easily that I couldn't run on another system with only 24 GB of RAM!

Another important thing for increasing simulation speed is your disk!
Some simulations (especially Pyro) produce huge cache files.

Also, we often have to cache our simulation to disk and then read it back from disk for rendering, and in that workflow the read/write time for huge simulation files is very important.
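If you want to check how fast your drive reads a big cache back, a rough test like this can help (the path is just an example; point it at one of your own cache files):

```python
# Rough read-throughput check: time how long it takes to stream one cached
# sim file back from disk. A second run may look faster only because the OS
# has cached the file in RAM, so test on a freshly written file.
import os, time

cache_file = "/mnt/ssd_cache/pyro/smoke.0100.bgeo.sc"   # example path

size_gb = os.path.getsize(cache_file) / 1024 ** 3
start = time.time()
with open(cache_file, "rb") as f:
    while f.read(64 * 1024 * 1024):   # read in 64 MB chunks
        pass
elapsed = time.time() - start
print(f"{size_gb:.2f} GB in {elapsed:.1f} s -> {size_gb / elapsed:.2f} GB/s")
```

An SSD should show much higher numbers here than an HDD.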

So I suggest buying an SSD drive just for simulations and caches :idea:
https://www.youtube.com/c/sadjadrabiee [www.youtube.com]
Rabiee.Sadjad@Gmail.Com
Member
8 posts
Joined: May 2013
Offline
Thanks for the replies. I decided to go with one machine instead of two. The extra CPUs would be useful, but for the projects I have in mind I think the GPU will be the better buy. This way I get the GPU, a more powerful processor (and I can still network to my current machine for rendering or non-GPU-based computation), and 32 GB of RAM.

Also, I may be able to swing an SSD going this route (thanks for the suggestion, Joker). It may have to wait, but it's an easy upgrade.

One of my concerns with this build, however, is that the motherboard only has one PCIe 3.0 slot, so if I were to get a second GPU for the display it would have to go in a 2.0 slot. I think that will work, but I still need to look into it.

Intel Core i7-4790K 4.0GHz
FirePro W5100 4GB 128-bit
PNY Optima 480GB 2.5" Solid State Drive
MSI Z97 PC MATE ATX LGA1150 Motherboard
PNY 32GB (4 x 8GB) DDR3-1866 Memory
Antec TruePower Classic 550W 80+ Gold Certified ATX Power Supply

I think this is the best I can do on my current budget, thank you both again for your feedback and suggestions.

P.S. I found a post regarding a 64-bit Titan Black driver for Linux; not sure what OS you run, but it may be out there by now.

http://www.nvidia.com/download/driverResults.aspx/73666/en-us [nvidia.com]
Member
1391 posts
Joined: Dec. 2010
Offline
One of the most annoying things, if it turns out to be incompatible with Houdini, is your graphics card.

So I suggest you research your graphics card, nVidia or ATI, a bit more!
Of course both are great, but Houdini (and other software) has different problems with different card types and models.

ATI models may be cheaper than comparable nVidia models, but I really like the GTX series, especially the Titan; it is very powerful.
(I'm not saying ATI is bad, just that the GTX is better for me.)

You can also search the Houdini forum (and odforce) for people's problems with different card models and Houdini, and then choose your card :idea:
https://www.youtube.com/c/sadjadrabiee [www.youtube.com]
Rabiee.Sadjad@Gmail.Com
Member
8 posts
Joined: May 2013
Offline
I ended up going with this:

i7 4790k 4.0 ghz
GTX 970 and a GTX 750 Ti (the 750 Ti I'm using for display, but I may switch it out for an older GPU I have and use it for gaming on my older PC instead, since it's overkill for just dual display)
480GB SSD
2 TB HDD
MSI Z97 PC MATE
32GB RAM

So far the GTX 970 is amazing when running OpenCL sims, with no compatibility issues at this point. I ran a smoke solver at 400^3 at 47 frames per minute; on the CPU it was at about 3.5, so roughly a 13x speedup.

When it comes time to render or do other kinds of sims I'll wish I had gone with the multi-CPU setup, but at this point I feel good about going with more powerful GPUs.
Member
2625 posts
Joined: June 2008
Offline
Ok, so I'll bite.

I am new to Houdini and still getting the hang of the terms.

Is cooking the same thing as a sim?

The reason I ask is that in my example and learning projects I find it is easy to make something that locks up the machine. I can no longer animate, because every time I change the frame, cooking occurs. I am on an 8-core machine, but cooking only seems to use a single core.

Is there another way to bake/cook other than just playing the animation from the timeline?
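(By "another way" I mean something like stepping through frames in the Python shell and force-cooking a node, as in the rough sketch below with a made-up node path; I don't know if that is the intended workflow or if there is a proper cache node for this.)

```python
# Rough sketch of what I mean: step through a frame range and force-cook
# one node per frame instead of playing the timeline. Node path is made up.
import hou

node = hou.node("/obj/sim/OUT")      # hypothetical output node
for frame in range(1, 241):          # frames 1-240
    hou.setFrame(frame)
    node.cook(force=True)
    print("cooked frame", frame)
```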

How do I make Houdini use all the cores of my CPU?

Here is an example file [forums.odforce.net] that takes forever and will not play back on my 8-core i7 with 16 GB of RAM. I am stuck in an infinite cook loop.
Using Houdini Indie 20.0
Windows 11, 64 GB RAM, 16-core Ryzen.
nVidia RTX 3050, 8 GB RAM.