Which one is better if I want to use Houdini/Mantra?
More CPU such as this:
http://www.workstationspecialist.com/product_range/ws_4024/ [workstationspecialist.com]
or more GPU such as this:
http://www.workstationspecialist.com/hpc/personal_super_computer/ [workstationspecialist.com]
With all the simulation, rendering, calculation, physics and so on - I want to know which is the best hardware for the job.
BTW - while I'm sure applications have adapted to multiple-CPU setups, has software adapted to a GPU setup with 8 cards inside one computer?
more CPU or more GPU?
- fablefox
- Member
- 13 posts
- Joined: Nov. 2010
- Offline
- old_school
- Staff
- 2540 posts
- Joined: July 2005
- Offline
Given that Houdini 11 already utilizes the graphics card extensively for the viewport: both. Both CPU and GPU are now important for performance in Houdini 11.
Nvidia seems to be the best for all platforms at this given time with regards to drivers (Linux especially). On Windows or Mac, ATI cards seem to hold their own as well. But there is this thing called “cuda” that is quite attractive, and even though Nvidia will support cuda on general CPUs, you could assume that performance would be better on Nvidia graphics cards. There are a couple of projects going on in Houdini by users that could make use of cuda. Just keep your options open.
If you are a developer, cuda and nphysics are quite attractive to quickly build up sims on a gpu.
I would imagine that the next major version of Houdini will utilize the GPU and CPUs even more effectively, if you extrapolate the improvements from H9 > H10 > H11 in this area.
Let's hope Apple keeps up, as it is quite off the pace now, with support for only OpenGL 2.1 in Snow Leopard 10.6.4.
There's at least one school like the old school!
- cybermax
- Member
- 255 posts
- Joined: Aug. 2009
- Offline
I really think that you don't need Tesla cards for H11 8).
Today the best choice is a powerful computer with more cores. But single-core performance is also important, because a large part of Houdini is single-threaded. DOPs are multi-threaded and some COPs use the GPU, but the GPU is mainly for the viewport(s).
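The single-threaded bottleneck can be put in rough numbers with Amdahl's law. A minimal sketch (the 50% figure is a made-up illustration, not a Houdini measurement):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Upper bound on overall speedup when only part of the
    work can run in parallel (Amdahl's law)."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Hypothetical: if half of a session is single-threaded work,
# even 16 cores give less than a 2x overall speedup.
print(round(amdahl_speedup(0.5, 16), 2))   # 1.88
```

This is why a few fast cores can beat many slow ones for interactive work, while sims and rendering still love core count.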
Otherwise, the future of simulations (mainly SPH) is in the hands of the GPU, which has far more cores than a CPU, so each particle can be transformed on its own core - a huge speed increase (Tesla cards rock).
I have been working with OpenCL for the last two months, and I can create simulations on my hardware that DOPs could only dream of :shock:
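The "one particle per core" idea maps each particle to one GPU thread. A minimal sketch in plain Python, where the loop body stands in for what a single OpenCL work-item would do (function and parameter names are made up for illustration):

```python
def step_particles(pos, vel, dt, gravity=-9.81):
    """Integrate one time step; each iteration is independent,
    so on a GPU every i could run on its own core."""
    for i in range(len(pos)):       # on the GPU: i = get_global_id(0)
        vel[i] += gravity * dt      # apply force to this particle
        pos[i] += vel[i] * dt       # move this particle

pos, vel = [0.0, 1.0], [0.0, 0.0]
step_particles(pos, vel, dt=0.1)
```

Because no iteration reads another particle's data here, the loop parallelizes trivially; real SPH adds neighbour lookups, which is where the hard GPU work actually is.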
Regarding rendering, lately there are more and more GPU renderers, but opinions differ:
http://www.cgchannel.com/2010/08/gpu-vs-cpu-rendering-talk-by-luxology/ [cgchannel.com]
https://vimeo.com/user3251535 [vimeo.com]
https://twitter.com/milansuk [twitter.com]
https://github.com/milansuk [github.com]
- fablefox
- Member
- 13 posts
- Joined: Nov. 2010
- Offline
Thanks for the video, especially since it's from Luxology (I plan to use Modo as my modeller for non-hero models).
What's great about the video is that now I know there's a personal render farm from BOXX (there was a personal render farm that a certain renderer tried to push, but later backed out of due to certain issues).
Anyway, thanks for the answer; the video allowed me to learn something new, especially the shallow/deep aspect of CPU vs. GPU.
- fsimerey
- Member
- 279 posts
- Joined: Dec. 2009
- Offline
jeff
..Nvidia seems to be the best for all platforms at this given time with regards to drivers (Linux especially).
…
About Linux, which distribution do you recommend?
A friend told me that the SideFX courses in L.A. are done on Ubuntu (9.04? 10.04? I don't know).
I have 2 options:
1- I have a big Mac Pro (8 cores with HT - Nvidia GTX 285 - RAID card with 4 disks in RAID 0 - 24GB RAM), but as always with Apple, video cards & video drivers are still stuck in the 90's. Mac OS X Lion will support OpenGL 3, unbelievable!
But I'm not sure about the future of this system. Between a big iPad and a big workstation, I prefer the workstation, for sure.
2- I have a home-made computer with Debian 5 (lenny), a Core i7 950, an ATI Radeon 4890, and 2x OCZ Vertex SSDs (RAID 0) for the system.
And I want to buy a new graphics card. At first I was going to buy a Quadro 4000 Mac edition to use a 3rd screen, but after reading tests, the Quadro 4000 seems about equal to the GTX 285 in 3D software. The only difference is the 2GB of GPU RAM.
So I think the best choice will be to run Houdini on Linux with a GeForce GTX 560 or 580. But on Ubuntu (9.04? 10.04?), or do I stay on my Debian, or do I keep hoping that one day Apple will make it possible to do big 3D on a Mac? (20 years of waiting… is looonng)
Thanks for your help !
- uniqueloginname
- Member
- 330 posts
- Joined: July 2005
- Offline
I say this as a MacBook owner: Mac desktops are terrible value for money. You would get so much more power if you built a PC and put Ubuntu on it.
I don't think the choice of distro matters so much (e.g. unless you have customised software using the HDK which may require particular versions of gcc).