Hello Wizards!
I'm in a real dilemma over whether I should buy the just-announced Radeon VII or an RTX 2070/2080. Given my budget, the Radeon VII looks like a great opportunity to grab 16 GB of VRAM for the price, and I've read decent articles about the Vega 64 and its OpenGL capability (the Radeon VII will be an upgraded Vega, if I understand it right).
Do any of you have experience with Vega GPUs? I'm planning to stick with Houdini as my main 3D platform; since version 17 I've come to see it as a great modeling tool, too.
My software pack:
- Houdini
- Maya, 3ds Max, Blender
- ZBrush, Mudbox
- Substance (Designer, Painter)
- PS
- DaVinci Resolve
What I want to focus on:
- ZBrush, Substance (Designer, Painter), HOUDINI, Blender (and PS and DaVinci Resolve, of course)
But mostly I want to focus on Houdini: modeling, simulation, UVing, retopo, rigging, and animation, all in Houdini.
It may seem a bit off topic to mention my software pack, but I can imagine it could influence suggestions.
- I know I could benefit from the Radeon VII's memory
- I do not know how much I would lose from the complete lack of CUDA cores
- I have no clue how AMD's drivers work with Houdini
- If I look at the GPU list officially supported by SideFX, I can see an R9 Fury, a 390, a 480, a bunch of FirePros, and a Radeon Pro, but I don't see any Vega GPUs. Is there a reason, or are they just not officially tested?
Thanks to anyone with personal experience of this question who can answer.
(edited: grammar)
AMD GPUs? Experiences?
- tardigrad3
- Member
- 19 posts
- Joined: Oct. 2018
- Offline
- filipw
- Member
- 138 posts
- Joined: March 2018
- Offline
I use two Vega 64 cards in eGPUs with my MacBook Pro. The viewport in Blender and Houdini works well as far as I'm concerned, and it's at least a massive difference from the built-in graphics.
It should be even better on Windows, where the drivers allow more features.
I don't have actual performance numbers, or whether a 2080 would be better, so it would be interesting to hear about that.
Potentially the Vega VII will be very good for simulations, and it's probably awesome for video work. But who knows? Buy one and report back, please!
- Houdini Obsession
- Member
- 58 posts
- Joined: Nov. 2014
- Offline
tardigrad3 wrote: (original post quoted above)
I suggest you ask the same question in the Report a Bug / RFE form for support from the developers or the customer support team; they are quick and helpful.
Houdini FX Artist (Build)
- anon_user_37409885
- Member
- 4189 posts
- Joined: June 2012
- Offline
tardigrad3 wrote:
- I do not know how much I would lose from the complete lack of CUDA cores
You mainly lose in rendering. Houdini is for the most part graphics-card agnostic, though H17 Mantra has OptiX denoising that only runs on CUDA. Most people use Redshift to speed up rendering, and that also won't run on AMD hardware. Not being able to speed up your rendering workflow is a pretty big issue when Mantra is simply too slow.
Everything points to H17.5 not speeding up Mantra, with the possible exception of Intel's Embree tech, which uses the CPU to speed up raytracing.
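For anyone who wants to check what their own build exposes before deciding, here is a minimal sketch for Houdini's Python shell. The hou calls are standard API, but the node path "/out/mantra1" is just an example and the exact denoiser parameter name varies between builds, so the sketch searches for it rather than hard-coding one.

```python
# Minimal sketch: list Mantra's denoise-related parameters.
# "/out/mantra1" is an example path; point it at your own Mantra ROP.
import hou

rop = hou.node("/out/mantra1")
if rop is not None:
    for parm in rop.parms():
        # The denoiser parm name differs across builds, so search for it.
        if "denoise" in parm.name().lower():
            print(parm.name(), "=", parm.eval())
```

If the OptiX option shows up but you are on an AMD card, expect it to be unavailable, since it runs on CUDA.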
- tardigrad3
- Member
- 19 posts
- Joined: Oct. 2018
- Offline
nAgApAvAn wrote: (the suggestion above to ask the developers or customer support)
Thanks so much for your suggestion, I will do that!
- tardigrad3
- Member
- 19 posts
- Joined: Oct. 2018
- Offline
fuos wrote: (the reply above about losing CUDA-based rendering)
Hello Fuos!
Thanks for your answer! I had no clue that Mantra uses CUDA cores for denoising - thanks for the info! In that case it seems not that logical to go for the Radeon (I thought Mantra used just the CPU). I know about Redshift and Octane (Octane was my favorite GPU renderer, so yeah, if I take the green road again, I will end up there).
It seems a lot points in the direction of an RTX card - I'm a bit sad about that; I doubt I'll be able to reach 16 GB of VRAM in the near future outside the Radeon VII.
Unfortunately AMD ProRender can't do much with Houdini simulations; I'm not sure it can even handle volumes or particles (I'm not even sure there is a ProRender-to-Houdini plugin). The same goes for LuxRender/LuxCore; I doubt they even have a Houdini plugin. I will google for some info on whether it's possible to render Houdini simulations in LuxCore standalone.
I wanted to build a full AMD workstation, but the more info I gather on my software pack, the more it turns out I will end up buying a 9900K with an RTX card. To be honest, I'm starting to feel sorry for AMD.
- sanostol
- Member
- 577 posts
- Joined: Nov. 2005
- Offline
Aha, that's cool to know, thanks for the tip.
How does it work on animation? What happens on pure CPU-based render nodes?
- habernir
- Member
- 94 posts
- Joined:
- Offline
tardigrad3 wrote: (the reply above, “…it seems not that logical to go for the Radeon…”)
Look at this: https://www.youtube.com/watch?v=xVCE09flu94 [www.youtube.com]
Just to remind you, Mantra isn't a GPU-based renderer, and it's only a matter of time until Houdini has an OpenCL renderer (I hope).
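Worth noting: Houdini already uses OpenCL for simulations (pyro, FLIP), and you can steer which device those sims run on. The sketch below assumes the documented HOUDINI_OCL_DEVICETYPE and HOUDINI_OCL_VENDOR environment variables; verify the exact names and vendor strings against your build's docs.

```python
# Rough sketch: launch Houdini with OpenCL sims pointed at a GPU.
# Env var names are assumed from the Houdini docs; verify for your build.
import os
import subprocess

env = os.environ.copy()
env["HOUDINI_OCL_DEVICETYPE"] = "GPU"
# Vendor string as reported by the OpenCL driver; AMD's usual string below,
# NVIDIA's is "NVIDIA Corporation".
env["HOUDINI_OCL_VENDOR"] = "Advanced Micro Devices, Inc."

subprocess.run(["houdini"], env=env)  # assumes "houdini" is on the PATH
```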
- anon_user_37409885
- Member
- 4189 posts
- Joined: June 2012
- Offline
sanostol wrote:
Aha, that's cool to know, thanks for the tip.
How does it work on animation? What happens on pure CPU-based render nodes?
OptiX works really well; it looks like an organic JPEG. CPU-only renders just skip the filter. Interestingly, we still find Neatnoise invaluable, as OptiX will not fix all noise, and it has amazing inter-frame technology that OptiX lacks. It's a good combination.
- tardigrad3
- Member
- 19 posts
- Joined: Oct. 2018
- Offline
habernir wrote:
Just to remind you, Mantra isn't a GPU-based renderer, and it's only a matter of time until Houdini has an OpenCL renderer (I hope).
Yes, I know that Mantra is a CPU-based renderer.
Thanks for the YouTube link; yes, I saw that test before. Funny, it seems the phrase “nomen est omen” works not just for names but also for colors: pyro - Threadripper, pyro - Radeon.
I wish you were right about an OpenCL-based Mantra, but I haven't heard even rumors about it… and the Radeon VII arrives in a week, so if I want to buy it, I need to see a clear, viable way to use it in my pipeline with some benefit over an Nvidia GPU. Houdini has Redshift (Nvidia), has Octane (Nvidia), has Arnold (most likely CUDA-only, judging by their last demo show). So basically there is no GPU-based render plugin for Houdini that can utilize OpenCL.
And now even Apple has messed up the scene with their “Metal” API.
And what's up with Radeon ProRender, we should ask… well… that's a mystery… it would be AMD's chance to set up competition in the content-creator segment of the GPU market, but they have messed it up so far.
- tardigrad3
- Member
- 19 posts
- Joined: Oct. 2018
- Offline
fuos wrote:
OptiX denoiser
And another Nvidia tech… poor AMD.
Also, I just read a quite hidden line that was implemented with v17:
“Miscellaneous changes: improved Unified Noise, new Python API for custom viewport interaction
Other changes include updates to the Unified Noise VOP, which gets a new Periodic Noise system, plus initial 64-bit support and support for AVX.”
Well, AVX can make a huge difference on the Threadripper vs. Core X playing field:
https://www.anandtech.com/bench/product/2265?vs=2283 [www.anandtech.com]
It seems AMD does well when it's about raw horsepower, but when they need to back it up with tech, they cannot counter Intel or Nvidia.
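If you want to know whether your own CPU even advertises AVX before expecting that speedup, here is a tiny Linux-only check; it just reads /proc/cpuinfo (on Windows a tool like CPU-Z shows the same flag):

```python
# Tiny Linux-only sketch: does this CPU advertise AVX?
def has_avx():
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                return "avx" in line.split()
    return False

print("AVX available:", has_avx())
```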
- habernir
- Member
- 94 posts
- Joined:
- Offline
I think SideFX will never commit to one company (NVIDIA), so if they develop a GPU renderer or GPU-based dynamics nodes, the chances are it will be OpenCL and not CUDA.
AMD is positioning the Radeon VII not just for gaming but also for DCC software, and AMD claims 30% better performance in DCC, so I don't know how that impacts Houdini or whether it's true.
Don't forget you can export your project to other software (if you are planning to do that) and render there.
And Redshift is also developing for OpenCL (that's what they said, but it will take time).
But if you want to use Redshift today, then if you have the money, go for a 2080 Ti or a 2080.
Edited by habernir - Jan. 24, 2019 09:20:53
- Charles Kirk
- Member
- 38 posts
- Joined: Aug. 2011
- Offline
tardigrad3 wrote:
https://www.anandtech.com/bench/product/2265?vs=2283 [www.anandtech.com]
I am considering building an AMD TR2 workstation based on the 2950X and have done quite a bit of research into this.
In your Intel vs. AMD comparison the AMD Threadripper CPU is almost half the price of the Intel CPU, so it is not a fair comparison, at least for the price-performance ratio.
A better comparison would be:
Anandtech CPU 2019 Benchmarks [www.anandtech.com]
Overall the AMD Threadripper CPUs have a much better price-performance ratio than the Intel CPUs.
On the GPU side, the current generation of AMD GPUs (Polaris and Vega) is much less power efficient and generates a lot more heat and noise than the NVidia GTX 10xx and RTX 20xx series.
They do, however, come in somewhat cheaper.
There is currently a glut of GPUs resulting from the collapse in demand for GPUs for mining crypto and the launch of the new NVidia RTX series, so it is possible to get NVidia GTX 10xx series GPUs at very attractive prices.
By the time software supports the ray-tracing features of the NVidia RTX series, NVidia will probably have released upgraded GPUs, which may well have 16 GB.
Edited by Charles Kirk - Jan. 25, 2019 09:35:09
- tardigrad3
- Member
- 19 posts
- Joined: 10月 2018
- Offline
penboack wrote: (the workstation reply quoted above)
Yes, you are right about the power and heat of AMD Vega. But still… it's such an opportunity to buy a 16 GB GPU with 1 TB/s memory bandwidth at that price, with such great computing power! I'm constantly reading up on AMD's ProRender and trying to convince myself that it actually works well and is ready to use (though of course I know it's not even close to Octane or Redshift).
I 100% agree that Threadrippers are the best price/performance option in a workstation! No doubt about it, given the surreal, crazy price range of Intel's X CPUs.
But one thing is important, and I'm honestly crossing my fingers that AMD will handle it with the 3000 series: they perform way below Intel in single-core workflows. Every extrude, move, rotate, scale, bevel, chamfer… first goes through a single CPU core, and only then reaches the GPU, which sends it to the display. So for active modeling we still need good single-core performance. And that's not even mentioning other software like ZBrush (which I use a lot), where despite its excellent multithreading support a TR 1950 only reached the same ZBench score as a six-year-old i5.
That said, I should also share that I'm absolutely aiming for a TR 3000. I probably won't have the money to buy one (they will be close to Intel X prices at launch, IMO: more cores, but in the 1500+ price range), but I will wait for its new-gen motherboards and drop a 1920 in one that I can upgrade over time.
Thanks so much for sending that link. You are right, if we compare these CPUs at the same price point, that's what we get; but be careful there, most of AT's benchmark suite is multithread-based, so it will show a skewed result when comparing a 12-core CPU to a 16-core one (24 threads vs. 32 threads). Anyway, your decision is great! The 2950X is a superior CPU! By the way, is it urgent? Can you wait for the new-gen motherboards? Even if you drop a 2950X in it, I heard the new TR motherboards will be more than a simple PCIe 4.0 upgrade - worth waiting for, IMO.
- Charles Kirk
- Member
- 38 posts
- Joined: Aug. 2011
- Offline
Moving from my current systems (two laptops, one based on an Intel 2nd-generation i7 4C/8T CPU running Windows 10, the other an Intel 3rd-generation i7 4C/8T MacBook Pro) should give me around five times better multi-core performance.
I don't find that single-core performance has much impact on my workflows, so that's not a main consideration for me.
The main bottlenecks in my current workflow are making test renders in Planetside Terragen, where better multi-core performance would greatly speed things up by reducing test render times by around a factor of five, and working in Unreal Engine.
I don't use ZBrush, so I can't comment on ZBrush performance on TR; I evaluated it in the past and didn't get on with it. I'd probably look at 3DCoat if I wanted to do sculpting.
In general I only make decisions based on hardware that has been released and for which test results are available with software that I use, or software I would expect to perform similarly, so I tend to ignore all the rumours!
- bsavery
- Member
- 8 posts
- Joined: June 2018
- Offline
Hi Tardigrad (and others).
I should mention, in full disclosure, that I work for AMD on the ProRender software. The fact that we're looking at this forum should give you a hint. There is currently no Houdini-to-ProRender plugin, but we are looking to enable the workflow through USD. We've had a USD → ProRender plugin for a while: https://www.amd.com/en/support/kb/release-notes/rn-prorender-usd-v0-8-5 [www.amd.com]
Also, with regard to particles and volumes, we can actually do those quite well.
For users who are on the USD workflow in Houdini and interested in testing the GPU renderer, we'd love to talk and get testers!
Brian
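For anyone trying the USD route Brian describes, the standard pxr Python bindings are a quick way to sanity-check an export before handing it to the ProRender USD plugin. "shot.usda" below is a made-up example path.

```python
# Sketch: list the prims in a USD export, e.g. to confirm that volumes
# and point instancers made it out of Houdini. "shot.usda" is an example.
from pxr import Usd

stage = Usd.Stage.Open("shot.usda")
if stage:
    for prim in stage.Traverse():
        print(prim.GetPath(), prim.GetTypeName())
```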