About XPU, Houdini 20 and Mac Ultra M2

User Avatar
Member
37 posts
Joined: Feb. 2017
Offline
oldteapot7
IMO it's a waste of time, programmer effort and human resources at SESI.
I'm not sure SideFX is actually putting that much time into it. I would love it if they made the M-series a higher priority; I don't think ARM even got a mention during the H20 launch keynote.

I prefer using Houdini on Apple Silicon - I really only use a PC for lookdev, specifically RTX rendering performance. ARM chips are plenty powerful enough for 3D content creation nowadays.
User Avatar
Member
35 posts
Joined: Oct. 2015
Online
Imagine if developers back in the day had said, "Why should I develop for Intel chips? I'm sticking with the 68000; that's good enough."

Chips are getting smaller and faster; that's the natural progression. I don't think it's realistic for a laptop to outperform a dedicated workstation just yet - they will always be a step behind - but it's quite phenomenal how powerful they have become after only two generations, IMO.

The M2 is as powerful as the 1950X I'm typing on. I can afford to run a sim for 20 minutes or so on the 1950X, so I could afford a couple of extra minutes on a laptop that can be taken anywhere. That's phenomenal, IMO.

Rule of thumb: if you don't need it, don't buy it. But it may open up more options if you do.
Edited by Pixelised - Nov. 8, 2023 07:08:51
User Avatar
Member
119 posts
Joined: Aug. 2015
Offline
oldteapot7
IMO it's a waste of time, programmer effort and human resources at SESI.

People fall too easily for Apple marketing and propaganda...

The type of 3D we usually do in Houdini - heavy simulations, instancing, rendering etc. - isn't what most people (or anyone) do on laptops. Yes, it's ridiculous to spend that much manpower on such a tiny group of users.
Edited by Sygnum - Nov. 8, 2023 07:56:00
User Avatar
Member
37 posts
Joined: Feb. 2017
Offline
Sygnum
The type of 3D we usually do in Houdini with heavy simulations, instancing, rendering etc. isn't what most or anyone does on laptops - ridiculous.
Theory Accelerated have done a brilliant job of optimising simulations on Mac GPUs using the Metal framework, as a plugin for Houdini. That's the kind of innovation I'd like to see within Houdini itself. Axiom Solver can even run on an iPad:

https://apps.apple.com/us/app/axiom-solver/id6443941087 [apps.apple.com]

It's incredible what can be achieved on highly portable, battery powered devices in 2023.
Edited by liberalarts - Nov. 8, 2023 08:10:22
User Avatar
Member
250 posts
Joined: May 2017
Offline
Sygnum
isn't what most or anyone does on laptops - yes, it's ridiculous to use too much manpower on such a tiny group of users.
Think of it this way: the number of people who actually have MacBook Pros or a Mac Studio Max/Ultra is in the millions, yet their rendering happens in Octane, Redshift and Cycles - simply because of the speed. Why say no to those people?

Btw, Houdini is not only about simulating super-large VFX. I've spent a lot of time doing motion-graphics-style ads - toasters flying around and letters looking cool - typical C4D bread and butter. You don't need infinite power there.
(Also, I'd wager an M3 Max would outperform gen 1 and gen 2 Threadrippers, so they are not exactly "toy" machines now, are they?)
https://twitter.com/oossoonngg [twitter.com]
User Avatar
Member
170 posts
Joined: Nov. 2015
Offline
oldteapot7
It looks like a toy CPU
oldteapot7
So by definition it can't be powerful
oldteapot7
but still, it's only an ARM processor

I am sorry, but calling people out for falling for Apple propaganda while clearly never having bothered to take an objective look at what Apple Silicon actually offers is quite a feat.
Edited by chf - Nov. 8, 2023 14:13:46
User Avatar
Member
33 posts
Joined: Jan. 2015
Offline
Wanted to jump in here and say that, as a long-time Mac user who has been using Houdini for 10 years, I would be super excited to see the native renderer (XPU) take full advantage of the M-series GPU. It's something Redshift has offered for a few years now, and it's fast - which is why I upgraded to the M1 Mac Studio (as opposed to going PC). I know the Mac is a smaller market for CGI, but when it works, it really does work. The problem always seems to be that the Mac falls behind on the development schedule, and sometimes off the cliff altogether. Also, I have to say it: just as the iPhone is getting flak for monopolizing cell phone use, I often resent "only works on NVIDIA hardware". Going out on a limb here, but I would drop my Redshift subscription in a heartbeat if I could render in XPU at the same speed or better.
User Avatar
Member
50 posts
Joined: Aug. 2021
Offline
For what it's worth, I've noticed that sims compute extremely fast on my M3 Max MacBook Pro, often nearly as fast as on my 64-core AMD Threadripper. I wonder what an M4 Max will be able to accomplish when grinding through sims - now if we could just get a Metal-based GPU Karma on the Mac!
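A concrete way to back up this kind of comparison is to time the same small workload on each machine. Below is a minimal, illustrative sketch - a naive pure-Python 2D diffusion stencil as a stand-in sim step. Nothing here comes from Houdini or this thread; the `diffuse` and `benchmark` names, grid size and step count are made-up assumptions for illustration only.

```python
import time

def diffuse(grid, steps, alpha=0.1):
    """Naive 2D heat-diffusion stencil: a tiny stand-in for a sim workload.
    grid is a list of lists of floats; boundary cells are left fixed."""
    h, w = len(grid), len(grid[0])
    for _ in range(steps):
        nxt = [row[:] for row in grid]
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                lap = (grid[y - 1][x] + grid[y + 1][x] +
                       grid[y][x - 1] + grid[y][x + 1] - 4 * grid[y][x])
                nxt[y][x] = grid[y][x] + alpha * lap
        grid = nxt
    return grid

def benchmark(size=64, steps=10):
    """Time the stencil; run the identical call on each machine to compare."""
    grid = [[0.0] * size for _ in range(size)]
    grid[size // 2][size // 2] = 1.0  # single point heat source in the middle
    t0 = time.perf_counter()
    grid = diffuse(grid, steps)
    return time.perf_counter() - t0, grid

if __name__ == "__main__":
    elapsed, _ = benchmark()
    print(f"diffusion benchmark: {elapsed:.3f} s")
```

Running the same script on two machines and comparing the printed times only illustrates the method; a meaningful comparison of an M3 Max against a Threadripper would of course time an actual Houdini sim, not this toy stencil.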
User Avatar
Member
10 posts
Joined: June 2020
Offline
Just ordered the M4 Max with all the cores. Looking forward to trying it out - just waiting for that XPU support so I can ditch Redshift.