Any Houdini for Windows x64?

- daskog

I want to download Apprentice Houdini for Windows x64, if possible, but I cannot see it listed.
I have some other questions too. Is Houdini in the same boat as XSI is? My only problem so far with XSI has been that XSI does not fully integrate with Windows x64 yet - Python, shaders and so on need to be recompiled. Is it the same situation with Houdini? BTW, I'm considering trying harder to get it to work with SUSE 10 or maybe Debian, but I still have Windows x64 installed, so I would like to try that as well.
BTW, I have found Windows XP x64 to be the most stable of the Windows series so far ;P if there is such a thing with Win =P
Cheers
- JColdrick
AFAIK no Win64 builds are being posted (or created) yet.
Well, I'm not sure I'd call that lacking in integration… if something relies on running code compiled for 64-bit, then you'll need to ensure it is. It's a common issue when 64-bit and 32-bit systems are mixed - you typically need to fork compiled code so that the right system uses the right files. However, I'm a little confused about the Python and shader references… Python is a text scripting language - I run the same scripts on 64-bit Linux as on 32-bit… it's the executable that is important. Also, AFAIK compiled shaders should work just fine. It's typically stuff like home-brew compiled DSOs that give you grief, such as when using the Houdini Development Kit. Otherwise, all the VEX code works fine.
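As a quick illustration of that point (a minimal sketch, not Houdini-specific - any stock Python interpreter will do), the same script file runs unchanged on 32- and 64-bit systems; it's the interpreter executing it that has a bitness:

    import struct
    import sys

    # The size of a native pointer tells you the interpreter's bitness:
    # 4 bytes on a 32-bit build, 8 bytes on a 64-bit build.
    bits = 8 * struct.calcsize('P')
    print('Python executable: %s' % sys.executable)
    print('Interpreter is %d-bit' % bits)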
Mind you, perhaps you're talking about truly compiled shaders in XSI that are compiled against the system libraries… not sure. In that case, yes, you need to manage forked code. That will never go away. However, don't forget you can run the 32-bit version of the app, and all this is unnecessary. Is there a particular reason you want 64-bit? I know everyone "wants it", but have you measured any serious advantages? We haven't here, yet, although it's certainly the way things are going, no mistake.
Cheers,
J.C.
John Coldrick
- sascha

To have more than 4 GB is sometimes nice.
- peliosis
Oh, 4 GB is great! You can construct a scene so unoptimized that it can hardly move in the viewport and rendering takes light-years.
Just kidding, but Houdini really slows down with "complex" scenes. I built a complicated L-system tree of only around 300k polygons, and after reading it back from a .geo file I'm almost unable to use the Edit SOP on it; the Lattice SOP works like a frog in tar - horrible. SESI should have a look at XSI (in such particular cases).
Sorry for this kind of OT.
- old_school (Staff)
Moving objects around in a massive dataset: not bad.
Moving a single point around in a half-million-point dataset: very slow.
I just did a real quick simple bonehead test:
    Sphere SOP (poly, frequency=20)   Grid SOP (x=20, y=20)
                  \                     /
                   \                   /
                        Copy SOP
                           |
                        Edit SOP
Now try to move some points. Ouch.
With the same example in Soft, how fast is it to move points around?
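For reference, here is the same bonehead test as a minimal sketch using Houdini's Python hou module (which postdates this thread; the parameter names 'type', 'freq', 'rows' and 'cols' are assumptions based on current builds):

    import hou

    # Build the test network inside a fresh Geometry object.
    geo = hou.node('/obj').createNode('geo', 'copy_test')

    sphere = geo.createNode('sphere')
    sphere.parm('type').set('poly')   # polygon sphere
    sphere.parm('freq').set(20)       # frequency = 20

    grid = geo.createNode('grid')
    grid.parm('rows').set(20)
    grid.parm('cols').set(20)

    # Copy one sphere onto every grid point, then hang an Edit SOP off
    # the result and try pulling points.
    copy = geo.createNode('copy')
    copy.setInput(0, sphere)
    copy.setInput(1, grid)

    edit = geo.createNode('edit')
    edit.setInput(0, copy)
    edit.setDisplayFlag(True)
    geo.layoutChildren()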
There's at least one school like the old school!
- JColdrick
Nothing wrong with getting things optimized, but if I saw anyone trying to interactively edit points in a 3.2-million-primitive object on *any* system, I'd tell them they're going about things the wrong way. In Houdini especially, being completely procedural, there are a lot of ways to optimize the process.
I'd be curious how XSI would *procedurally* edit points in something that large, too.
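For instance, a "procedural edit" in Houdini could look like this minimal sketch for a Python SOP (again, the hou module is a modern assumption, not something from this thread): push every point above a height threshold upward, instead of pulling points by hand in an Edit SOP:

    # Runs inside a Python SOP; hou.pwd().geometry() is a writable copy
    # of the first input's geometry.
    node = hou.pwd()
    geo = node.geometry()

    for point in geo.points():
        pos = point.position()
        if pos[1] > 0.5:  # only touch points above y = 0.5
            point.setPosition(pos + hou.Vector3(0, 0.25, 0))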
Cheers,
J.C.
John Coldrick
- old_school (Staff)
Yes John, but then again you, for the most part, create and modify your own data. You only represent a small percentage of Houdini users in this regard.
What about the majority of Houdini users who need to deal with datasets from other applications in a different department in a large facility? Sure, it's expected that professional modellers will compartmentalize their models in such a way as to make any process efficient. Is it generally done in practice? Not that often.
This means that at some point you have to break down the geometry yourself, putting different parts of the single monolithic model into groups or objects and processing those pieces instead of one big whole. RenderMan and Mantra will love you for this.
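As a minimal sketch of that kind of breakdown (modern Python hou module assumed, and the '/obj/import_geo/file1' path is hypothetical), you could split a monolithic SOP into per-group pieces with Blast SOPs:

    import hou

    src = hou.node('/obj/import_geo/file1')    # hypothetical import node
    container = src.parent()

    # One Blast SOP per primitive group, each keeping only its own group,
    # so downstream tools only ever touch a small chunk of the model.
    for group in src.geometry().primGroups():
        blast = container.createNode('blast', 'piece_' + group.name())
        blast.setInput(0, src)
        blast.parm('group').set(group.name())
        blast.parm('negate').set(True)         # delete non-selected
    container.layoutChildren()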
What makes it possible to get huge datasets? The other applications can manipulate the points and primitives of large datasets directly and quickly, which leads to these huge monolithic models. You can't blame the modeller in this case - it obviously works, since so many are now doing this. They expect Houdini to be able to break down models quickly.
As I said, Houdini seems to be up to speed when manipulating large datasets at the object level. It is the editing of geometry that takes a huge hit right now.
I would find it really interesting if the Houdini community were to do some speed tests for us with different applications. Sounds like a task for OdForce. :wink:
There's at least one school like the old school!
- JColdrick
One word: proxies.
Honestly, Jeff, I can't say for certain, but our experience, even when importing data (and you're mistaken about our usage there - a huge percentage of the "filler" data we use is imported; it's not economical to reinvent the wheel), has been that most of it has *some* sort of structure, is grouped out in some manner, and the vast majority of the time can be broken out in a logical and manageable way. About the only time I don't see this is when you have a disagreeable client who grudgingly sends you crap data, or perhaps with scanned data (*shudder*), which is the very definition of unoptimized (although almost all scanning packages come with poly-optimizing proggies).
Having said that, I will never disagree with getting Houdini faster.
Houdini being procedural will always add a speed hit, I think. About the only solution to that is to optimize as much as possible and, if enough users really have a need for it, possibly create a solution "outside the procedural box" which is just a really fast poly-puller.
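To make the proxy idea above concrete, here is a minimal sketch (modern hou module assumed; the node names and the PolyReduce 'percentage' parameter are assumptions): let the viewport display a reduced stand-in while the render flag stays on the full-resolution geometry:

    import hou

    geo = hou.node('/obj/heavy_model')   # hypothetical object
    hires = geo.node('file1')            # hypothetical full-res source

    # Build a reduced viewport proxy next to the full-res chain.
    proxy = geo.createNode('polyreduce', 'viewport_proxy')
    proxy.setInput(0, hires)
    proxy.parm('percentage').set(10)     # keep roughly 10% of the polygons

    hires.setRenderFlag(True)            # mantra still renders the full model
    proxy.setDisplayFlag(True)           # the viewport only draws the proxy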
Cheers,
J.C.
John Coldrick
- VisualCortexLab
sascha: To have more than 4 GB is sometimes nice.
4 GB?… AFAIK the 32-bit limit is 2 GB. Or maybe I'm misunderstanding, and you're talking about 64-bit and the lack of machines that host 8 GB?
I'm also considering Linux x64 on my machine, because I get really depressed when Houdini crashes (and it does) because the memory limit is reached.
cheers
JcN
VisualCortexLab Ltd :: www.visualcortexlab.com
- Fabian
That's 4 GB of address space theoretically, but since some of it is used for components other than your RAM, you end up with only about 3-3.5 GB of RAM being recognized by the OS. And then, in WinXP, for an app to use more than 2 GB you have to launch Windows with the /3GB switch (in boot.ini).
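For reference, a typical XP boot.ini with that switch looks something like this (illustrative only - the multi/disk/partition numbers vary per machine), and note the application itself also has to be linked with /LARGEADDRESSAWARE before it can use the extra gigabyte:

    [boot loader]
    timeout=30
    default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
    [operating systems]
    multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /3GB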
As for Python in XSI needing to be 64-bit, you have to remember that XSI implements scripting languages via ActiveX. Softimage have made a 64-bit compile of ActivePython that works - at least it did after a few tries.