Request: Guide to setting up a simple render farm for us Indie users

Member · 146 posts · Joined: Sept. 2011
I've got multiple machines - Macs and PCs - but I've struggled to understand the HQueue docs, and a step-by-step guide for one-man bands and small studios would be ace.

This was very simple with other 3D packages (just load the same scene on all machines, point them all at the same output folder, have them all skip existing frames, set them all going) but Mantra only seems to check for existing frames when you hit Go, and not after each frame is complete, so the machines all end up rendering the same frames as each other.
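For reference, the per-frame skip that Mantra doesn't do out of the box can be scripted in hython and run on each machine. A rough sketch, assuming a Mantra ROP at /out/mantra1 and a scene path on the NAS (both are made-up placeholders):

```python
# Per-frame "skip existing frames" loop, run with hython on each machine.
# The .hip path and ROP path are made-up placeholders for your own scene.
import os

import hou

hou.hipFile.load("//nas/projects/shot010/shot010.hip")  # shared scene on the NAS
rop = hou.node("/out/mantra1")  # hypothetical Mantra ROP

for frame in range(1, 241):  # frames 1..240
    # Expand the $F4-style output path for this frame.
    out = hou.expandStringAtFrame(rop.parm("vm_picture").unexpandedString(), frame)
    if os.path.exists(out):
        continue  # another machine already rendered (or claimed) this frame
    # Touch the file first so the other machines skip it. Crude and still
    # slightly racy, but usually good enough for a handful of machines.
    os.makedirs(os.path.dirname(out), exist_ok=True)
    open(out, "a").close()
    rop.render(frame_range=(frame, frame))
```

Touching the output file before rendering is a crude claim mechanism - two machines can still occasionally grab the same frame - but it keeps a small farm mostly working on different frames.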
Member · 258 posts
HQueue is actually really easy to set up, but it does require some networking knowledge. If your idea of "easy" is copying stuff to individual machines, pointing them at the same output folder, and rendering, then my guess is that you might need a little help and should watch some networking tutorials first.

At the very least you need one machine set up as the server and the other machines set up as clients; they all need to be able to see the same network share, and from what I remember there are some basic host and port settings. Opening HQueue itself is as simple as opening it in a browser. I have seen some janky HQueue set-ups that work, but every time I have set one up I have had, at a minimum, the following (a quick sanity check for these basics is sketched after the list):

1. A network controlled by a server with domain control
2. Machines set with static IP addresses
3. Properly shared network directories
4. Consistent naming across machines - not strictly necessary, but a good idea
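The sanity check mentioned above is not HQueue-specific, just a stdlib-only sketch to run on each client before debugging HQueue itself. The server hostname, port, and share path are placeholders to swap for your own; 5000 is HQueue's usual default port, but check your server config:

```python
# Quick sanity check to run on each client before debugging HQueue itself.
# The hostname, port, and share path are placeholders for your own setup.
import os
import socket

HQSERVER = "hqserver"   # hypothetical HQueue server hostname
HQPORT = 5000           # HQueue's usual default; confirm in your server config
SHARE = "//nas/hq"      # the network share every machine must be able to see

# 1. Can this client resolve and reach the server on the HQueue port?
try:
    with socket.create_connection((HQSERVER, HQPORT), timeout=5):
        print("server reachable on port", HQPORT)
except OSError as exc:
    print("cannot reach server:", exc)

# 2. Is the shared directory mounted and readable from this client?
print("share visible:", os.path.isdir(SHARE))
```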
Member · 146 posts · Joined: Sept. 2011
Hehe - well, my old approach was simplistic, but it wasn't quite as crude as copying files to all the machines.

With project files / textures / render bins all on network drives, setting the whole “farm” going was as simple as VNCing into each machine, double-clicking on the project file and hitting “Render”.

I'm not sure I have a "server with domain control"; I've got a bunch of Macs and PCs with static IPs, access to network folders on a NAS, and consistent naming, though. I make good use of $JOB and $HSITE, so it doesn't matter which machine I'm working on: they can all access the same assets (OTLs / maps etc.) the same way.
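(To spell out that trick for anyone else reading: nothing in the scene hard-codes a machine-specific path, so every client expands the same variables against its own mount of the share. A tiny hython illustration, with invented example values:)

```python
# Resolving assets through $JOB so the same .hip works on every machine.
# The variable value and texture path below are invented examples.
import hou

# On each machine, $JOB points at the same project root on the NAS
# (e.g. //nas/projects/shot010 on the PCs, /Volumes/projects/shot010 on the Macs).
print(hou.getenv("JOB"))

# Parameters reference assets as $JOB/tex/wall_diffuse.rat, and each
# client expands that against its own mount of the same share.
print(hou.expandString("$JOB/tex/wall_diffuse.rat"))
```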

Can't quite get my head around why, on a heterogeneous network (Macs and PCs), the clients need to be set up to run Houdini from a shared folder as opposed to a nice speedy local SSD, though. I'm guessing it's down to file system differences. You'd think each client would be able to use its own copy of Houdini to service jobs.

I'll have another go. Have to be a bit careful I don't sink too much time into it - gotta actually do some work too. Thanks for your advice.