Python's multiprocessing

Member · 209 posts · Joined: Nov. 2010
Hi,

I found that Python multiprocessing does not start from the Houdini UI but does work from hython.
For example, this small code snippet:
from multiprocessing import Process
from time import sleep

def foo():
    print("Processing is running")
    sleep(3)
    print("Done")

t = Process(target=foo)
t.start()
print("Is Process alive:", t.is_alive())
t.join()

The same code works as expected (prints the debug lines and the process is alive) in a hython session, but doesn't work in the Python Shell window of a running Houdini UI session.

Python threading works in both cases, but in my circumstances I'm using the multiprocessing approach, and at the very least it would be nice to know why it doesn't work from the Houdini UI (H19.0.546).
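For what it's worth, the multiprocessing docs require the entry point to be protected by an `if __name__ == "__main__":` guard when the spawn start method is used (the default on Windows and macOS). Below is a sketch of the same experiment written that way for a plain interpreter; inside Houdini's embedded Python the module name and executable differ, which is what the rest of this thread runs into:

```python
from multiprocessing import Process, Queue
from time import sleep

def foo(q):
    # runs in the child process; reports back through the queue
    print("Process is running")
    sleep(0.1)
    q.put("Done")

if __name__ == "__main__":
    q = Queue()
    t = Process(target=foo, args=(q,))
    t.start()
    print("Is Process alive:", t.is_alive())
    print(q.get())  # blocks until the child has put its result
    t.join()
```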
Member · 2630 posts · Joined: June 2008
Weird: when I place that code into the Python Window, another copy of Houdini is launched, which reports a multiprocessing error. H19.0.498.
Edited by Enivob - March 23, 2022 08:51:32
Using Houdini Indie 20.0
Windows 11 64GB Ryzen 16 core.
nVidia 3050RTX 8GB RAM.
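The "another copy of Houdini launches" symptom is consistent with how the spawn start method works: multiprocessing re-executes sys.executable to create the child, and inside a Houdini session that is presumably the Houdini binary itself. A hedged sketch of the `multiprocessing.set_executable` escape hatch (outside Houdini the call below is a no-op, since it just re-sets the default):

```python
import multiprocessing
import sys

def say_hello(q):
    # trivial worker so the child has something to report
    q.put("hello from child")

if __name__ == "__main__":
    ctx = multiprocessing.get_context("spawn")
    # Outside Houdini this is a no-op; inside Houdini you would pass the
    # path to a standalone python/hython binary instead (an assumption,
    # not something verified from within a Houdini session here).
    ctx.set_executable(sys.executable)
    q = ctx.Queue()
    p = ctx.Process(target=say_hello, args=(q,))
    p.start()
    print(q.get())
    p.join()
```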
Member · 209 posts · Joined: Nov. 2010
Can you please provide the error you got?
Member · 9 posts · Joined: Aug. 2014
I'm trying to use multiprocessing to submit to Deadline. When running your code, my error says "Load failed for C:/Windows/System32/6344".
It also launched a new Houdini session, which printed ('Is Process alive: ', True) after I closed out of the new Houdini.
From reading the docs, I think you need to execute it in __main__, not in the session:

if __name__ == "__main__":
    # then create the process

Unfortunately, __main__ has to be executed in the Python shell. If you figure out how to put HDA code, for example, into __main__, that would be really helpful, so I'd know how to do process checking before submitting to Deadline.
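A small sketch of the guard idea above. In a regular interpreter the top-level module is named "__main__", but embedded shells can report a different name (a later post in this thread uses "builtins" for Houdini's Python Shell; treat that value as an observation, not a guarantee):

```python
def main():
    # hypothetical entry point: whatever multiprocessing setup you need
    return "entry point ran"

# Accept both the standard name and the embedded-shell name, so the same
# snippet can be pasted into either environment.
if __name__ in ("__main__", "builtins"):
    print(main())
```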
Edited by Tyler Strasser2 - June 2, 2022 17:01:05
Member · 23 posts · Joined: March 2020
Just picking this up again: has anyone figured this out yet?

I'd love to multiprocess a task inside a Python LOP and can't get it to work. The same code that works in PyCharm throws errors inside Houdini:

import concurrent.futures

def run_multiprocessed():
    with concurrent.futures.ProcessPoolExecutor() as executor:
        ls = range(10)
        procs = [executor.submit(pow, i, 2) for i in ls]

        for k, v in zip(ls, procs):
            print(k, v.result())

if __name__ == "builtins":
    run_multiprocessed()

This code, run from a shelf button, throws this error: concurrent.futures.process.BrokenProcessPool: A process in the process pool was terminated abruptly while the future was running or pending.

Running it in a Python LOP opens a dialog saying a crash file has been saved, even though my Houdini hasn't crashed. I reckon it's the other process spawned by multiprocessing that's crashing and causing the crash message.

Trying to keep it simpler and sticking with the multiprocessing module, my entire Houdini session freezes and never comes back:

from multiprocessing import Process, Queue

def pow_it(q, i, p):
    q.put(pow(i, p))


def run_multiprocessed():
    q = Queue()

    ls = range(10)
    procs = [Process(target=pow_it, args=(q, i, 2)) for i in ls]

    for k, v in zip(ls, procs):
        v.start()
        print(k, q.get())
        v.join()


run_multiprocessed()

How can I do multiprocessing inside python in Houdini?
Member · 23 posts · Joined: March 2020
Alright, so I at least got it to run without an error with this:

import concurrent.futures
import multiprocessing

def run_multiprocessed():
    with concurrent.futures.ProcessPoolExecutor(mp_context = multiprocessing.get_context("spawn")) as executor:
        ls = range(10)
        procs = [executor.submit(pow, i, 2) for i in ls]

        for k, v in zip(ls, procs):
            print(k, v.result())

if __name__ == "builtins":
    run_multiprocessed()

Now I want to pass a custom function, of course, but I can't get that to work:
import concurrent.futures
import multiprocessing

def pow_it(a, b):
    return pow(a, b)

def run_multiprocessed():
    with concurrent.futures.ProcessPoolExecutor(mp_context = multiprocessing.get_context("spawn")) as executor:
        ls = range(10)
        procs = [executor.submit(pow_it, i, 2) for i in ls]

        for k, v in zip(ls, procs):
            print(k, v.result())

if __name__ == "builtins":
    run_multiprocessed()

Edit: to clarify what's not working, I'm getting a pickle error: _pickle.PicklingError: Can't pickle <function power_it at 0x7fd70adfcb90>: attribute lookup power_it on __main__ failed
Edited by florens - Feb. 29, 2024 02:30:46
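A small sketch of what that PicklingError means: ProcessPoolExecutor pickles the submitted callable by module and qualified name, so the worker process must be able to import it. A builtin like pow is importable everywhere; a function defined in a shell-like namespace (a lambda stands in for one here) has no importable name:

```python
import pickle

# pow lives in the builtins module, so pickling it by reference works
assert pickle.loads(pickle.dumps(pow)) is pow

# a lambda has no importable name, so pickle refuses it
f = lambda a, b: pow(a, b)
try:
    pickle.dumps(f)
except Exception as exc:
    print(type(exc).__name__)
```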
Member · 23 posts · Joined: March 2020
Alright, that working pow example got me on the right track (somehow). If I save my multiprocessed script inside a directory that is on sys.path and then simply reference it as a module, everything works as expected. Of course that makes on-the-fly scripts a touch annoying, but you could argue that once you dive into multiprocessing, it should be a piped process anyway.

Edit: just to add, you don't even have to set the multiprocessing context to spawn this way; the default fork works just fine.

Edit 2: I found that you do actually need to set the context to spawn. For whatever reason, if you leave it as fork, the first process always runs fine, but running it again with a Python panel open errors; without a Python panel open it still works. Setting the context to spawn fixed that.
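The fix described above can be sketched outside Houdini like this: put the worker in its own module file in a directory on sys.path, and hand the imported function to a spawn-context ProcessPoolExecutor. The temp-file writing below only simulates "a script in a directory known by sys.path"; in practice you would keep the worker module in a real package:

```python
import concurrent.futures
import importlib
import multiprocessing
import os
import sys
import tempfile

# simulate a worker module living in a directory that is on sys.path
_tmp = tempfile.mkdtemp()
with open(os.path.join(_tmp, "mp_worker.py"), "w") as f:
    f.write("def pow_it(a, b):\n    return pow(a, b)\n")
if _tmp not in sys.path:
    sys.path.insert(0, _tmp)

mp_worker = importlib.import_module("mp_worker")

def run_multiprocessed():
    # spawn avoids the fork-related instability reported above;
    # the workers can import mp_worker because its directory is on sys.path
    ctx = multiprocessing.get_context("spawn")
    with concurrent.futures.ProcessPoolExecutor(mp_context=ctx) as executor:
        procs = [executor.submit(mp_worker.pow_it, i, 2) for i in range(10)]
        return [p.result() for p in procs]

if __name__ == "__main__":
    print(run_multiprocessed())
```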
Edited by florens - March 14, 2024 02:05:18
Member · 2 posts · Joined: Oct. 2022
I've been looking at trying to multiprocess myself, and I'm a little confused how if __name__ == "builtins": is working, since when I attempt it myself it throws the same errors I'd expect, given it's not running from __main__.

The way I've gotten around it is using subprocess and pickle to run the multiprocessing through the Python interpreter, passing it the function and args from within Houdini. It works but feels a bit messy.

Another problem with doing it this way is that there's no way to pass it any hou-based functions, since it's no longer running in the current session.

It would be great to get a bit more detail.
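A sketch of the subprocess-plus-pickle workaround described above. The helper name run_in_plain_python and the idea of shipping the function as source text are my assumptions, not the poster's exact code; inside Houdini you would point python_exe at a standalone Python or hython binary instead of sys.executable, and, as noted, the child has no hou module from the live session:

```python
import os
import pickle
import subprocess
import sys
import tempfile

def run_in_plain_python(func_source, func_name, args, python_exe=sys.executable):
    """Run func_name(*args) in a separate interpreter.

    func_source is the function's source text (so the child does not need
    to unpickle a function defined in Houdini's shell namespace); args and
    the return value must be picklable.
    """
    with tempfile.TemporaryDirectory() as tmp:
        args_path = os.path.join(tmp, "args.pkl")
        out_path = os.path.join(tmp, "out.pkl")
        with open(args_path, "wb") as f:
            pickle.dump(args, f)
        # build a tiny driver script: define the function, load args,
        # call it, and pickle the result back to disk
        driver = (
            "import pickle\n"
            + func_source + "\n"
            + f"args = pickle.load(open({args_path!r}, 'rb'))\n"
            + f"pickle.dump({func_name}(*args), open({out_path!r}, 'wb'))\n"
        )
        subprocess.run([python_exe, "-c", driver], check=True)
        with open(out_path, "rb") as f:
            return pickle.load(f)

if __name__ == "__main__":
    src = "def pow_it(a, b):\n    return pow(a, b)"
    print(run_in_plain_python(src, "pow_it", (3, 2)))
```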
Member · 621 posts · Joined: Aug. 2008
Very interesting multiprocessing/multicore examples, not just for Houdini but in general.
Member · 14 posts · Joined: Jan. 2017
If Houdini's Python build doesn't have the global interpreter lock (GIL) disabled (I think it's officially planned to become optional in 3.13), Python itself can't have multiple threads interpreting bytecode at once. Aside from interpreted Python being slower (the interpreter doesn't come anywhere near something like .NET in speed), this is why most libraries that do heavy parallel work, like NumPy and Torch, have the majority of those parts written in C++, which has thread-safe memory management, or JIT-compile the Python for the target, as Torch does with all backends including CPU, AFAIK.

Apparently 3.10 can be built without the GIL and everything generally works, but since it's such a major change they have been testing it heavily and will probably add some support functionality before turning it off by default. IronPython and Jython were the only interpreters not affected by this, and C code called from CPython can release the GIL. See:
https://wiki.python.org/moin/GlobalInterpreterLock [wiki.python.org]
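A tiny way to see the GIL point above for yourself: CPU-bound, pure-Python work does not get faster with threads, which is why this thread keeps reaching for multiprocessing instead. Timings are machine-dependent, so treat this strictly as a sketch; the only expectation on a conventional GIL build is that two threads are not meaningfully faster than the serial run:

```python
import threading
import time

def count_down(n):
    # pure-Python, CPU-bound loop: holds the GIL the whole time
    while n:
        n -= 1

N = 2_000_000

start = time.perf_counter()
count_down(N)
count_down(N)
serial = time.perf_counter() - start

start = time.perf_counter()
threads = [threading.Thread(target=count_down, args=(N,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
threaded = time.perf_counter() - start

# on a GIL build the threaded version is typically no faster than serial
print(f"serial: {serial:.3f}s  two threads: {threaded:.3f}s")
```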