python-multiprocessing

How to limit the number of parallel executions using Python multiprocessing – Queue(), Manager(), Pool()

How to limit the number of parallel executions using Python multiprocessing – Queue(), Manager(), Pool() Question: Hi all, I am struggling to limit the number of parallel executions in the below Python code using multiprocessing – in particular Queue(), Manager() and Pool(). My understanding was that multiprocessing.Pool(processes=2) would result in two workers running in parallel …

Total answers: 1

Parallel while loop with unknown number of calls

Parallel while loop with unknown number of calls Question: I have written a function that does some calculations and that returns a different result every time it is called, since it uses a different seed for the random number generator. In general, I want to run this function many times, in order to obtain many …

Total answers: 1

Does python multiprocessing work in Visual Studio Code?

Does python multiprocessing work in Visual Studio Code? Question: I’m having a problem where VS Code returns an error when I run my multiprocessing code. I’m pretty new to multiprocessing and VS Code, so I am completely in the dark on what to do to fix this error. I’ve looked up numerous tutorials on how …

Total answers: 1

In Python ProcessPoolExecutor, do you need to call shutdown after getting a BrokenProcessPool exception?

In Python ProcessPoolExecutor, do you need to call shutdown after getting a BrokenProcessPool exception? Question: In Python ProcessPoolExecutor, do you need to call shutdown after getting a BrokenProcessPool exception? Say I have something like this:

pool = ProcessPoolExecutor(max_workers=1)
try:
    return pool.submit(do_something).result()
except concurrent.futures.process.BrokenProcessPool as pool_exc:
    pool = None
    return None

Is it a bad idea …

Total answers: 1

Celery: Spawn "sidecar" webserver process

Celery: Spawn "sidecar" webserver process Question: I’m trying to collect metrics from my Celery workers, which seemed simply enough, but turns out to be utterly, ridiculously hard. After lots of approaches, I’m now trying to spawn an additional process next to the Celery worker/supervisor that hosts a simple HTTP server to expose Prometheus metrics. To …

Total answers: 2

MPIRE WorkerPool causes memory leak

MPIRE WorkerPool causes memory leak Question: I have a Python module with a function that runs in an infinite loop. Within this function I create a WorkerPool with the mpire library. I cannot use the standard multiprocessing library because I have to pass non-picklable objects to the worker functions (at least I have not yet found …

Total answers: 1

Job queue with dependencies with python multiprocessing

Job queue with dependencies with python multiprocessing Question: I have a function and a list of jobs: jobs = [[(2, ‘dog’), None], [(-1, ‘cat’), (0,)], [(-1, ‘Bob’), (1,)], [(7, ‘Alice’), None], [(0, ‘spam’), (2,3)]] I would like to apply the function to the arguments (first tuple) in parallel, while satisfying the dependencies on previous jobs …

Total answers: 1

Why does `Queue.put` seem to be faster at pickling a numpy array than actual pickle?

Why does multiprocessing.Queue.put() seem faster at pickling a numpy array than actual pickle? Question: It appears that I can call q.put 1000 times in under 2.5ms. How is that possible when just pickling that very same array 1000 times takes over 2 seconds?

>>> a = np.random.rand(1024,1024)
>>> q = Queue()
>>> timeit.timeit(lambda: q.put(a), number=1000)
…

Total answers: 1