From subprocess.Popen to multiprocessing

Question:

I have a function that invokes a process using subprocess.Popen as follows:

    def func():
        ...
        process = subprocess.Popen(substr, shell=True, stdout=subprocess.PIPE)

        timeout = {"value": False}
        timer = Timer(timeout_sec, kill_proc, [process, timeout])
        timer.start()

        for line in process.stdout:
            lines.append(line)

        timer.cancel()
        if timeout["value"]:
            return 0
        ...

I call this function from another function in a loop (e.g. over range(1, 100)). How can I make multiple calls to the function with multiprocessing, so that several processes run in parallel at a time?

The processes don't depend on each other; the only constraint is that each process works on exactly one index (e.g. no two processes will work on index 1).

Thanks for your help

Asked By: Dor Cohen


Answers:

Just pass the index along to your Popen call and create a worker pool with as many workers as you have CPU cores available.

    import multiprocessing
    import subprocess

    def func(index):
        ...
        process = subprocess.Popen(substr + " --index {}".format(index), shell=True, stdout=subprocess.PIPE)
        ...

    if __name__ == '__main__':
        p = multiprocessing.Pool(multiprocessing.cpu_count())
        p.map(func, range(1, 100))
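
Putting the question's timeout pattern and the pool together, here is a self-contained sketch. The `kill_proc` helper, the `TIMEOUT_SEC` value, and the `echo` command are assumptions standing in for the question's real helper and command:

    import subprocess
    from multiprocessing import Pool, cpu_count
    from threading import Timer

    TIMEOUT_SEC = 5  # assumption: adjust to your workload

    def kill_proc(process, timeout):
        # Hypothetical helper mirroring the one in the question:
        # record that the timeout fired, then kill the child process.
        timeout["value"] = True
        process.kill()

    def func(index):
        # Each pool worker launches its own child process for one index.
        # "echo" stands in for the real command (substr) in the question.
        process = subprocess.Popen(
            "echo index {}".format(index),
            shell=True, stdout=subprocess.PIPE,
        )

        timeout = {"value": False}
        timer = Timer(TIMEOUT_SEC, kill_proc, [process, timeout])
        timer.start()

        lines = []
        for line in process.stdout:
            lines.append(line)

        timer.cancel()
        process.wait()
        if timeout["value"]:
            return 0
        return index  # placeholder result; the real function does more here

    if __name__ == '__main__':
        with Pool(cpu_count()) as pool:
            results = pool.map(func, range(1, 100))

Note that `func` must be defined at module level so the pool can pickle it, and the `if __name__ == '__main__':` guard is required on platforms that spawn (rather than fork) worker processes.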
Answered By: Maximilian Peters