Python threading multiple bash subprocesses?

How can I use the threading and subprocess modules to spawn parallel bash processes? When I start threads as in the first answer here: python multithreading for dummies, the bash processes run sequentially instead of in parallel.
You don't need threads to run subprocesses in parallel:

from subprocess import Popen

commands = [
    'date; ls -l; sleep 1; date',
    'date; sleep 5; date',
    'date; df -h; sleep 3; date',
    'date; hostname; sleep 2; date',
    'date; uname -a; date',
]
# run in parallel
processes = [Popen(cmd, shell=True) for cmd in commands]
# do other things here..
# wait for completion
for p in processes: p.wait()

To limit the number of concurrent commands, you could use multiprocessing.dummy.Pool, which uses threads and provides the same interface as multiprocessing.Pool, which uses processes:

from functools import partial
from multiprocessing.dummy import Pool
from subprocess import call

pool = Pool(2) # two concurrent commands at a time
for i, returncode in enumerate(pool.imap(partial(call, shell=True), commands)):
    if returncode != 0:
       print("%d command failed: %d" % (i, returncode))

This answer demonstrates various techniques to limit the number of concurrent subprocesses: it shows solutions based on multiprocessing.Pool, concurrent.futures, and threading plus a Queue.
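For example, a minimal sketch of the concurrent.futures variant (assuming the same commands list as above; the worker count is arbitrary) could look like:

from concurrent.futures import ThreadPoolExecutor
from functools import partial
from subprocess import call

# a sketch: run at most two commands at a time via a thread pool
with ThreadPoolExecutor(max_workers=2) as executor:
    for i, returncode in enumerate(executor.map(partial(call, shell=True), commands)):
        if returncode != 0:
            print("%d command failed: %d" % (i, returncode))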

You can also limit the number of concurrent child processes without using a thread/process pool:

from subprocess import Popen
from itertools import islice

max_workers = 2  # no more than 2 concurrent processes
processes = (Popen(cmd, shell=True) for cmd in commands)
running_processes = list(islice(processes, max_workers))  # start new processes
while running_processes:
    for i, process in enumerate(running_processes):
        if process.poll() is not None:  # the process has finished
            running_processes[i] = next(processes, None)  # start new process
            if running_processes[i] is None: # no new processes
                del running_processes[i]
                break

On Unix, you could avoid the busy loop and block on os.waitpid(-1, 0) to wait for any child process to exit.
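A rough sketch of that idea (Unix only, assuming the same commands list and max_workers as above) might look like the following; note that reaping a child with os.waitpid bypasses the Popen object's own bookkeeping:

import os
from itertools import islice
from subprocess import Popen

max_workers = 2
processes = (Popen(cmd, shell=True) for cmd in commands)
running = {p.pid: p for p in islice(processes, max_workers)}  # start the first batch
while running:
    pid, _ = os.waitpid(-1, 0)  # block until any child process exits
    if pid in running:
        del running[pid]
        p = next(processes, None)  # start a replacement process, if any are left
        if p is not None:
            running[p.pid] = p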

http://stackoverflow.com/questions/14533458/python-threading-multiple-bash-subprocesses
