Non-blocking subprocess.call
Problem description
I'm trying to make a non-blocking subprocess call to run a slave.py script from my main.py program. I need to pass args from main.py to slave.py once, when it (slave.py) is first started via subprocess.call; after that, slave.py runs for a period of time and then exits.
main.py
for insert, (list) in enumerate(list, start =1):
    sys.args = [list]
    subprocess.call(["python", "slave.py", sys.args], shell = True)
    {loop through program and do more stuff..}
And my slave script:
slave.py
print sys.args
while True:
    {do stuff with args in loop till finished}
    time.sleep(30)
Currently, slave.py blocks main.py from running the rest of its tasks. I simply want slave.py to be independent of main.py once I've passed args to it; the two scripts no longer need to communicate.
I've found a few posts on the net about non-blocking subprocess.call, but most of them center on communicating with slave.py at some point, which I currently do not need. Would anyone know how to implement this in a simple fashion...?
Solution
You should use subprocess.Popen instead of subprocess.call.
Something like:
subprocess.Popen(["python", "slave.py"] + sys.argv[1:])
From the docs on subprocess.call:
Run the command described by args. Wait for command to complete, then return the returncode attribute.
(Also don't use a list to pass in the arguments if you're going to use shell=True.)
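As a hedged sketch (not part of the original answer), the loop in main.py could launch each slave.py without blocking by switching to subprocess.Popen, dropping shell=True, and passing the argument as a plain string; the values list here is a hypothetical stand-in for the question's data:

import subprocess
import sys

values = ['a', 'b', 'c']  # hypothetical stand-in for the question's list
for index, value in enumerate(values, start=1):
    # Popen returns immediately, so main.py keeps running while slave.py works
    subprocess.Popen([sys.executable, 'slave.py', str(value)])
    # ... loop through program and do more stuff ...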
Here's an MCVE1 example that demonstrates a non-blocking subprocess call:
import subprocess
import time

p = subprocess.Popen(['sleep', '5'])
while p.poll() is None:
    print('Still sleeping')
    time.sleep(1)
print('Not sleeping any longer. Exited with returncode %d' % p.returncode)
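If main.py eventually does need to block for the result, Popen.wait() achieves the same thing without a polling loop; a minimal sketch:

import subprocess

p = subprocess.Popen(['sleep', '5'])
# ... do other, unrelated work here while the child runs ...
returncode = p.wait()  # only blocks when the result is finally needed
print('Exited with returncode %d' % returncode)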
An alternative approach, which relies on more recent changes to the Python language that allow for coroutine-based parallelism, is:
# python3.5 required but could be modified to work with python3.4.
import asyncio

async def do_subprocess():
    print('Subprocess sleeping')
    proc = await asyncio.create_subprocess_exec('sleep', '5')
    returncode = await proc.wait()
    print('Subprocess done sleeping. Return code = %d' % returncode)

async def sleep_report(number):
    for i in range(number + 1):
        print('Slept for %d seconds' % i)
        await asyncio.sleep(1)

loop = asyncio.get_event_loop()
tasks = [
    asyncio.ensure_future(do_subprocess()),
    asyncio.ensure_future(sleep_report(5)),
]
loop.run_until_complete(asyncio.gather(*tasks))
loop.close()
1 Tested on OS-X using python2.7 & python3.6
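For what it's worth, on Python 3.7 and later the manual event-loop setup above can be replaced by asyncio.run; a minimal sketch assuming the do_subprocess and sleep_report coroutines defined in the example above:

import asyncio

async def main():
    # run the subprocess coroutine and the sleep reporter concurrently
    await asyncio.gather(do_subprocess(), sleep_report(5))

asyncio.run(main())  # Python 3.7+: creates and closes the event loop for us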