Run multiple exec commands at once (but wait for the last one to finish)

2021-12-29 00:00:00 multithreading exec php zend-framework

I've looked around for this and I can't seem to find anyone who is trying to do exactly what I am.

I have information that is passed in to my function via a _POST request. Based on that data, I run an exec command to run a TCL script a certain number of times (with different parameters, based on the post variable). Right now, I have the exec in a foreach so this takes forever to run (the TCL script takes 15 or so seconds to come back, so if I need to run it 100 times, I have a bit of an issue). Here is my code:

    public function executeAction() {
        // code to parse the _POST variable into an array called $devices

        foreach ($devices as $devID => $device) {
            exec("../path/to/script.tcl -parameter1 ".$device['param1']." -parameter2 ".$device['param2'], $execout[$devID]);
        }
        print_r($execout);
    }

Obviously this code is just an excerpt with big chunks removed, but hopefully it's enough to demonstrate what I'm trying to do.

I need to run all of the execs at once and I need to wait for them all to complete before returning. I also need the output of all of the scripts stored in the array called $execout.

Any ideas?

Thanks!!!

Recommended answer

If you put your exec() call in a separate script, you can call that external script multiple times in parallel using curl_multi_exec(). That way, you'd make all the calls in separate requests, so they could execute simultaneously. Poll &$still_running to see when all requests have finished, after which you can collect the results from each.
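Applied to the original question, that separate script might look something like the sketch below. This is only an illustration, not code from the answer: the file name runDevice.php, the GET parameter names, and the use of escapeshellarg() are assumptions layered on top of the exec() call from the question.

    // runDevice.php  (hypothetical worker script, not part of the original post)

    <?php
    // Run the TCL script once for the device described by this request and
    // echo its output so the caller can collect it with curl_multi_getcontent().
    $param1 = escapeshellarg($_GET['param1']);
    $param2 = escapeshellarg($_GET['param2']);

    exec("../path/to/script.tcl -parameter1 $param1 -parameter2 $param2", $output);
    echo implode("\n", $output);

The main request would then add one cURL handle per device, passing $device['param1'] and $device['param2'] in the query string instead of the random time parameter used in the example below.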

Update: Here's a blog post detailing exactly what I'm describing.

Based on the blog post linked above, I put together the following example.

The script that is run in parallel:

    // waitAndDate.php

    <?php
    sleep((int)$_GET['time']);
    printf('%d secs; %s', $_GET['time'], shell_exec('date'));

The script that makes the parallel calls:

    // multiExec.php

    <?php
    $start = microtime(true);

    $mh = curl_multi_init();
    $handles = array();

    // create several requests
    for ($i = 0; $i < 5; $i++) {
        $ch = curl_init();

        $rand = rand(5,25); // just making up data to pass to script
        curl_setopt($ch, CURLOPT_URL, "http://domain/waitAndDate.php?time=$rand");
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30);

        curl_multi_add_handle($mh, $ch);
        $handles[] = $ch;
    }

    // execute requests and poll periodically until all have completed
    $isRunning = null;
    do {
        curl_multi_exec($mh, $isRunning);
        usleep(250000);
    } while ($isRunning > 0);

    // fetch output of each request
    $outputs = array();
    for ($i = 0; $i < count($handles); $i++) {
        $outputs[$i] = trim(curl_multi_getcontent($handles[$i]));
        curl_multi_remove_handle($mh, $handles[$i]);
    }

    curl_multi_close($mh);

    print_r($outputs);
    printf("Elapsed time: %.2f seconds\n", microtime(true) - $start);

Here is some output I received when running it a few times:

    Array
    (
        [0] => 8 secs; Mon Apr  2 19:01:33 UTC 2012
        [1] => 8 secs; Mon Apr  2 19:01:33 UTC 2012
        [2] => 18 secs; Mon Apr  2 19:01:43 UTC 2012
        [3] => 11 secs; Mon Apr  2 19:01:36 UTC 2012
        [4] => 8 secs; Mon Apr  2 19:01:33 UTC 2012
    )
    Elapsed time: 18.36 seconds

    Array
    (
        [0] => 22 secs; Mon Apr  2 19:02:33 UTC 2012
        [1] => 9 secs; Mon Apr  2 19:02:20 UTC 2012
        [2] => 8 secs; Mon Apr  2 19:02:19 UTC 2012
        [3] => 11 secs; Mon Apr  2 19:02:22 UTC 2012
        [4] => 7 secs; Mon Apr  2 19:02:18 UTC 2012
    )
    Elapsed time: 22.37 seconds

    Array
    (
        [0] => 5 secs; Mon Apr  2 19:02:40 UTC 2012
        [1] => 18 secs; Mon Apr  2 19:02:53 UTC 2012
        [2] => 7 secs; Mon Apr  2 19:02:42 UTC 2012
        [3] => 9 secs; Mon Apr  2 19:02:44 UTC 2012
        [4] => 9 secs; Mon Apr  2 19:02:44 UTC 2012
    )
    Elapsed time: 18.35 seconds
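
One small aside that is not part of the original answer: the polling loop above wakes up every 250 ms whether or not anything has happened. curl_multi_select() can instead block until one of the transfers has activity (or a timeout expires), which avoids the busy-wait. A minimal sketch of that variant of the loop, with everything else in multiExec.php unchanged:

    // variant of the polling loop (sketch, not from the original answer)
    $isRunning = null;
    do {
        $status = curl_multi_exec($mh, $isRunning);
        if ($isRunning) {
            // wait up to one second for activity on any handle instead of sleeping blindly
            curl_multi_select($mh, 1.0);
        }
    } while ($isRunning && $status === CURLM_OK);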

Hope that helps!

One side note: make sure your web server can process this many parallel requests. If it serves them sequentially or can only serve very few simultaneously, this approach gains you little or nothing. :-)
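
Finally, as an alternative the original answer does not cover: if routing every call through the web server is a concern, the same launch-everything-then-wait pattern can be done inside a single PHP process with proc_open(), which starts each TCL script as a child process without blocking. The sketch below is only an illustration; the script path and parameter names are taken from the question, and error handling is omitted:

    <?php
    // Sketch: start one TCL script per device with proc_open(), then collect
    // each script's stdout into $execout. $devices is the array parsed from
    // $_POST, as in the question.
    $descriptors = array(1 => array('pipe', 'w')); // capture the child's stdout

    $procs = array();
    foreach ($devices as $devID => $device) {
        $cmd = '../path/to/script.tcl'
             . ' -parameter1 ' . escapeshellarg($device['param1'])
             . ' -parameter2 ' . escapeshellarg($device['param2']);
        $pipes = array();
        $proc = proc_open($cmd, $descriptors, $pipes);
        $procs[$devID] = array('proc' => $proc, 'stdout' => $pipes[1]);
    }

    // all scripts are now running; read each one's output (this blocks until
    // that script finishes) and clean up
    $execout = array();
    foreach ($procs as $devID => $p) {
        $execout[$devID] = stream_get_contents($p['stdout']);
        fclose($p['stdout']);
        proc_close($p['proc']);
    }

    print_r($execout);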
