PHP Warning: exec() Unable to Fork
So here is a little background info on my setup. I'm running CentOS with Apache and PHP 5.2.17. I have a website that lists products from many different retailers' websites, and I have crawler scripts that run to grab products from each website. Since every website is different, each crawler script had to be customized to crawl that particular retailer's website, so basically I have one crawler per retailer. At this time I have 21 crawlers constantly running to gather and refresh the products from these websites.
Each crawler is a PHP file; once the PHP script is done running, it checks to ensure it's the only instance of itself running, and at the very end of the script it uses exec to start itself all over again while the original instance closes (a rough sketch of this pattern is shown after the question). This helps protect against memory leaks, since each crawler restarts itself before it closes. However, recently I will check the crawler scripts and notice that one of them isn't running anymore, and in the error log I find the following.
PHP Warning: exec() [function.exec]: Unable to fork [nice -n 20 php -q /home/blahblah/crawler_script.php >/dev/null &]
This is what is supposed to start this particular crawler over again; however, since it was "unable to fork", it never restarted, and the original instance of the crawler ended like it normally does.
Obviously it's not a permission issue, because each of these 21 crawler scripts runs this exec command every 5 or 10 minutes at the end of its run, and most of the time it works as it should. This seems to happen maybe once or twice a day. It seems to be a limit of some sort, as I only started seeing this happen after I added my 21st crawler. And it's not always the same crawler that gets this error; it will be any one of them, at a random time, that is unable to fork its restart exec command.
Does anyone have an idea what could be causing PHP to be unable to fork, or maybe even a better way to handle these processes so as to get around the error altogether? Is there a process limit I should look into, or something of that nature? Thanks in advance for the help!
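For reference, a minimal sketch of the self-restarting crawler pattern described above might look like the following. The lock-file path and crawl_products() are hypothetical placeholders, not the asker's actual code.
<?php
// crawler_script.php -- hypothetical sketch of the "single instance + re-exec" pattern.

function crawl_products()
{
    // placeholder for the retailer-specific crawling logic
    sleep(1);
}

$lock = fopen('/tmp/crawler_script.lock', 'c');

// Make sure this is the only running instance of this crawler.
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit("another instance is already running\n");
}

crawl_products();

// Release the lock, then start a fresh copy of this script in the background
// while this instance exits.
flock($lock, LOCK_UN);
exec('nice -n 20 php -q ' . __FILE__ . ' >/dev/null &');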
Solution
Process limit
"Is there a process limit I should look into"
I suspect somebody (the system admin?) has set a limit on the maximum number of user processes. Could you try this?
$ ulimit -a
....
....
max user processes (-u) 16384
....
Run the preceding command from PHP. Something like:
echo system("ulimit -a");
I searched php.ini and httpd.conf to see whether this limit is set there, but I couldn't find it.
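As a follow-up, if the per-user process limit turns out to be the culprit, one possible safeguard (a sketch only; the margin of 50 is an arbitrary assumption) is to count the user's running processes before calling exec() and back off when the count gets close to the limit:
<?php
// Hypothetical guard: check how close the current user is to the process limit
// before forking another process.

$maxProcs = (int) shell_exec('ulimit -u');                         // "max user processes" for this user
$running  = (int) shell_exec('ps -u "$(whoami)" --no-headers | wc -l');

if ($maxProcs > 0 && $running >= $maxProcs - 50) {                 // arbitrary safety margin
    sleep(30);                                                     // wait instead of forking right away
}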
Error Handling
"even a better way to handle these processes as to get around the error all together?"
The third parameter of exec() returns the exit code of $cmd: 0 means success, a non-zero value is an error code. Refer to http://php.net/function.exec .
exec($cmd, $output, $ret_val);   // $output and $ret_val are already passed by reference; no & is needed
if ($ret_val != 0)
{
    // the command failed -- handle the error here (log it, retry, alert, ...)
}
else
{
    echo "success\n";
}
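Applied to the crawlers, one way to use that exit code (a sketch only; the retry count and sleep interval are arbitrary choices, and $cmd is the same restart command from the question) is to retry the restart a few times instead of letting the crawler die silently when the fork fails:
<?php
// Sketch: retry the self-restart a few times if exec() reports a failure.

$cmd = 'nice -n 20 php -q /home/blahblah/crawler_script.php >/dev/null &';

for ($attempt = 1; $attempt <= 5; $attempt++) {
    $ret_val = -1;                    // exec() may leave this untouched if it cannot fork
    exec($cmd, $output, $ret_val);
    if ($ret_val == 0) {
        echo "restart succeeded\n";
        break;
    }
    error_log("restart attempt $attempt failed (code $ret_val), retrying...");
    sleep(60);                        // give the system time to free up processes
}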