Preventing a timeout during a large request in PHP
I'm making a large request to the Brightcove servers to make a batch change of metadata in my videos. It seems like it only made it through 1000 iterations and then stopped - can anyone help adjust this code to prevent a timeout from happening? It needs to make about 7000-8000 iterations.
<?php
include 'echove.php';

$e = new Echove(
    'xxxxx',
    'xxxxx'
);

// Read video IDs
# Define our parameters
$params = array(
    'fields' => 'id,referenceId'
);

# Make our API call
$videos = $e->findAll('video', $params);
//print_r($videos);

foreach ($videos as $video) {
    //print_r($video);
    $ref_id = $video->referenceId;
    $vid_id = $video->id;

    switch ($ref_id) {
        case "":
            $metaData = array(
                'id' => $vid_id,
                'referenceId' => $vid_id
            );

            # Update the video with the new metadata
            $e->update('video', $metaData);
            echo "$vid_id updated successfully!<br />";
            break;
        default:
            echo "$ref_id was not updated. <br />";
            break;
    }
}
?>
Thanks!
Recommended answer
Try the set_time_limit() function. Calling set_time_limit(0)
will remove any time limits for execution of the script.
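For example, the call could go right at the top of the script, before the loop starts. This is a minimal sketch based on the code posted above (the Echove include, constructor, and placeholder credentials are copied from the question; only the set_time_limit() call is new):

<?php
// Lift PHP's execution time limit (the default max_execution_time is
// typically 30 seconds for web requests); an argument of 0 means the
// script may run for as long as it needs.
set_time_limit(0);

include 'echove.php';

$e = new Echove(
    'xxxxx',
    'xxxxx'
);

// ... the rest of the script (findAll + foreach loop) stays unchanged ...
?>

Note that set_time_limit() restarts the timeout counter each time it is called, so an alternative is to call something like set_time_limit(30) inside the loop, giving each iteration a fresh allowance instead of removing the limit entirely.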