cURL Multi Threading with PHP
Question
I'm using cURL to get some rank data for over 20,000 domain names that I've got stored in a database.
The code I'm using is http://semlabs.co.uk/journal/object-oriented-curl-class-with-multi-threading.
The array $competeRequests contains 20,000 requests to the compete.com API for website ranks.
This is an example request: http://apps.compete.com/sites/stackoverflow.com/trended/rank/?apikey=xxxx&start_date=201207&end_date=201208&jsonp=
Since there are 20,000 of these requests I want to break them up into chunks so I'm using the following code to accomplish that:
foreach (array_chunk($competeRequests, 1000) as $requests) {
    foreach ($requests as $request) {
        $curl->addSession($request, $opts);
    }
}
This works great for sending the requests in batches of 1,000; however, the script takes too long to execute, and I've already increased max_execution_time to over 10 minutes.
Is there a way to send 1,000 requests from my array, parse the results, output a status update, and then continue with the next 1,000 until the array is empty? As it stands, the screen just stays white the entire time the script is executing, which can be over 10 minutes.
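One way to get the chunk-by-chunk status updates described above is to skip the SEM Labs class and drive each batch with PHP's built-in curl_multi API directly. The sketch below is illustrative, not the questioner's actual code: the function name, the 30-second timeout, and the status line are assumptions, and $competeRequests stands in for the 20,000-URL array from the question.

```php
<?php
// Sketch: process URLs in chunks of 1,000 with PHP's curl_multi functions,
// printing a progress line after each chunk so the page isn't blank.
function fetchInChunks(array $urls, int $chunkSize = 1000): array
{
    $results = [];
    $done = 0;
    foreach (array_chunk($urls, $chunkSize) as $chunk) {
        $mh = curl_multi_init();
        $handles = [];
        foreach ($chunk as $url) {
            $ch = curl_init($url);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($ch, CURLOPT_TIMEOUT, 30); // illustrative timeout
            curl_multi_add_handle($mh, $ch);
            $handles[$url] = $ch;
        }
        // Drive all transfers in this chunk to completion.
        do {
            $status = curl_multi_exec($mh, $active);
            if ($active) {
                curl_multi_select($mh); // wait for activity instead of busy-looping
            }
        } while ($active && $status === CURLM_OK);
        // Collect responses, then tear the handles down.
        foreach ($handles as $url => $ch) {
            $results[$url] = curl_multi_getcontent($ch);
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
        }
        curl_multi_close($mh);
        $done += count($chunk);
        // Status update between chunks; parse $results here if desired.
        echo "Processed $done of " . count($urls) . " requests\n";
        flush();
    }
    return $results;
}
```

Depending on your output-buffering settings you may need ob_flush() in addition to flush() for the status lines to reach the browser mid-run, and set_time_limit(0) inside the loop is an alternative to raising max_execution_time globally.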
This one always does the job for me... https://github.com/petewarden/ParallelCurl
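Going by the ParallelCurl project's README, usage looks roughly like the sketch below. The callback body, the 100-connection cap, and the include path are illustrative assumptions, not code from the answer; the library throttles itself, so no manual chunking is needed.

```php
<?php
require_once 'parallelcurl.php'; // from the ParallelCurl repository

// Called once per completed request; this signature follows the README.
function on_request_done($content, $url, $ch, $user_data)
{
    if (curl_getinfo($ch, CURLINFO_HTTP_CODE) !== 200) {
        echo "Fetch failed for $url\n";
        return;
    }
    $rank = json_decode($content); // parse the compete.com response here
    // ... store $rank for $url ...
}

// Cap concurrency at 100 simultaneous connections (illustrative value).
$parallel_curl = new ParallelCurl(100, [CURLOPT_RETURNTRANSFER => true]);

foreach ($competeRequests as $request) {
    // startRequest blocks once the cap is reached, so the whole
    // 20,000-URL array can be queued in a single loop.
    $parallel_curl->startRequest($request, 'on_request_done');
}
$parallel_curl->finishAllRequests(); // wait for the stragglers
```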