What sets the maximum number of cURL connections?


Question

I have a script which runs 1000 cURL requests using curl_multi_* functions in PHP.

What is the bottleneck behind them timing out?

Would it be CPU usage? Is there a more efficient way to do this, in terms of how the server handles that number of outbound connections?

I cannot change the functionality, and the requests themselves are simple calls to a remote API. I am just wondering what the limit is: would I need to increase memory on the server, or Apache connections, or CPU? (Or something else I have missed?)

Answer

Your requests are made in a single thread of execution. The bottleneck is almost certainly the CPU. Have you ever actually watched curl_multi code run? It is incredibly CPU-hungry, because you do not really have enough control over how the requests are dealt with. curl_multi makes it possible to orchestrate 1000 requests at once, but that does not make it a good idea. You have almost no chance of using curl_multi efficiently, because you cannot control the flow of execution finely enough; just servicing the sockets and select()'ing on them will account for a lot of the high CPU usage you would see watching your code run on the command line.
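For reference, a minimal sketch of the curl_multi service loop being described; the URLs and timeout values here are placeholders, but the exec/select cycle is the part that keeps the CPU busy:

<?php
// Minimal curl_multi loop: the exec/select cycle below is the
// socket-servicing work that shows up as high CPU usage.
$urls = array('https://api.example.com/a', 'https://api.example.com/b'); // placeholders

$mh = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 30); // per-request timeout
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

do {
    $status = curl_multi_exec($mh, $running); // service every active handle
    if ($running) {
        curl_multi_select($mh, 1.0); // block until some socket is ready
    }
} while ($running && $status == CURLM_OK);

foreach ($handles as $ch) {
    $body = curl_multi_getcontent($ch); // response body, or null on failure
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);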

The reason CPU usage is high during such tasks is this: PHP is designed to run for a fraction of a second and to do everything as fast as it can. How the CPU is utilized usually does not matter, because the work lasts such a short space of time. When you prolong a task like this, the problem becomes more apparent: the overhead incurred by every opcode becomes visible to the programmer.

I am aware you have said you cannot change the implementation, but still, for a complete answer: such a task is far more suitable for threading than for curl_multi, and you should start reading http://php.net/pthreads, beginning with http://php.net/Thread.

Left to their own devices on an idle CPU, even 1000 threads would consume as much CPU as curl_multi. The point is that you can precisely control the code responsible for downloading every byte of the response and uploading every byte of the request, and if CPU usage is a concern you can implement a "nice" process by explicitly calling usleep, or by limiting connection usage in a meaningful way. Additionally, your requests can be serviced in separate threads.
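As a rough illustration (not the asker's code), a minimal pthreads sketch of one thread owning one request, assuming a ZTS build of PHP with the pthreads extension loaded; the URL and the usleep interval are made up for the example:

<?php
// One request owned end-to-end by one thread; the thread can
// throttle itself explicitly, which curl_multi does not allow.
class RequestThread extends Thread {
    public $url;
    public $response;

    public function __construct($url) {
        $this->url = $url;
    }

    public function run() {
        usleep(10000); // be "nice": back off before hitting the API (tunable)
        $ch = curl_init($this->url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $this->response = curl_exec($ch);
        curl_close($ch);
    }
}

$thread = new RequestThread('https://api.example.com/item/1'); // placeholder URL
$thread->start();
$thread->join();
printf("got %d bytes\n", strlen($thread->response));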

I do not suggest that 1000 threads is the thing to do; it more than likely is not. The thing to do would be to design a Stackable (see the documentation) whose job is to make and service a request in a "nice", efficient way, and to design pools of workers (see the examples in the pecl extension sources on GitHub) to execute your newly designed requests, as sketched below ...
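A sketch of that design under the pthreads v2 API, where Stackable exists (later versions renamed the base class to Threaded); the pool size, URLs, and sleep interval are illustrative assumptions:

<?php
// A unit of work that makes and services one request "nicely".
class ApiRequest extends Stackable {
    public $url;
    public $response;

    public function __construct($url) {
        $this->url = $url;
    }

    public function run() {
        $ch = curl_init($this->url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $this->response = curl_exec($ch);
        curl_close($ch);
        usleep(5000); // polite pause between jobs (tunable)
    }
}

// Spread 1000 requests over a small pool of worker threads
// instead of opening 1000 connections at once.
$size = 8;
$workers = array();
for ($i = 0; $i < $size; $i++) {
    $worker = new Worker();
    $worker->start();
    $workers[] = $worker;
}

$jobs = array(); // keep references so jobs are not collected early
for ($i = 0; $i < 1000; $i++) {
    $job = new ApiRequest("https://api.example.com/item/$i"); // placeholder
    $workers[$i % $size]->stack($job);
    $jobs[] = $job;
}

foreach ($workers as $worker) {
    $worker->shutdown(); // waits for that worker's stack to drain
}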
