Limiting Bandwidth of Download with cURL
CURLOPT_MAX_RECV_SPEED_LARGE is the option you want.
It was added in curl 7.15.5 and has been available in PHP's cURL extension since PHP 5.4.0.
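In PHP it is set like any other cURL option; here is a minimal sketch, where the URL and the 100 KB/s cap are just example values:

```php
<?php
// Cap the download rate at roughly 100 KB/s; the value is in bytes per second.
$ch = curl_init('https://example.com/big-file.zip'); // hypothetical URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$ok = curl_setopt($ch, CURLOPT_MAX_RECV_SPEED_LARGE, 100 * 1024);
// $data = curl_exec($ch); // the transfer now runs rate-limited
curl_close($ch);
```

The command-line equivalent is curl's --limit-rate option, e.g. curl --limit-rate 100k <url>.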
Do I need to set a download speed limit for concurrent downloads (using cURL), or will they take equal shares?
The rate limit curl supports is set per "handle", which makes it per single transfer.
Networks in general are designed to handle many connections sharing a tight shared resource, so in most situations you won't need to limit any transfer rates at all.
Is there a recommended limit to the number of parallel cURL operations?
Running 1000 cURL requests in parallel requires serious bandwidth. The recommended number of parallel requests is best found by debugging: add a check for timed-out connections, and when they occur, decrease or increase the concurrency limit (or the timeout period) accordingly.
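One way to apply that limit is to cap concurrency explicitly rather than firing every request at once. A rough sketch using curl_multi with a rolling window — the window size of 10 and the 30-second timeout are arbitrary starting points to tune, not recommendations from the answer above:

```php
<?php
// Fetch $urls with at most $maxParallel transfers in flight at a time.
// Results are keyed by URL, so duplicate URLs would overwrite each other
// in this sketch.
function fetch_all(array $urls, int $maxParallel = 10): array
{
    $mh       = curl_multi_init();
    $queue    = $urls;
    $results  = [];
    $inFlight = 0;

    $add = function () use (&$queue, &$inFlight, $mh) {
        $ch = curl_init(array_shift($queue));
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30); // drop stuck transfers
        curl_multi_add_handle($mh, $ch);
        $inFlight++;
    };

    // Prime the window, then top it up as transfers finish.
    while ($queue && $inFlight < $maxParallel) {
        $add();
    }

    do {
        curl_multi_exec($mh, $running);
        if ($running && curl_multi_select($mh, 1.0) === -1) {
            usleep(100000); // select failed; brief pause to avoid busy-looping
        }
        while ($info = curl_multi_info_read($mh)) {
            $ch = $info['handle'];
            $results[curl_getinfo($ch, CURLINFO_EFFECTIVE_URL)] =
                curl_multi_getcontent($ch);
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
            $inFlight--;
            if ($queue) {
                $add(); // refill the window
            }
        }
    } while ($running || $inFlight > 0);

    curl_multi_close($mh);
    return $results;
}
```

To tune the limit as suggested above, you would check each finished handle with curl_errno() for CURLE_OPERATION_TIMEDOUT and shrink or grow $maxParallel between batches.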
How to speed up cURL in php?
With respect to environment, I've observed in PHP that cURL typically runs very fast in most environments, except where there is a slow CPU combined with poor network performance. For example, on localhost on my MAMP installation curl is fast, and on a larger Amazon instance curl is fast. But on a small, low-end host I've seen it have performance issues, where it is noticeably slower to connect. I'm not sure exactly why that is, but it sure wasn't 5 seconds slower.
To help determine whether it's PHP or your environment, try interacting with curl via the command line. At least then you'll be able to rule out your PHP code as the problem if it's still 5 seconds slower.
How do I measure request and response times at once using cURL?
From this brilliant blog post... https://blog.josephscott.org/2011/10/14/timing-details-with-curl/
cURL supports formatted output for the details of the request (see the cURL man page for details, under -w, --write-out <format>). For our purposes we'll focus just on the timing details that are provided. Times below are in seconds.
Create a new file, curl-format.txt, and paste in:
time_namelookup: %{time_namelookup}s\n
time_connect: %{time_connect}s\n
time_appconnect: %{time_appconnect}s\n
time_pretransfer: %{time_pretransfer}s\n
time_redirect: %{time_redirect}s\n
time_starttransfer: %{time_starttransfer}s\n
----------\n
time_total: %{time_total}s\n
Make a request:
curl -w "@curl-format.txt" -o /dev/null -s "http://wordpress.com/"
Or on Windows, it's...
curl -w "@curl-format.txt" -o NUL -s "http://wordpress.com/"
What this does:
-w "@curl-format.txt" tells cURL to use our format file
-o /dev/null redirects the output of the request to /dev/null
-s tells cURL not to show a progress meter
"http://wordpress.com/" is the URL we are requesting. Use quotes, particularly if your URL has "&" query string parameters
And here is what you get back:
time_namelookup: 0.001s
time_connect: 0.037s
time_appconnect: 0.000s
time_pretransfer: 0.037s
time_redirect: 0.000s
time_starttransfer: 0.092s
----------
time_total: 0.164s
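If you are timing requests from PHP rather than the shell, curl_getinfo() exposes the same breakdown after the transfer; the field names differ slightly from the -w variables, and the URL below is the same example one used above:

```php
<?php
$ch = curl_init('http://wordpress.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10); // don't hang forever on a bad network
curl_exec($ch);
$info = curl_getinfo($ch);
curl_close($ch);

// The same quantities as the -w format file, in seconds.
printf("time_namelookup:    %fs\n", $info['namelookup_time']);
printf("time_connect:       %fs\n", $info['connect_time']);
printf("time_pretransfer:   %fs\n", $info['pretransfer_time']);
printf("time_redirect:      %fs\n", $info['redirect_time']);
printf("time_starttransfer: %fs\n", $info['starttransfer_time']);
printf("time_total:         %fs\n", $info['total_time']);
```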
I have not yet seen an option to output the results in microseconds.
Make a Linux/Mac shortcut (alias)
alias curltime="curl -w \"@$HOME/.curl-format.txt\" -o /dev/null -s "
Then you can simply call...
curltime wordpress.org
Thanks to commenter Pete Doyle!
Make a Linux/Mac stand-alone script
This script does not require a separate .txt file to contain the formatting.
Create a new file, curltime, somewhere in your executable path, and paste in:
#!/bin/bash
curl -w @- -o /dev/null -s "$@" <<'EOF'
time_namelookup: %{time_namelookup}\n
time_connect: %{time_connect}\n
time_appconnect: %{time_appconnect}\n
time_pretransfer: %{time_pretransfer}\n
time_redirect: %{time_redirect}\n
time_starttransfer: %{time_starttransfer}\n
----------\n
time_total: %{time_total}\n
EOF
Then call it the same way as the alias:
curltime wordpress.org
Make a Windows shortcut (aka BAT file)
Create a new text file called curltime.bat in the same folder as curl.exe and curl-format.txt, and paste in the following line:
curl -w "@%~dp0curl-format.txt" -o NUL -s %*
Then from the command line you can simply call:
curltime wordpress.org
(Make sure the folder is listed in your Windows PATH
variable to be able to use the command from any folder.)
PHP kill ongoing multi curl requests
Well, you should be able to cancel them at will with CURLOPT_PROGRESSFUNCTION: keep a global variable for whether or not to cancel the transfers, import it in the callback (with the global $var syntax), and make the callback return 1 when it's time to cancel, e.g.
$abort = false;
curl_setopt($ch, CURLOPT_PROGRESSFUNCTION, function ($a, $b, $c, $d, $e) { global $abort; return (int) $abort; });
Then just set $abort = true; when it's time to abort them. That said, you can use CURLOPT_MAX_RECV_SPEED_LARGE to limit the speed of the transfers, if it's a speed rate limit you're exceeding.
Edit: note that you also need to set CURLOPT_NOPROGRESS to false for CURLOPT_PROGRESSFUNCTION to be called at all.
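Putting those pieces together, here is a minimal sketch of an abortable handle. It uses a by-reference closure instead of a global variable, and the URL is hypothetical:

```php
<?php
// Shared abort flag plus a progress callback: any nonzero return value
// from the callback tells libcurl to abort the transfer.
$abort = false;
$progress = function ($ch, $dlTotal, $dlNow, $ulTotal, $ulNow) use (&$abort) {
    return $abort ? 1 : 0; // 1 = abort, 0 = keep going
};

$ch = curl_init('https://example.com/big-file.zip'); // hypothetical URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_NOPROGRESS, false); // required, or the callback never fires
curl_setopt($ch, CURLOPT_PROGRESSFUNCTION, $progress);
// curl_exec($ch); // elsewhere (another callback, a signal handler, ...)
//                 // set $abort = true to stop the transfer
curl_close($ch);
```

When $abort is flipped to true, curl_exec() returns early and curl_errno() reports CURLE_ABORTED_BY_CALLBACK.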