Is Making Asynchronous HTTP Requests Possible with PHP

How to make asynchronous HTTP requests in PHP

The answer I'd previously accepted didn't work. It still waited for responses. This does work though, taken from How do I make an asynchronous GET request in PHP?

function post_without_wait($url, $params)
{
    // Build the URL-encoded POST body.
    $post_params = array();
    foreach ($params as $key => &$val) {
        if (is_array($val)) $val = implode(',', $val);
        $post_params[] = $key . '=' . urlencode($val);
    }
    $post_string = implode('&', $post_params);

    $parts = parse_url($url);

    $fp = fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 30);
    if (!$fp) {
        return; // connection failed; nothing to send
    }

    $out  = "POST " . $parts['path'] . " HTTP/1.1\r\n";
    $out .= "Host: " . $parts['host'] . "\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: " . strlen($post_string) . "\r\n";
    $out .= "Connection: Close\r\n\r\n";
    $out .= $post_string;

    // Send the request and close immediately, without reading the response.
    fwrite($fp, $out);
    fclose($fp);
}

Is making asynchronous HTTP requests possible with PHP?

Yes.

There is the multirequest PHP library (or see: archived Google Code project). It's a library for running multiple cURL requests in parallel.

As another solution, you could write a script that does this in a language that supports threading, like Ruby or Python, and then call that script from PHP. It's rather simple.
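A minimal sketch of that hand-off: background the helper process and discard its output so PHP returns immediately (the `worker.py` script name is a hypothetical placeholder).

```php
<?php
// Fire-and-forget: background the helper with '&' and discard its output,
// so exec() returns immediately instead of waiting for completion.
function launch_background($command)
{
    exec($command . ' > /dev/null 2>&1 &');
}

// Hypothetical usage: hand the slow HTTP work to a helper script.
// launch_background('python worker.py http://www.example.com/');
```

Note this is Unix-shell-specific; on Windows the backgrounding syntax differs.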

PHP Asynchronous HTTP request with response

file_get_contents is a synchronous function, so when you call it 10 times, the calls run one after another. You need to make the requests in parallel, and the curl_multi_* family of functions is probably what you are looking for. If you are not familiar enough with curl, it can be tricky to implement correctly, so I'd recommend using a library for it.

You can take a look at this library: https://github.com/petewarden/ParallelCurl
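If you'd rather not pull in a library, here is a minimal sketch of the curl_multi_* pattern such libraries wrap (the timeout value is illustrative):

```php
<?php
// Fetch several URLs in parallel with curl_multi_*.
// Returns an array of response bodies keyed by URL.
function fetch_parallel(array $urls)
{
    $mh = curl_multi_init();
    $handles = array();

    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture body instead of printing it
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Drive all transfers until none are still running.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh); // block until there is socket activity
        }
    } while ($running && $status == CURLM_OK);

    $results = array();
    foreach ($handles as $url => $ch) {
        $results[$url] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $results;
}
```

The whole batch still takes as long as the slowest transfer, but the transfers overlap instead of running back to back.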

How do I make an asynchronous GET request in PHP?

file_get_contents will do what you want:

$output = file_get_contents('http://www.example.com/');
echo $output;
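Note that this call is still synchronous; at most you can bound how long it blocks and set request headers via a stream context (the User-Agent value here is illustrative):

```php
<?php
// file_get_contents blocks until the response arrives (or an error occurs);
// a stream context at least lets you cap the wait and add headers.
$context = stream_context_create(array(
    'http' => array(
        'method'  => 'GET',
        'timeout' => 5, // seconds before giving up on the response
        'header'  => "User-Agent: example-client\r\n",
    ),
));

// $output = file_get_contents('http://www.example.com/', false, $context);
```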

Edit: One way to fire off a GET request and return immediately.

Quoted from http://petewarden.typepad.com/searchbrowser/2008/06/how-to-post-an.html

function curl_post_async($url, $params)
{
    // Build the URL-encoded POST body.
    $post_params = array();
    foreach ($params as $key => &$val) {
        if (is_array($val)) $val = implode(',', $val);
        $post_params[] = $key . '=' . urlencode($val);
    }
    $post_string = implode('&', $post_params);

    $parts = parse_url($url);

    $fp = fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 30);
    if (!$fp) {
        return; // connection failed; nothing to send
    }

    $out  = "POST " . $parts['path'] . " HTTP/1.1\r\n";
    $out .= "Host: " . $parts['host'] . "\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: " . strlen($post_string) . "\r\n";
    $out .= "Connection: Close\r\n\r\n";
    $out .= $post_string;

    // Send the request and close immediately, without reading the response.
    fwrite($fp, $out);
    fclose($fp);
}

What this does is open a socket, fire off a POST request, and immediately close the socket and return, without waiting for the response.

Asynchronous HTTP request in PHP with delegate

PHP doesn't really support async callbacks, since there are no interrupts in the language. Your best bet is to use curl and then check back on the request to see if it has finished. You can also use fsockopen and friends to do it at the socket level.

Check out this post. You'll need to tweak it to save the result socket (and process it), but the basic idea is there.
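As a sketch of that "check back on the request" idea using curl_multi (single URL and timeout are illustrative; there is no true delegate, just polling):

```php
<?php
// Start a transfer without blocking, do other work, then poll for completion.
$mh = curl_multi_init();
$ch = curl_init('http://www.example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 5);
curl_multi_add_handle($mh, $ch);

// Kick the transfer off; this call returns immediately.
curl_multi_exec($mh, $running);

// ... do other work here ...

// "Check back": poll until the transfer has finished.
while ($running) {
    curl_multi_select($mh, 0.1); // wait up to 100 ms for socket activity
    curl_multi_exec($mh, $running);
}

$body = curl_multi_getcontent($ch); // response body (empty string on failure)
curl_multi_remove_handle($mh, $ch);
curl_multi_close($mh);
```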

Asynchronous HTTP server side browser request

Below is a tested asynchronous PHP request.

It can be run from the same server or any server.

Accuracy should be within a millisecond.

Reducing usleep(1000) will increase accuracy but decrease CPU availability.

Before downvoting my answer, please make sure you understand what I have done.

In my testing I monitored the CPU usage. Typical usage was 0.1% CPU, with a maximum of 0.2%.

In my script below this is the HTTP Request Header:

$http = "GET $path HTTP/1.0\r\nHost: $host\r\n\r\n";

It can be edited to include anything you need in the header.

To add User Agent:

$ua = 'User-Agent: Mozilla/5.0 (Windows NT 5.1; rv:48.0) Gecko/20100101 Firefox/48.0';

Then insert $ua followed by a CRLF:

$http = "GET $path HTTP/1.0\r\nHost: $host\r\n$ua\r\n\r\n";

Test Scripts

This PHP script is http://ispeedlink.com/so/asyncTimer/index.php

It makes an HTTP request to http://ispeedlink.com/so/asyncTimer/log.php "exactly" every second

<?php
$host = 'ispeedlink.com';
$path = '/so/asyncTimer/log.php';
$http = "GET $path HTTP/1.0\r\nHost: $host\r\n\r\n";
$cntr = 0;
while (true) {
    if (((microtime(true) * 1000) % 10) < 1) { // make http request as soon as system clock passes one second + 10mS
        $stream = stream_socket_client("$host:80", $errno, $errstr, 100,
            STREAM_CLIENT_ASYNC_CONNECT | STREAM_CLIENT_CONNECT);
        if ($stream) {
            fwrite($stream, $http);
        }
        else {
            file_put_contents('error.txt', "Failed: $errno,$errstr\n", FILE_APPEND);
        }
        usleep(990000); // wait 990 milliseconds so microtime() * 1000 % 10 will be greater than 10 until next second
        $cntr++;
        if ($cntr > 9) { break; }
    }
    usleep(1000); // release CPU for at least a millisecond.
}

header('Content-Type: text/plain; charset=utf-8');
readfile('log.txt');
file_put_contents('log.txt', "\n", FILE_APPEND);
?>

For this test I added a counter to allow the script to make just 10 requests:

$cntr++;
if($cntr > 9){break;}

This is log.php which is requested from index.php every second.

<?php
$time = microtime(true) . "\n";
file_put_contents('log.txt',$time,FILE_APPEND);
?>

In my Browser I made a request to http://ispeedlink.com/so/asyncTimer/
two times with a pause in between.

Here is the log.txt

1462802817.32
1462802818.32
1462802819.32
1462802820.32
1462802821.32
1462802822.32
1462802823.32
1462802824.32
1462802825.32
1462802826.33

1462802836.72
1462802837.71
1462802838.72
1462802839.72
1462802840.72
1462802841.72
1462802842.72
1462802843.72
1462802844.72
1462802845.71

If you need to see the response from each request:

Add this line of code after the request:

$sockets[] = $stream; 

Which will then look like this:

if ($stream) {
fwrite($stream, $http);
$sockets[] = $stream;
}

And add this routine:

The response for each request will be in the $response array.

$response = array();
$buffer_size = 8192; // bytes to read per fread() call (not defined in the original)

while (count($sockets)) {
    $read = $sockets;
    $write = NULL;
    $except = NULL; // stream_select() takes its arguments by reference
    stream_select($read, $write, $except, 100);
    if (count($read)) {
        foreach ($read as $r) {
            $id = array_search($r, $sockets);
            $data = fread($r, $buffer_size);
            if (strlen($data) == 0) { // remote side closed: request complete
                fclose($r);
                unset($sockets[$id]);
            }
            else {
                if (!isset($response[$id])) { $response[$id] = ''; }
                $response[$id] .= $data;
            }
        }
    }
    else {
        break;
    }
}

CPU usage stats:

%CPU %MEM    VSZ   RSS TTY      STAT START   TIME COMMAND
0.2 0.0 233524 11236 ? S 11:10 0:00 /usr/bin/php /home/usr/public_html/so/asyncTimer/index.php
0.1 0.0 233524 11236 ? S 11:10 0:00 /usr/bin/php /home/usr/public_html/so/asyncTimer/index.php
0.1 0.0 233524 11236 ? S 11:10 0:00 /usr/bin/php /home/usr/public_html/so/asyncTimer/index.php
0.1 0.0 233524 11236 ? S 11:10 0:00 /usr/bin/php /home/usr/public_html/so/asyncTimer/index.php
0.1 0.0 233524 11236 ? S 11:10 0:00 /usr/bin/php /home/usr/public_html/so/asyncTimer/index.php
0.1 0.0 233524 11236 ? S 11:10 0:00 /usr/bin/php /home/usr/public_html/so/asyncTimer/index.php
0.1 0.0 233524 11236 ? S 11:10 0:00 /usr/bin/php /home/usr/public_html/so/asyncTimer/index.php
0.1 0.0 233524 11236 ? S 11:10 0:00 /usr/bin/php /home/usr/public_html/so/asyncTimer/index.php
0.1 0.0 233524 11236 ? S 11:10 0:00 /usr/bin/php /home/usr/public_html/so/asyncTimer/index.php
0.1 0.0 233524 11236 ? S 11:10 0:00 /usr/bin/php /home/usr/public_html/so/asyncTimer/index.php

Asynchronous/parallel HTTP requests using PHP curl_multi

You can't do multi-threading in PHP, so you won't be able to start processing one page while the others are still being retrieved. Multi-curl won't return control until all pages are retrieved or time out, so the batch takes as long as the slowest page. Still, you are going from serial (curl) to parallel (multi_curl), which will give you a big boost.

Servers will serve multiple pages to the same client up to a certain configured limit. Requesting 5-10 pages from a server would be fine.


