Asynchronous Http Requests in PHP

How to make asynchronous HTTP requests in PHP

The answer I'd previously accepted didn't work; it still waited for the response. This one does work, taken from How do I make an asynchronous GET request in PHP?

function post_without_wait($url, $params)
{
    $post_params = [];
    foreach ($params as $key => &$val) {
        if (is_array($val)) $val = implode(',', $val);
        $post_params[] = $key . '=' . urlencode($val);
    }
    unset($val);
    $post_string = implode('&', $post_params);

    $parts = parse_url($url);

    $fp = fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 30);
    if ($fp === false) {
        return; // connection failed; nothing to send
    }

    $out  = "POST " . (isset($parts['path']) ? $parts['path'] : '/') . " HTTP/1.1\r\n";
    $out .= "Host: " . $parts['host'] . "\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: " . strlen($post_string) . "\r\n";
    $out .= "Connection: Close\r\n\r\n";
    $out .= $post_string;

    fwrite($fp, $out);
    fclose($fp);
}
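As an aside, the manual encoding loop above can be replaced with PHP's built-in http_build_query(), which produces the same application/x-www-form-urlencoded string for scalar values (note that it encodes array values as nested keys, not comma-joined as the loop above does):

```php
<?php
// http_build_query() handles the urlencoding and the '&' joining for you.
$params = ['name' => 'bob', 'city' => 'new york'];
$post_string = http_build_query($params);
echo $post_string; // name=bob&city=new+york
```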

How do I make an asynchronous GET request in PHP?

file_get_contents will do what you want

$output = file_get_contents('http://www.example.com/');
echo $output;

Edit: One way to fire off a GET request and return immediately.

Quoted from http://petewarden.typepad.com/searchbrowser/2008/06/how-to-post-an.html

// Note: despite the name, this uses fsockopen rather than curl.
function curl_post_async($url, $params)
{
    $post_params = [];
    foreach ($params as $key => &$val) {
        if (is_array($val)) $val = implode(',', $val);
        $post_params[] = $key . '=' . urlencode($val);
    }
    unset($val);
    $post_string = implode('&', $post_params);

    $parts = parse_url($url);

    $fp = fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 30);
    if ($fp === false) {
        return; // connection failed; nothing to send
    }

    $out  = "POST " . (isset($parts['path']) ? $parts['path'] : '/') . " HTTP/1.1\r\n";
    $out .= "Host: " . $parts['host'] . "\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: " . strlen($post_string) . "\r\n";
    $out .= "Connection: Close\r\n\r\n";
    $out .= $post_string;

    fwrite($fp, $out);
    fclose($fp);
}

What this does is open a socket, fire off a POST request, and immediately close the socket and return, without waiting for the response.
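A different fire-and-forget trick is to shell out to the curl command-line tool and background the process, so PHP returns immediately. This is a sketch, not part of the original answers; it assumes a POSIX shell and that curl is on the PATH:

```php
<?php
// Fire-and-forget via the curl CLI, backgrounded with '&' so the
// PHP script does not wait for the response. Output is discarded.
function curl_post_background(string $url, array $params): void
{
    $query = http_build_query($params);
    $cmd = sprintf(
        'curl -s -o /dev/null -d %s %s > /dev/null 2>&1 &',
        escapeshellarg($query),
        escapeshellarg($url)
    );
    exec($cmd);
}
```

The cost is one process spawn per request, so this suits occasional calls (logging, webhooks) rather than high-volume traffic.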

PHP Asynchronous HTTP request with response

file_get_contents is a synchronous function, so calling it 10 times means making 10 requests sequentially. You need to make the requests in parallel, and the curl_multi_* family of functions is probably what you are looking for. If you are not familiar enough with curl, it may be tricky to implement correctly, so I'd recommend using a library for that.

You can take a look at this library https://github.com/petewarden/ParallelCurl
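For reference, here is a minimal curl_multi_* sketch of the parallel-fetch idea (error handling and per-handle options are omitted for brevity; a library wraps all of this for you):

```php
<?php
// Fetch several URLs concurrently with curl's multi interface.
function fetch_parallel(array $urls): array
{
    $mh = curl_multi_init();
    $handles = [];
    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }

    // Drive all transfers until every handle has finished.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            // Wait for activity instead of busy-looping.
            if (curl_multi_select($mh) === -1) {
                usleep(1000);
            }
        }
    } while ($active && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $key => $ch) {
        $results[$key] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
```

With this, 10 one-second requests complete in roughly one second of wall-clock time instead of ten.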

Async http call with php

I am going to use ReactPHP, as I know that library better, but similar libraries work the same way.

EDIT: updated, see comments

This will read in a file and, once the contents arrive, create an API call and send the data off.

<?php

require_once __DIR__ . '/vendor/autoload.php';

function async_send($config, $file, callable $processor)
{
    $config['ssl'] = true === $config['ssl'] ? 's' : '';
    $client = new \GuzzleHttp\Client([
        'base_uri'    => 'http' . $config['ssl'] . '://' . $config['domain'] . '/rest/all/V1/',
        'verify'      => false,
        'http_errors' => false
    ]);
    $loop = \React\EventLoop\Factory::create();
    $filesystem = \React\Filesystem\Filesystem::create($loop);
    $filesystem->getContents($file)->then(function ($contents) use ($config, $processor, $client) {
        $contents = $processor($contents);
        $client->post($config['uri'], ['body' => $contents]);
    });
    $loop->run(); // nothing happens until the event loop runs
}

$config = [
    'domain' => 'example.com',
    'ssl'    => true
];
// somewhere later
$config['uri'] = 'products';
async_send($config, __DIR__ . '/my.csv', function ($contents) {
    return json_encode($contents);
});

Is making asynchronous HTTP requests possible with PHP?

Yes.

There is the multirequest PHP library (or see the archived Google Code project). It's a multi-threaded cURL library.

As another solution, you could write a script that does that in a language that supports threading, like Ruby or Python. Then, just call the script with PHP. Seems rather simple.

Does PHP 7 handle requests asynchronously by default?

You have misunderstood what is being discussed regarding blocking I/O.

Each time you run a PHP script from the command-line, it executes in a completely separate process, which doesn't know about any other PHP script running on the machine. You can execute it multiple times for exactly the same reason you can run a web browser and a text editor at the same time - the OS is scheduling the processes on one or more processor cores.

In the web server example, it's slightly more complex, but the same principle applies: each request you make to the web server creates either a new process, or a new thread within the process, and the PHP script runs inside that.

What people are discussing regarding blocking I/O is something completely different: when you access something external within your PHP code, such as fetching web content with an HTTP request. Imagine this loop, using an imaginary HTTP library:

foreach ($urls as $url) {
    $results[] = $httpClient->fetchUrl($url);
}

If written using the built-in PHP functionality, this will stop running and wait for the remote server to respond each time around the loop. So for 10 requests each taking a second, it will take 10 seconds to complete the loop.
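To make the cost of blocking concrete, here is a pure-PHP simulation of that loop (no real HTTP involved; usleep() stands in for network latency):

```php
<?php
// Three simulated "requests" of 0.1s each, called sequentially.
function fake_fetch(string $url): string
{
    usleep(100000); // stand-in for a network round-trip
    return "response for $url";
}

$start = microtime(true);
$results = [];
foreach (['a', 'b', 'c'] as $url) {
    $results[] = fake_fetch($url); // blocks until this "request" finishes
}
$elapsed = microtime(true) - $start;
// $elapsed is ~0.3s: the total time is the sum of the individual delays.
```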

"Non-blocking I/O" means implementing functions like fetchUrl so that they return immediately, even though the HTTP response hasn't come back. That way, you can run lots of HTTP requests at once: in the example loop, you would get all 10 responses back after 1 second. You'd then have some way to get the results, such as a "promise".

It's possible to implement this in PHP, but there are no native functions which do it by default and in a user-friendly way.
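As an illustration of what such an implementation involves, here is a hand-rolled sketch using non-blocking stream sockets and stream_select() (hostnames are placeholders; HTTPS, redirects, and real HTTP parsing are all omitted, which is exactly why a library such as ReactPHP, Amp, or Guzzle is the sane choice in practice):

```php
<?php
// Open all connections up front, then let stream_select() tell us
// which socket is ready, so no single slow server blocks the others.
function fetch_concurrently(array $hosts, int $timeout = 5): array
{
    $pending = [];   // sockets still waiting to send their request
    $reading = [];   // sockets waiting for response data
    $responses = [];

    foreach ($hosts as $key => $host) {
        $fp = @stream_socket_client(
            "tcp://$host:80", $errno, $errstr, $timeout,
            STREAM_CLIENT_CONNECT | STREAM_CLIENT_ASYNC_CONNECT
        );
        if ($fp === false) {
            continue;
        }
        stream_set_blocking($fp, false);
        $pending[$key] = $fp;
        $responses[$key] = '';
    }

    while ($pending || $reading) {
        $read   = array_values($reading);
        $write  = array_values($pending);
        $except = null;
        if (!@stream_select($read, $write, $except, $timeout)) {
            break; // timeout or error: return what we have so far
        }
        foreach ($write as $fp) {
            // Connection established: send the request and switch to reading.
            $key = array_search($fp, $pending, true);
            fwrite($fp, "GET / HTTP/1.0\r\nHost: {$hosts[$key]}\r\n\r\n");
            unset($pending[$key]);
            $reading[$key] = $fp;
        }
        foreach ($read as $fp) {
            $key = array_search($fp, $reading, true);
            $chunk = fread($fp, 8192);
            if ($chunk === '' || $chunk === false) {
                fclose($fp);
                unset($reading[$key]); // connection closed: response complete
            } else {
                $responses[$key] .= $chunk;
            }
        }
    }
    return $responses;
}
```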

More complex still, a system can implement this for input: whenever you're waiting for something external, you can check whether a new web request has come in and start processing that. This can effectively mimic multiple threads, which is how NodeJS is able to handle a large amount of traffic without needing a thread per request.


