Super Fast Getimagesize in PHP


function ranger($url){
    // only request the first 32 KB, which is enough to read the image header
    $headers = array(
        "Range: bytes=0-32768"
    );

    $curl = curl_init($url);
    curl_setopt($curl, CURLOPT_HTTPHEADER, $headers);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
    $data = curl_exec($curl);
    curl_close($curl);
    return $data;
}

$start = microtime(true);

$url = "http://news.softpedia.com/images/news2/Debian-Turns-15-2.jpeg";

$raw = ranger($url);
$im = imagecreatefromstring($raw);

$width = imagesx($im);
$height = imagesy($im);

$stop = round(microtime(true) - $start, 5);

echo $width." x ".$height." ({$stop}s)";

Test output:

640 x 480 (0.20859s)

Loading 32 KB of data worked for me.
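
If GD is not available or you would rather not decode the partial image, the same partial download can be fed to getimagesizefromstring() (PHP 5.4+). A minimal sketch reusing the ranger() function above, with the same test image:

$url = "http://news.softpedia.com/images/news2/Debian-Turns-15-2.jpeg";
$raw = ranger($url);

// the image header bytes sit at the start of the file, so the 32 KB
// partial download is normally enough; false is returned if it is not
$size = getimagesizefromstring($raw);

if ($size !== false) {
    echo $size[0] . " x " . $size[1];
}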

getimagesize very slow on remote image

It depends on the server of each site and how fast it responds to your request. Everything runs at least as fast on the VPS as it does on the shared account, except for getimagesize(), which is vastly slower.
Here is a quick solution (sketched in code after this answer):
1. Fetch the file onto your server temporarily, using file_get_contents or curl.
2. Get the dimensions with getimagesize().
3. Delete the file.
Alternatively, you can also find an API where you submit a link or an image and it returns info plus thumbnails; the info is what you are looking for.
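
A minimal sketch of those three steps, assuming allow_url_fopen is enabled (otherwise fetch with curl as shown above); the URL is a placeholder:

// 1. fetch the remote file into a temporary local copy
$url = 'http://example.com/image.jpg';
$tmp = tempnam(sys_get_temp_dir(), 'imgsize');
file_put_contents($tmp, file_get_contents($url));

// 2. read the dimensions from the local file
$size = getimagesize($tmp);
if ($size !== false) {
    echo $size[0] . ' x ' . $size[1];
}

// 3. delete the temporary file
unlink($tmp);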

PHP getimagesize empty output

Faking the HTTP referer field seems to work on this one:

<?php
function getimgsize($url, $referer = '')
{
    $headers = array(
        'Range: bytes=0-32768'
    );

    /* Hint: you could extract the referer from the url */
    if (!empty($referer)) array_push($headers, 'Referer: '.$referer);

    $curl = curl_init($url);
    curl_setopt($curl, CURLOPT_HTTPHEADER, $headers);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
    $data = curl_exec($curl);
    curl_close($curl);

    $image = imagecreatefromstring($data);

    $return = array(imagesx($image), imagesy($image));

    imagedestroy($image);

    return $return;
}

list($width, $height) = getimgsize('http://cor-forum.de/forum/images/smilies/zombie.png', 'http://cor-forum.de/forum/');

echo $width.' x '.$height;
?>


PHP getimagesize not working on external links

allow_url_fopen is off. Either enable it in your php.ini, or use curl to get the image:

function getImg($url){
    $curl = curl_init($url);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);
    $data = curl_exec($curl);
    curl_close($curl);
    return $data;
}

$url = "http://inspiring-photography.com/wp-content/uploads/2012/04/Culla-bay-rock-formations-Isle-of-Uist.jpg";
$raw = getImg($url);
$im = imagecreatefromstring($raw);
$width = imagesx($im);
$height = imagesy($im);
echo $width." x ".$height;
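
If you would rather keep calling getimagesize() on the URL directly, note that allow_url_fopen has to be switched on in php.ini (it cannot be enabled with ini_set() at runtime). A quick sketch, assuming you control the server configuration:

; php.ini
allow_url_fopen = On

You can verify the setting from PHP:

var_dump(ini_get('allow_url_fopen')); // "1" once URL wrappers are enabled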


php get image width / length with curl? [ without getimagesize() ]

Are you serving these images yourself, on your own server/app?

If so, you could put the width/height information in custom HTTP headers on your images (you will need to serve the images through a PHP script that adds the headers before outputting the image, with good cache settings).

These headers can be based on the width/height calculated at upload time, for example, and stored in the database, so only a single getimagesize() call is ever needed.

After that, just check the headers with curl (without downloading the file), as in the sketch below. If the width/height match your requirements, you can then actually download the image.
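
A rough sketch of that idea. Everything here is an assumption about your setup: a hypothetical image.php serving script, width/height values looked up from your database, and a placeholder URL on the client side:

// image.php: serve the image and expose its dimensions as custom headers
$widthFromDb  = 640;   // looked up once at upload time and stored in the DB
$heightFromDb = 480;
header('Content-Type: image/jpeg');
header('X-Image-Width: ' . $widthFromDb);
header('X-Image-Height: ' . $heightFromDb);
header('Cache-Control: public, max-age=86400');
readfile('/path/to/image.jpg');

// client side: fetch only the headers, no body
$curl = curl_init('http://example.com/image.php?id=123');
curl_setopt($curl, CURLOPT_NOBODY, true);
curl_setopt($curl, CURLOPT_HEADER, true);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
$headers = curl_exec($curl);
curl_close($curl);

if (preg_match('/^X-Image-Width:\s*(\d+)/mi', $headers, $w)
    && preg_match('/^X-Image-Height:\s*(\d+)/mi', $headers, $h)) {
    $width  = (int) $w[1];
    $height = (int) $h[1];
    // only download the full image if the dimensions fit your requirements
}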

php get all the images from url which width and height =200 more quicker

I think what you should do is run the curl requests in parallel using curl_multi_init; please see http://php.net/manual/en/function.curl-multi-init.php for more information. This way the images will load much faster and you avoid the bandwidth issues that can also affect speed.

Save the images into a local temp directory and run getimagesize() on the local files, which is much faster than running it over http://.

I hope this helps

Edit 1

Notes:

A. Not all image URLs start with http

B. Not all images are valid

C. Create the temp folder where the images will be stored

Proof of concept

require 'simple_html_dom.php';
$url = 'http://www.huffingtonpost.com';
$html = file_get_html($url);
$nodes = array();
$start = microtime(true);
$res = array();

if ($html->find('img')) {
    foreach ($html->find('img') as $element) {
        // turn relative src attributes into absolute URLs
        if (startsWith($element->src, "/")) {
            $element->src = $url . $element->src;
        }
        if (!startsWith($element->src, "http")) {
            $element->src = $url . "/" . $element->src;
        }
        $nodes[] = $element->src;
    }
}

echo "<pre>";
print_r(imageDownload($nodes, 200, 200));
echo "<h1>", microtime(true) - $start, "</h1>";

function imageDownload($nodes, $maxHeight = 0, $maxWidth = 0) {

    $mh = curl_multi_init();
    $curl_array = array();
    foreach ($nodes as $i => $url) {
        $curl_array[$i] = curl_init($url);
        curl_setopt($curl_array[$i], CURLOPT_RETURNTRANSFER, true);
        curl_setopt($curl_array[$i], CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.2) Gecko/20090729 Firefox/3.5.2 (.NET CLR 3.5.30729)');
        curl_setopt($curl_array[$i], CURLOPT_CONNECTTIMEOUT, 5);
        curl_setopt($curl_array[$i], CURLOPT_TIMEOUT, 15);
        curl_multi_add_handle($mh, $curl_array[$i]);
    }
    $running = NULL;
    do {
        usleep(10000);
        curl_multi_exec($mh, $running);
    } while ($running > 0);

    $res = array();
    foreach ($nodes as $i => $url) {
        $curlErrorCode = curl_errno($curl_array[$i]);

        if ($curlErrorCode === 0) {
            $info = curl_getinfo($curl_array[$i]);
            $ext = getExtention($info['content_type']);
            if ($info['content_type'] !== null) {
                $temp = "temp/img" . md5(mt_rand()) . $ext;
                touch($temp);
                $imageContent = curl_multi_getcontent($curl_array[$i]);
                file_put_contents($temp, $imageContent);
                if ($maxHeight == 0 || $maxWidth == 0) {
                    $res[] = $temp;
                } else {
                    $size = getimagesize($temp);
                    if ($size[1] >= $maxHeight && $size[0] >= $maxWidth) {
                        $res[] = $temp;
                    } else {
                        unlink($temp);
                    }
                }
            }
        }
        curl_multi_remove_handle($mh, $curl_array[$i]);
        curl_close($curl_array[$i]);
    }

    curl_multi_close($mh);
    return $res;
}

function getExtention($type) {
    $type = strtolower($type);
    switch ($type) {
        case "image/gif":
            return ".gif";
        case "image/png":
            return ".png";
        case "image/jpeg":
            return ".jpg";
        default:
            return ".img";
    }
}

function startsWith($str, $prefix) {
    $temp = substr($str, 0, strlen($prefix));
    $temp = strtolower($temp);
    $prefix = strtolower($prefix);
    return ($temp == $prefix);
}

Output

Array
(
[0] => temp/img8cdd64d686ee6b925e8706fa35968da4.gif
[1] => temp/img5811155f8862cd0c3e2746881df9cd9f.gif
[2] => temp/imga597bf04873859a69373804dc2e2c27e.jpg
[3] => temp/img0914451e7e5a6f4c883ad7845569029e.jpg
[4] => temp/imgb1c8c4fa88d0847c99c6f4aa17a0a457.jpg
[5] => temp/img36e5da68a30df7934a26911f65230819.jpg
[6] => temp/img068c1aa705296b38f2ec689e5b3172b9.png
[7] => temp/imgfbeca2410b9a9fb5c08ef88dacd46895.png
)
0.076347

Thanks
:)

getimagesize() limiting file size for remote URL

You can download the file separately, imposing a maximum size you wish to download:

function mygetimagesize($url, $max_size = -1)
{
    // create temporary file to store data from $url
    if (false === ($tmpfname = tempnam(sys_get_temp_dir(), uniqid('mgis')))) {
        return false;
    }
    // open input and output
    if (false === ($in = fopen($url, 'rb')) || false === ($out = fopen($tmpfname, 'wb'))) {
        unlink($tmpfname);
        return false;
    }
    // copy at most $max_size bytes
    stream_copy_to_stream($in, $out, $max_size);

    // close input and output file
    fclose($in); fclose($out);

    // retrieve image information
    $info = getimagesize($tmpfname);

    // get rid of temporary file
    unlink($tmpfname);

    return $info;
}
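
Usage, with a placeholder URL and a 32 KB cap:

$info = mygetimagesize('http://example.com/photo.jpg', 32768);
if ($info !== false) {
    echo $info[0] . ' x ' . $info[1]; // getimagesize() returns width first, then height
}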

