How to Download a Big File Using PHP (Low Memory Usage)

How to download a big file using PHP (low memory usage)

Copy the file one small chunk at a time

/**
 * Copy a remote file over HTTP one small chunk at a time.
 *
 * @param string $infile  The full URL to the remote file
 * @param string $outfile The path where to save the file
 * @return int|false The number of bytes copied, or false on failure
 */
function copyfile_chunked($infile, $outfile) {
    $chunksize = 10 * (1024 * 1024); // 10 MB

    /**
     * parse_url breaks a URL apart into its parts, i.e. host, path,
     * query string, etc.
     */
    $parts = parse_url($infile);
    $i_handle = fsockopen($parts['host'], 80, $errno, $errstr, 5);
    $o_handle = fopen($outfile, 'wb');

    if ($i_handle === false || $o_handle === false) {
        return false;
    }

    if (!empty($parts['query'])) {
        $parts['path'] .= '?' . $parts['query'];
    }

    /**
     * Send the request to the server for the file.
     */
    $request = "GET {$parts['path']} HTTP/1.1\r\n";
    $request .= "Host: {$parts['host']}\r\n";
    $request .= "User-Agent: Mozilla/5.0\r\n";
    $request .= "Keep-Alive: 115\r\n";
    $request .= "Connection: keep-alive\r\n\r\n";
    fwrite($i_handle, $request);

    /**
     * Now read the headers from the remote server. We'll need
     * the content length to know when the body ends.
     */
    $headers = array();
    while (!feof($i_handle)) {
        $line = fgets($i_handle);
        if ($line == "\r\n") break;
        $headers[] = $line;
    }

    /**
     * Look for the Content-Length header and get the size
     * of the remote file.
     */
    $length = 0;
    foreach ($headers as $header) {
        if (stripos($header, 'Content-Length:') === 0) {
            $length = (int) trim(substr($header, strlen('Content-Length:')));
            break;
        }
    }

    /**
     * Start reading the remote file and writing it to the
     * local file one chunk at a time.
     */
    $cnt = 0;
    while (!feof($i_handle)) {
        $buf = fread($i_handle, $chunksize);
        $bytes = fwrite($o_handle, $buf);
        if ($bytes === false) {
            return false;
        }
        $cnt += $bytes;

        /**
         * We're done reading once we've reached the content length.
         */
        if ($cnt >= $length) break;
    }

    fclose($i_handle);
    fclose($o_handle);
    return $cnt;
}

Adjust the $chunksize variable to your needs. This has only been lightly tested, and it could easily break for a number of reasons: for example, it doesn't handle HTTPS, redirects, or chunked transfer encoding.

Usage:

copyfile_chunked('http://somesite.com/somefile.jpg', '/local/path/somefile.jpg');

How to download large files through PHP script

If you use fopen and fread instead of readfile, that should solve your problem.

There's an example in the PHP manual's readfile() documentation showing how to use fread to do what you want.
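
A minimal sketch of that approach, with a placeholder path and chunk size, might look like this. It streams the file to the client in small pieces instead of loading it all into memory at once:

$file = '/path/to/large-file.zip'; // placeholder path
$chunksize = 1024 * 1024;          // read 1 MB at a time

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Length: ' . filesize($file));

$handle = fopen($file, 'rb');
if ($handle === false) {
    exit('Could not open file');
}
while (!feof($handle)) {
    echo fread($handle, $chunksize);
    flush(); // push each chunk to the client rather than buffering it
}
fclose($handle);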

How to force download of big files without using too much memory?

There are some ideas over in this thread. I don't know if the readfile() method will save memory, but it sounds promising.
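
For reference, the usual readfile()-based pattern looks roughly like this (the path and headers are illustrative). readfile() streams the file to the output buffer in chunks, so the main memory risk is PHP's own output buffering, which is cleared first here:

$file = '/path/to/big-file.zip'; // placeholder path

header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Length: ' . filesize($file));

// Clear any output buffers so readfile() writes straight to the client.
while (ob_get_level()) {
    ob_end_clean();
}
readfile($file);
exit;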

Reading very large files in PHP

Are you sure that it's fopen that's failing and not your script's timeout setting? The default is usually around 30 seconds, and if your file takes longer than that to read in, it may be tripping that up.

Another thing to consider is the memory limit on your script: reading the whole file into an array can exceed it, so check your error log for memory warnings.

If neither of the above is the problem, you might look into using fgets to read the file line by line, processing as you go.

$handle = fopen("/tmp/uploadfile.txt", "r") or die("Couldn't get handle");
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle, 4096);
        // Process the buffer here...
    }
    fclose($handle);
}

Edit

PHP doesn't seem to throw an error; it just returns false.

Is the path to $rawfile correct relative to where the script is running? Perhaps try setting an absolute path here for the filename.
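
A quick sanity check before the fopen() call can confirm that. This is just a sketch, assuming $rawfile is the path variable from the question:

// $rawfile is assumed to be the path from the original question.
$rawfile = __DIR__ . '/uploads/uploadfile.txt'; // example of an absolute path

if (!file_exists($rawfile)) {
    die("File not found: $rawfile");
}
if (!is_readable($rawfile)) {
    die("File is not readable: $rawfile");
}

$handle = fopen($rawfile, 'r');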

Downloading big files and writing them locally

You can use curl and the option CURLOPT_FILE to save the downloaded content directly to a file.

set_time_limit(0);
$fp = fopen('file', 'w+b');
$ch = curl_init('http://remote_url/file');
curl_setopt($ch, CURLOPT_TIMEOUT, 75);
curl_setopt($ch, CURLOPT_FILE, $fp); // write the response body straight to $fp
curl_exec($ch);
curl_close($ch);
fclose($fp);
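
Since curl_exec() simply returns false on failure, it can be worth checking the result before trusting the saved file. A slightly defensive variant of the same snippet (URL and output path are still placeholders) might look like this:

set_time_limit(0);
$fp = fopen('file', 'w+b');
$ch = curl_init('http://remote_url/file');

curl_setopt($ch, CURLOPT_TIMEOUT, 75);
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects to the real file

if (curl_exec($ch) === false) {
    echo 'Download failed: ' . curl_error($ch);
}
curl_close($ch);
fclose($fp);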

