Streaming a Large File Using PHP

Try something like this (source http://teddy.fr/2007/11/28/how-serve-big-files-through-php/):

<?php
define('CHUNK_SIZE', 1024*1024); // Size (in bytes) of each chunk

// Read a file and display its content chunk by chunk
function readfile_chunked($filename, $retbytes = TRUE) {
    $buffer = '';
    $cnt = 0;
    $handle = fopen($filename, 'rb');

    if ($handle === false) {
        return false;
    }

    while (!feof($handle)) {
        $buffer = fread($handle, CHUNK_SIZE);
        echo $buffer;
        ob_flush();
        flush();

        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }

    $status = fclose($handle);

    if ($retbytes && $status) {
        return $cnt; // Return the number of bytes delivered, like readfile() does.
    }

    return $status;
}

// Here goes your code for checking that the user is logged in
// ...
// ...

if ($logged_in) {
    $filename = 'path/to/your/file';
    $mimetype = 'mime/type';
    header('Content-Type: ' . $mimetype);
    readfile_chunked($filename);
} else {
    echo 'Tabatha says you haven\'t paid.';
}
?>
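
One caveat with the flushing approach: flush() only reaches the browser if nothing between PHP and the client is buffering the response, and ob_flush() raises a notice when no output buffer is active. A small defensive sketch (the zlib check only matters if output compression might be enabled on your server):

<?php
// Disable PHP's output compression for this response so flush()
// can actually push each chunk out to the client.
if (ini_get('zlib.output_compression')) {
    ini_set('zlib.output_compression', 'Off');
}

// ob_flush() complains when no output buffer is active, so guard it.
if (ob_get_level() > 0) {
    ob_flush();
}
flush();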

Google Drive PHP API - How to stream a large file

Yes, it is possible to specify how much of the file to get by requesting it a chunk at a time. But I'm not sure if you can actually read what is in the file until it's fully downloaded.

Google drive SDK download files - Partial download

Partial download

Partial download involves downloading only a specified portion of a file. You 
can specify the portion of the file you want to download by using a byte range
with the Range header. For example:
Range: bytes=500-999

I did a quick scan of the PHP client lib and I'm not sure that it supports this. It may be something that needs to be added to the client lib, or something that you will have to code on your own without using the client lib.
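
If you do end up coding it yourself, a raw cURL request with a Range header is enough to fetch one chunk. A minimal sketch, where $downloadUrl and $accessToken are hypothetical placeholders for the file's download URL and a valid OAuth token obtained elsewhere:

<?php
// Hypothetical placeholders -- obtain these from the Drive API yourself.
$downloadUrl = 'https://example.com/path/to/file';
$accessToken = 'your-oauth-access-token';

$ch = curl_init($downloadUrl);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, array(
    'Authorization: Bearer ' . $accessToken,
    'Range: bytes=500-999', // the byte range from the docs above
));
$chunk  = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE); // expect 206 Partial Content
curl_close($ch);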

Reading very large files in PHP

Are you sure that it's fopen() that's failing and not your script's timeout setting? The default is usually around 30 seconds, and if your file takes longer than that to read in, it may be tripping that up.

Another thing to consider may be the memory limit on your script: reading the file into an array may trip over this, so check your error log for memory warnings.
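
Both limits are easy to inspect, and to raise for a single request if one of them turns out to be the problem; a quick sketch (pick values that fit your workload):

<?php
echo ini_get('max_execution_time') . "\n"; // script timeout, in seconds
echo ini_get('memory_limit') . "\n";       // e.g. "128M"

set_time_limit(300);             // allow this request up to 5 minutes
ini_set('memory_limit', '256M'); // raise the ceiling for large reads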

If neither of the above is your problem, you might look into using fgets() to read the file line by line, processing as you go:

$handle = fopen("/tmp/uploadfile.txt", "r") or die("Couldn't get handle");
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle, 4096);
        // Process buffer here..
    }
    fclose($handle);
}

Edit

PHP doesn't seem to throw an error; it just returns false.

Is the path to $rawfile correct relative to where the script is running? Perhaps try setting an absolute path for the filename.
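
A few quick checks before the fopen() call will usually show which assumption is wrong; a sketch, with $rawfile standing in for the path from the question:

<?php
$rawfile = '/tmp/uploadfile.txt'; // substitute the path you are testing

var_dump(getcwd());              // directory relative paths resolve against
var_dump(file_exists($rawfile)); // does the path exist at all?
var_dump(is_readable($rawfile)); // can the PHP process read it?
var_dump(realpath($rawfile));    // the resolved absolute path, or false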

Streaming Large Files from external Web Service through PHP

Well, I feel silly. This turned out to be just another PHP-ism. Apparently, even though I was flushing the output buffer with ob_flush(), which (I thought) should have been sending the headers and chunks to the browser, the headers and output weren't actually getting flushed to the browser until the script finished.

Even though the output buffer itself was getting flushed, you still have to explicitly flush the write buffers of PHP and the web server back to the client. Not doing this led to the memory expansion, and to the download prompt not showing until the entire download had completed.

Here is a version of the working method:

public function download()
{
    $file_info ... // Assume init'ed from WS or DB

    // Allow for a long-running process
    set_time_limit(0);

    // File info
    $filesize = $file_info->get_filesize();
    $fileid   = $file_info->get_id();
    $filename = $file_info->get_name();
    $offset    = 0;
    $chunksize = (1024 * 1024);

    // Clear any previous data
    ob_clean();
    ob_start();

    // Output headers to notify the browser it's a download
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . $filesize);
    header('Content-Disposition: attachment; filename="' . $filename . '"');

    while ($offset < $filesize)
    {
        // Retrieve chunk from the service
        $chunk = $this->dl_service()->download_chunked_file($fileid, $offset, $chunksize);
        if ($chunk)
        {
            // Immediately echo out the stream
            $chunk->render();
            // NOTE: The order of flushing IS IMPORTANT
            // Flush the data to the output buffer
            ob_flush();
            // Flush the write buffer directly to the browser
            flush();
            // Clean up and prepare the next request
            $offset += $chunksize;
            unset($chunk);
        }
    }
    // Exit the script immediately to prevent other output from corrupting the file
    exit(0);
}

Serving large files with PHP

You don't need to read the whole thing into memory: just loop, reading it in, say, 32 KB chunks, and sending each chunk as output. Better yet, use fpassthru(), which does much the same thing for you....

$name = 'mybigfile.zip';
$fp = fopen($name, 'rb');

// send the right headers
header("Content-Type: application/zip");
header("Content-Length: " . filesize($name));

// dump the file and stop the script
fpassthru($fp);
exit;

Even fewer lines if you use readfile(), which doesn't need the fopen() call...

$name = 'mybigfile.zip';

// send the right headers
header("Content-Type: application/zip");
header("Content-Length: " . filesize($name));

// dump the file and stop the script
readfile($name);
exit;

If you want to get even cuter, you can support the Range request header (answering with Content-Range), which lets clients request a particular byte range of your file. This is particularly useful for serving PDF files to Adobe Acrobat, which requests just the chunks of the file it needs to render the current page. It's a bit involved, but see this for an example.
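
For reference, here is a minimal sketch of what honoring a Range request involves. It assumes a well-formed, single-range header only; production code should also handle malformed headers, unsatisfiable ranges, and multi-range requests:

<?php
$name = 'mybigfile.zip';
$size = filesize($name);
$fp   = fopen($name, 'rb');

$start = 0;
$end   = $size - 1;

// Parse a simple single-range header like "Range: bytes=500-999"
if (isset($_SERVER['HTTP_RANGE']) &&
    preg_match('/bytes=(\d+)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
    $start = (int) $m[1];
    if ($m[2] !== '') {
        $end = (int) $m[2];
    }
    header('HTTP/1.1 206 Partial Content');
    header("Content-Range: bytes $start-$end/$size");
}

header('Content-Type: application/zip');
header('Content-Length: ' . ($end - $start + 1));

// Stream only the requested slice
fseek($fp, $start);
$remaining = $end - $start + 1;
while ($remaining > 0 && !feof($fp)) {
    $buffer = fread($fp, min(8192, $remaining));
    echo $buffer;
    $remaining -= strlen($buffer);
}
fclose($fp);
exit;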


