Fastest Way to Serve a File Using PHP

My previous answer was partial and not well documented. Here is an update with a summary of the solutions from it and from others in the discussion.

The solutions are ordered from best to worst, but also from the one needing the most control over the web server to the one needing the least. There doesn't seem to be an easy way to have one solution that is both fast and works everywhere.


Using the X-SendFile header

As documented by others, it's actually the best way. The principle is that you do your access control in PHP and then, instead of sending the file yourself, you tell the web server to do it.

The basic PHP code is:

header("X-Sendfile: $file_name");
header("Content-type: application/octet-stream");
header('Content-Disposition: attachment; filename="' . basename($file_name) . '"');

Where $file_name is the full path on the file system.
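Put together, a minimal sketch might look like this (user_can_access() and $current_user are hypothetical stand-ins for your own access-control logic):

<?php
// Access control happens in PHP... (user_can_access() is a placeholder)
if (!user_can_access($current_user, $file_name)) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}

// ...then the web server does the actual sending.
header("X-Sendfile: $file_name");
header("Content-Type: application/octet-stream");
header('Content-Disposition: attachment; filename="' . basename($file_name) . '"');
exit;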

The main problem with this solution is that it needs to be allowed by the web server, and either isn't installed by default (Apache), isn't active by default (Lighttpd), or needs a specific configuration (Nginx).

Apache

Under Apache, if you use mod_php, you need to install a module called mod_xsendfile and then configure it (either in the Apache config or in .htaccess, if you allow it):

XSendFile on
XSendFilePath /home/www/example.com/htdocs/files/

With this module the file path can be either absolute or relative to the specified XSendFilePath.

Lighttpd

mod_fastcgi supports this when configured with:

"allow-x-send-file" => "enable" 

The documentation for the feature is on the Lighttpd wiki; it documents the X-LIGHTTPD-send-file header, but the X-Sendfile name also works.

Nginx

On Nginx you can't use the X-Sendfile header; you must use its own header, named X-Accel-Redirect. It is enabled by default, and the only real difference is that its argument should be a URI, not a file system path. The consequence is that you must define a location marked as internal in your configuration, to avoid clients finding the real file URL and going to it directly; the Nginx wiki contains a good explanation of this.
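A minimal sketch, assuming the protected files live under /var/www/protected/ (both the location name and the path are placeholders). In the Nginx configuration:

location /protected/ {
    internal;
    alias /var/www/protected/;
}

And in PHP, after your access check, you send the URI rather than the file system path:

// $file_name here is relative to the alias directory above
header("X-Accel-Redirect: /protected/" . $file_name);
header("Content-Type: application/octet-stream");
header('Content-Disposition: attachment; filename="' . basename($file_name) . '"');
exit;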

Symlinks and Location header

You could use symlinks and redirect to them: just create symlinks to your file with random names when a user is authorized to access it, and redirect the user to the symlink using:

header("Location: " . $url_of_symlink);

Obviously, you'll need a way to prune them, either when the script that creates them is called or via cron (on the machine, if you have access; via some webcron service otherwise).
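A rough sketch of both halves, creation and pruning (the paths and the one-hour window are assumptions, not fixed values):

// After the access check passes: create an unguessable symlink and redirect.
$token = bin2hex(random_bytes(16));  // PHP 7+; use another CSPRNG on older versions
$link  = '/var/www/example.com/htdocs/tmp/' . $token;
symlink($file_name, $link);
header('Location: http://example.com/tmp/' . $token);
exit;

// Pruning pass: remove symlinks older than an hour (run from cron or on creation).
foreach (glob('/var/www/example.com/htdocs/tmp/*') as $old) {
    if (is_link($old) && lstat($old)['mtime'] < time() - 3600) {
        unlink($old);
    }
}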

Under Apache you need to be able to enable FollowSymLinks in a .htaccess file or in the Apache config.

Access control by IP and Location header

Another hack is to generate Apache access files from PHP, allowing the explicit user IP. Under Apache this means using mod_authz_host (mod_access) Allow from directives.

The problem is that locking writes to the access file (as multiple users may want to do this at the same time) is non-trivial and could lead to some users waiting a long time. And you still need to prune the file anyway.

Obviously another problem would be that multiple people behind the same IP could potentially access the file.
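If you try it anyway, here is a sketch of the writing side, using flock() to serialize concurrent writers (the path is an assumption, and the directory is assumed to deny everyone by default):

$htaccess = '/var/www/example.com/htdocs/files/.htaccess';
$fp = fopen($htaccess, 'a');
if ($fp) {
    flock($fp, LOCK_EX);  // serialize concurrent writers
    // Grant this user's IP access; the directory otherwise denies everyone.
    fwrite($fp, "Allow from " . $_SERVER['REMOTE_ADDR'] . "\n");
    flock($fp, LOCK_UN);
    fclose($fp);
}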

When everything else fails

If you really don't have any way to get your web server to help you, the only solution remaining is readfile. It's available in all PHP versions currently in use and works pretty well (but isn't really efficient).


Combining solutions

In the end, the best way to send a file really fast, if you want your PHP code to be usable everywhere, is to have a configurable option somewhere, with instructions on how to activate it depending on the web server, and maybe auto-detection in your install script.

It is pretty similar to what is done in a lot of software for:

  • Clean urls (mod_rewrite on apache)
  • Crypto functions (mcrypt php module)
  • Multibyte string support (mbstring php module)

Serving large files with PHP

You don't need to read the whole thing - just enter a loop reading it in, say, 32 KB chunks and sending each chunk as output. A sketch of that manual loop:
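$name = 'mybigfile.zip';
$fp = fopen($name, 'rb');

// send the right headers
header("Content-Type: application/zip");
header("Content-Length: " . filesize($name));

// stream the file out 32 KB at a time
while (!feof($fp)) {
    echo fread($fp, 32 * 1024);
    flush();
}
fclose($fp);
exit;

Better yet, use fpassthru, which does much the same thing for you: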

$name = 'mybigfile.zip';
$fp = fopen($name, 'rb');

// send the right headers
header("Content-Type: application/zip");
header("Content-Length: " . filesize($name));

// dump the file and stop the script
fpassthru($fp);
exit;

Even fewer lines if you use readfile, which doesn't need the fopen call...

$name = 'mybigfile.zip';

// send the right headers
header("Content-Type: application/zip");
header("Content-Length: " . filesize($name));

// dump the file and stop the script
readfile($name);
exit;

If you want to get even cuter, you can support the Range request header, which lets clients ask for a particular byte range of your file (you answer with a 206 status and a Content-Range header). This is particularly useful for serving PDF files to Adobe Acrobat, which requests just the chunks of the file it needs to render the current page. It's a bit involved, but see this for an example.
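To give the flavor, here is a minimal sketch of single-range support (it ignores suffix ranges like bytes=-500, multiple ranges, and out-of-bounds requests, all of which real code should handle):

$size  = filesize($name);
$start = 0;
$end   = $size - 1;

if (isset($_SERVER['HTTP_RANGE'])
        && preg_match('/bytes=(\d+)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
    $start = (int) $m[1];
    if ($m[2] !== '') {
        $end = min((int) $m[2], $size - 1);
    }
    header('HTTP/1.1 206 Partial Content');
    header("Content-Range: bytes $start-$end/$size");
}

header('Accept-Ranges: bytes');
header('Content-Type: application/zip');
header('Content-Length: ' . ($end - $start + 1));

// Seek to the start of the range and stream just that window.
$fp = fopen($name, 'rb');
fseek($fp, $start);
$left = $end - $start + 1;
while ($left > 0 && !feof($fp)) {
    $chunk = fread($fp, min(32 * 1024, $left));
    echo $chunk;
    $left -= strlen($chunk);
}
fclose($fp);
exit;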

Fastest way possible to read contents of a file

If you want to load the full contents of a file into a PHP variable, the easiest (and probably fastest) way is file_get_contents.

But if you are working with big files, loading the whole file into memory might not be such a good idea: you'll probably end up with a memory_limit error, as PHP will not allow your script to use more than (usually) a couple of megabytes of memory.



So, even if it's not the fastest solution, reading the file line by line (fopen + fgets + fclose) and working with those lines on the fly, without loading the whole file into memory, might be necessary...
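A sketch of that loop, holding only one line in memory at a time (the filename is a placeholder):

$fp = fopen('mybigfile.txt', 'r');
if ($fp) {
    while (($line = fgets($fp)) !== false) {
        // work with $line here, one line at a time
    }
    fclose($fp);
}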

Most efficient way to serve files through a PHP server from Google Cloud Storage?

Using downloadAsStream() and stream_copy_to_stream() should give you the best results. E.g.:

if ($image->exists()) {
    header("Content-Type: image/jpeg");
    // STDOUT is only defined under the CLI SAPI; for web requests,
    // open php://output explicitly.
    stream_copy_to_stream($image->downloadAsStream(), fopen('php://output', 'wb'));
}

In this case PHP should copy the file data directly to output as it is retrieved, with maximal efficiency, memory-wise and otherwise.

If you scroll up a bit in the source, downloadAsString() just reads that same stream and returns the whole thing as a string, which crams all the file data into memory.

Also, for best results, grab the StorageObject metadata and set a Content-Length: header. This generally makes HTTP clients behave a bit better. I believe this will be in what's returned by $image->info().
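Something like this, assuming the object size is exposed under the 'size' key of the metadata array:

$info = $image->info();
if (isset($info['size'])) {
    header('Content-Length: ' . $info['size']);
}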

Serving downloadable files with php

1. Since you have to process the file with this script, it requires more resources than a plain download link. However, this depends on your needs. If you think these files need more security - say, only authenticated users can download a file, and only the files that belong to them - then you need to validate those conditions, and in such a situation you need the code you have put in your question. If your files are open to the public, then you can display a direct link to the file, perhaps by locating them somewhere public temporarily.

2. I can suggest two methods to do this.

Method 1:

You need JavaScript support to handle this kind of requirement in a handy way. Assume you need to display some HTML on the page where the download is offered. You can create a page with the HTML you want and put a download button on it:

<input type="button" name="cmdDownload" id="cmdDownload" value="Download" onclick="downloadFile('<?php echo $pathToTheFile; ?>');" />

And you can keep a hidden iframe to process the download:

<iframe id="downloadFrame" style="display:none"></iframe>

Assume your PHP download page is download.php.

Then you can have a JavaScript function like this:

<script type="text/javascript">
function downloadFile(filepath)
{
    var ifrme = document.getElementById("downloadFrame");
    // Encode the path so special characters survive the query string.
    ifrme.src = "download.php?filepath=" + encodeURIComponent(filepath);
}
</script>

Method 2:

Other than the above method, you can use a META refresh as well:

<meta http-equiv="Refresh" content="3;URL=<?php echo $fullHTTPathToYourFile ?>" />

You can have HTML displayed with this too.

Is there any significant performance hit for serving files with PHP

The more processes involved, the more performance will suffer. So you can expect some performance hit, but how much, you will need to measure; then decide whether it is worth it for your auth checks. In my experience, the cost is marginal.

One thing - don't forget scaling performance: when you're tying up your PHP processes streaming files, you're reducing the total number of processes available to serve other requests.

If you're worried about scale and performance, do everything you can to serve this content upstream. For example:

  1. Perform the auth check in PHP, then issue a redirect to a CDN with a sufficiently large keyspace (e.g. UUIDs) - you might have to rotate files in this keyspace periodically if you're worried about people reusing these URLs (see the sketch after this list).
  2. Require that auth has already been performed and have the load balancers check the auth tokens against an IdP.
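A sketch of option 1 (user_is_authorized(), the CDN base URL, and $file_uuid are all placeholders for your own setup):

// Hypothetical auth check -- replace with your own logic.
if (!user_is_authorized()) {
    http_response_code(403);
    exit;
}

// The file was previously uploaded to the CDN under an unguessable UUID key.
header('Location: https://cdn.example.com/files/' . $file_uuid, true, 302);
exit;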

When you implement it in PHP, make sure to use something like readfile with output buffering disabled. Otherwise, you're increasing the memory footprint of your web service process by the size of the content, which could cause out-of-memory errors.
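For instance (a sketch; $path stands in for the file you are serving):

// Drop any active output buffers so readfile() streams straight to the client
// instead of accumulating the whole file in a buffer.
while (ob_get_level() > 0) {
    ob_end_clean();
}
header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
readfile($path);
exit;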

What is the best method to serve download files via a download link?

It's generally best to let the web server handle sending static files.

Use X-Sendfile for Apache or Lighttpd (X-Accel-Redirect on Nginx). You can use PHP for auth, then send the X-Sendfile header; the script will terminate, and the web server will handle sending the file. The end user will never know where the file is on the server.

What is an efficient way to serve images (PHP)?

The most obvious efficient way is to forgo PHP and let the images be served directly by Apache, with caching. Is there any reason you have to keep track of images this way that you couldn't satisfy by just parsing access logs after the fact? And why no caching? I assume a new image gets a new unique ID, not a rehashed old one?

That being said: if you do need database retrieval and serving via PHP, this is about as efficient as you can get, although be very wary of folders named after usernames - a possible security issue. Maybe cache the query result somewhere like memcached; that's about it.

Serve static files without hard coding all file types header information

Turns out it was a simple fix. I just changed the "redirect everything to router.php" rule into another line: RewriteRule !\.(js|css|ico|gif|jpg|png)$ router.php [L]

This checks whether the requested file is not a .js, .css, .ico, etc. file; if it isn't, the request is redirected to router.php, otherwise the file is served normally. I think my question was not clear enough for other people to understand, so I hope this answer clears things up.
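In context, the relevant .htaccess block ends up looking something like this (a sketch; adjust the extension list to your own static assets):

RewriteEngine on
# Let real static assets through; send everything else to the router
RewriteRule !\.(js|css|ico|gif|jpg|png)$ router.php [L]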

Big shoutout to this link, which made everything right: rewrite rule in .htaccess: what does it do


