Large File Upload Through HTML Form (More Than 2 GB)

Large file upload through HTML form (more than 2 GB)

The limit on the size of HTTP POST requests is usually not on the HTML side at all; it is on the server side. The webserver needs to be configured to accept such large POST requests. The cap is often around 2 GB by default, and the server will typically return an HTTP 500 error when it is exceeded. The limit can often be raised to 4 GB, but anything beyond that hits the boundary of 32-bit systems. On 64-bit systems with a 64-bit OS, the theoretical limit is much higher, around 16 EB.
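As an illustration of that server-side configuration, a hedged example for nginx (the directive is real, the value is illustrative; Apache, IIS, and application servers have their own equivalents):

    # In the http, server, or location block of nginx.conf:
    client_max_body_size 4g;   # raise the request body limit; a value of 0 disables the check entirely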

If configuring the webserver to accept such large POST requests is not an option, or when you want to go beyond the webserver's limit, then you have no other option than splitting the file on the client side and reassembling the parts on the server side.
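If you do go the split-and-reassemble route, the server side can be as simple as appending chunks in order. A minimal sketch, assuming a Node/Express endpoint; the route, query parameters, and upload directory are illustrative, not a standard:

    import express from "express";
    import { promises as fs } from "fs";
    import path from "path";

    const app = express();
    const UPLOAD_DIR = "/tmp/uploads"; // assumed scratch directory

    // Each chunk is POSTed as a raw octet stream, with the file id and chunk index in the query string.
    app.post(
      "/upload-chunk",
      express.raw({ type: "application/octet-stream", limit: "20mb" }),
      async (req, res) => {
        const fileId = path.basename(String(req.query.fileId)); // basename() guards against path traversal
        const index = Number(req.query.index);
        await fs.mkdir(UPLOAD_DIR, { recursive: true });
        // The client sends chunks sequentially (0, 1, 2, ...), so appending preserves the original order.
        await fs.appendFile(path.join(UPLOAD_DIR, fileId), req.body);
        console.log(`Stored chunk ${index} of ${fileId}`);
        res.sendStatus(204);
      }
    );

    app.listen(3000);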

Since HTML is just a markup language, it offers no facilities for splitting the file. You really have to use a normal programming language like C# (Silverlight) or Java (Applet) in the form of a small application which you serve with your webpage. It may also be possible with Flash or Flex, but don't hold me to that, since I use neither.

That said, FTP is a much better choice than HTTP for transferring (large) files over a network. I'd reconsider the choice of using HTTP for that.

Large file upload through Browser (100 GB)

You can use the JavaScript Blob object to slice large files into smaller chunks and transfer these to the server to be merged together. This has the added benefit of being able to pause/resume uploads and indicate progress.
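A minimal sketch of that slicing, assuming a hypothetical /upload-chunk endpoint on the server that appends chunks in order; the chunk size and naming scheme are arbitrary:

    async function uploadInChunks(file: File, chunkSize = 10 * 1024 * 1024): Promise<void> {
      // Slice the File (a Blob) into fixed-size chunks; slice() is lazy and does not read the file into memory.
      const totalChunks = Math.ceil(file.size / chunkSize);
      const fileId = `${file.name}-${file.size}`; // naive identifier; a real app would use something sturdier

      for (let index = 0; index < totalChunks; index++) {
        const start = index * chunkSize;
        const chunk = file.slice(start, start + chunkSize);
        const response = await fetch(
          `/upload-chunk?fileId=${encodeURIComponent(fileId)}&index=${index}`,
          { method: "POST", headers: { "Content-Type": "application/octet-stream" }, body: chunk }
        );
        if (!response.ok) throw new Error(`Chunk ${index} failed with status ${response.status}`);
        console.log(`Progress: ${Math.round(((index + 1) / totalChunks) * 100)}%`); // progress indication
      }
    }

Pausing and resuming then amounts to remembering the last successfully acknowledged chunk index and restarting the loop from there.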

If you don't fancy doing it yourself, there are existing solutions that use this approach. One example is HTML5 Uploader by Filkor.

Allow more than 2 GB file upload in Struts2

You should migrate to the latest version of Struts2.

From 2.3.20 and above, a new MultiPartRequest implementation can be used to upload large files:

Alternate Libraries


> The struts.multipart.parser used by the fileUpload interceptor to handle HTTP POST requests, encoded using the MIME-type multipart/form-data, can be changed out. Currently there are two choices, jakarta and pell. The jakarta parser is a standard part of the Struts 2 framework needing only its required libraries added to a project. The pell parser uses Jason Pell's multipart parser instead of the Commons-FileUpload library. The pell parser is a Struts 2 plugin, for more details see: http://cwiki.apache.org/S2PLUGINS/pell-multipart-plugin.html. There was a third alternative, cos, but it was removed due to licensing incompatibilities.

> As from Struts version 2.3.18 a new implementation of MultiPartRequest was added - JakartaStreamMultiPartRequest. It can be used to handle large files, see WW-3025 for more details, but you can simply set

<constant name="struts.multipart.parser" value="jakarta-stream" />

> in struts.xml to start using it.
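Note that the multipart handling also enforces its own maximum request size (struts.multipart.maxSize, which defaults to roughly 2 MB), so you will most likely need to raise that constant as well. A hedged example in struts.xml, with an arbitrary value:

    <constant name="struts.multipart.maxSize" value="4294967296" /> <!-- bytes; adjust to your needs -->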

What is the best approach to handle large file uploads in a rails app?

I've dealt with this issue on several sites, using a few of the techniques you've illustrated above and a few that you haven't. The good news is that it is actually pretty realistic to allow massive uploads.

A lot of this depends on what you actually plan to do with the file after you have uploaded it... The more work you have to do on the file, the closer you are going to want it to your server. If you need to do immediate processing on the upload, you probably want to do a pure rails solution. If you don't need to do any processing, or it is not time-critical, you can start to consider "hybrid" solutions...

Believe it or not, I've actually had pretty good luck just using mod_porter. Mod_porter makes Apache do a bunch of the work that your app would normally do. It helps avoid tying up a thread and a bunch of memory during the upload. It results in a file local to your app, for easy processing. If you pay attention to the way you are processing the uploaded files (think streams), you can make the whole process use very little memory, even for what would traditionally be fairly expensive operations. This approach requires very little actual setup in your app to get working, and no real modification to your code, but it does require a particular environment (Apache server), as well as the ability to configure it.

I've also had good luck using jQuery-File-Upload, which supports good stuff like chunked and resumable uploads. Without something like mod_porter, this can still tie up an entire thread of execution during upload, but it should be decent on memory, if done right. This also results in a file that is "close" and, as a result, easy to process. This approach will require adjustments to your view layer to implement, and will not work in all browsers.
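As a rough sketch of that view-layer wiring, assuming the blueimp jQuery-File-Upload plugin is on the page and a hypothetical /uploads endpoint exists in your routes (the element ids are made up):

    declare const $: any; // jQuery, loaded separately on the page

    $("#fileupload").fileupload({
      url: "/uploads",
      maxChunkSize: 10 * 1024 * 1024, // send the file in 10 MB chunks so individual requests stay small
      progressall: (_e: unknown, data: { loaded: number; total: number }) => {
        $("#progress").text(Math.round((data.loaded / data.total) * 100) + "%");
      },
      done: () => $("#progress").text("Upload complete"),
      fail: () => $("#progress").text("Upload failed"),
    });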

You mentioned FTP and BitTorrent as possible options. These are not as bad options as you might think, as you can still get the files pretty close to the server. They are not even mutually exclusive, which is nice, because (as you pointed out) they do require an additional client that may or may not be present on the uploading machine. The way this works is, basically, you set up an area for them to dump to that is visible to your app. Then, if you need to do any processing, you run a cron job (or whatever) to monitor that location for uploads and trigger your server's processing method. This does not get you the immediate response the methods above can provide, but you can set the interval to be small enough to get pretty close. The only real advantage to this method is that the protocols used are better suited to transferring large files; in my experience, the additional client requirement and fragmented process usually outweigh any benefits from that.

If you don't need any processing at all, your best bet may be to simply go straight to S3 with them. This solution falls down the second you actually need to do anything with the files other than serve them as static assets...
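If you do go that route, the browser-side piece can be as small as a PUT to a presigned URL. A sketch, assuming your app exposes a hypothetical /presigned_url action that returns the signed URL for the object (generating that URL server-side is up to you):

    async function uploadToS3(file: File): Promise<void> {
      // The /presigned_url endpoint is an assumption about your app, not an AWS API.
      const res = await fetch(`/presigned_url?filename=${encodeURIComponent(file.name)}`);
      const { url } = await res.json(); // presigned PUT URL generated server-side with the AWS SDK
      const put = await fetch(url, {
        method: "PUT",
        headers: { "Content-Type": file.type || "application/octet-stream" },
        body: file,
      });
      if (!put.ok) throw new Error(`S3 upload failed with status ${put.status}`);
    }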

I do not have any experience using the HTML5 FileSystemAPI in a rails app, so I can't speak to that point, although it seems that it would significantly limit the clients you are able to support.

Unfortunately, there is not one real silver bullet - all of these options need to be weighed against your environment in the context of what you are trying to accomplish. You may not be able to configure your web server or permanently write to your local file system, for example. For what it's worth, I think jQuery-File-Upload is probably your best bet in most environments, as it only really requires modification to your application, so you could move an implementation to another environment most easily.

Browser, upload large file

There are several ways to handle this:

1. Flash Uploader

There are plenty of Flash uploaders that improve the user's GUI, so that they can monitor the upload and its progress factors, such as time left, KB done, etc.

This is very good if you know how to modify the Flash source code for later development.

2. Ajax

There are a few ways using Ajax and PHP (although PHP does not support upload-progress tracking natively); you can use the PECL uploadprogress extension to accomplish it: http://pecl.php.net/package/uploadprogress. This is only needed if you wish to show percentage information, etc.

3. Basic JavaScript

This method would be just the regular form, but with some Ajax styling so that when the form is submitted, you can show a basic loader saying "please wait while you send us the file..."
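For what it's worth, that can be as little as a few lines of script. A sketch with made-up element ids:

    // Show a "please wait" message while the normal form POST proceeds.
    const form = document.querySelector("#upload-form") as HTMLFormElement;
    form.addEventListener("submit", () => {
      const status = document.querySelector("#status") as HTMLElement;
      status.textContent = "Please wait while you send us the file...";
      // The browser continues the regular form submission; this only reassures the user.
    });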

If you're using ASP.NET, you can take a look at: http://neatupload.codeplex.com/

Hope there's some good information here to get you on your way.

Regards


