Uploading and Downloading Large Files in ASP.NET Core 3.1

Uploading and Downloading large files in ASP.NET Core 3.1?

If you have files that large, never load them into a byte[] or a MemoryStream. Operate only on streams when you download or upload files.

You have a couple of options:

  • If you control both client and server, consider using something like tus. There are both client and server implementations for .NET. This would probably be the easiest and most robust option.
  • If you upload large files with the HttpClient, simply use the StreamContent class to send them. Again, don't use a MemoryStream as the source, but something else, like a FileStream.
  • If you download large files with the HttpClient, it is important to specify HttpCompletionOption, for example var response = await httpClient.SendAsync(httpRequest, HttpCompletionOption.ResponseHeadersRead). Otherwise, the HttpClient would buffer the entire response in memory. You can then process the response file as a stream via var stream = await response.Content.ReadAsStreamAsync(). A sketch of both patterns follows this list.
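
As a rough sketch of both directions (the endpoint URL and file paths are placeholders, and the code is assumed to run inside an async method), streaming with HttpClient could look like this:

// Hypothetical endpoint and paths; error handling omitted for brevity.
using var httpClient = new HttpClient();

// Upload: stream straight from disk, never via a byte[] or MemoryStream.
await using (var fileStream = File.OpenRead(@"C:\temp\huge.bin"))
{
    using var content = new StreamContent(fileStream);
    using var response = await httpClient.PostAsync("https://example.com/files", content);
    response.EnsureSuccessStatusCode();
}

// Download: ResponseHeadersRead keeps the body from being buffered in memory.
using (var response = await httpClient.GetAsync(
    "https://example.com/files/huge.bin", HttpCompletionOption.ResponseHeadersRead))
{
    response.EnsureSuccessStatusCode();
    await using var body = await response.Content.ReadAsStreamAsync();
    await using var target = File.Create(@"C:\temp\huge-copy.bin");
    await body.CopyToAsync(target);
}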

ASP.NET Core specific advice:

  • If you want to receive files via HTTP POST, you need to increase the request size limit: [RequestSizeLimit(10L * 1024L * 1024L * 1024L)] and [RequestFormLimits(MultipartBodyLengthLimit = 10L * 1024L * 1024L * 1024L)]. In addition, you need to disable the form value binding, otherwise the whole request will be buffered into memory:
    [AttributeUsage(AttributeTargets.Class | AttributeTargets.Method)]
    public class DisableFormValueModelBindingAttribute : Attribute, IResourceFilter
    {
        public void OnResourceExecuting(ResourceExecutingContext context)
        {
            // Remove the form value providers so the request body is not buffered
            var factories = context.ValueProviderFactories;
            factories.RemoveType<FormValueProviderFactory>();
            factories.RemoveType<FormFileValueProviderFactory>();
            factories.RemoveType<JQueryFormValueProviderFactory>();
        }

        public void OnResourceExecuted(ResourceExecutedContext context)
        {
        }
    }
  • To return a file from a controller, simply return it via the File method, which accepts a stream: return File(stream, mimeType, fileName); (see the sketch below)
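
For illustration, a minimal download action could look like the following sketch (the path, route, and file name are placeholders; enableRangeProcessing additionally lets clients resume interrupted downloads):

[HttpGet("download")]
public IActionResult DownloadFile()
{
    // Open the file for asynchronous, read-only, shared access.
    var stream = new FileStream(@"C:\data\huge.bin", FileMode.Open, FileAccess.Read,
        FileShare.Read, bufferSize: 64 * 1024, useAsync: true);

    // MVC disposes the stream after the response has been written.
    return File(stream, "application/octet-stream", "huge.bin", enableRangeProcessing: true);
}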

A sample controller would look like this (see https://learn.microsoft.com/en-us/aspnet/core/mvc/models/file-uploads?view=aspnetcore-3.1 for the missing helper classes):

private const long MaxFileSize = 10L * 1024L * 1024L * 1024L; // 10 GB, adjust to your needs

[DisableFormValueModelBinding]
[RequestSizeLimit(MaxFileSize)]
[RequestFormLimits(MultipartBodyLengthLimit = MaxFileSize)]
public async Task ReceiveFile()
{
    if (!MultipartRequestHelper.IsMultipartContentType(Request.ContentType))
        throw new BadRequestException("Not a multipart request");

    var boundary = MultipartRequestHelper.GetBoundary(MediaTypeHeaderValue.Parse(Request.ContentType));
    var reader = new MultipartReader(boundary, Request.Body);

    // note: this is for a single file, you could also process multiple files
    var section = await reader.ReadNextSectionAsync();

    if (section == null)
        throw new BadRequestException("No sections in multipart defined");

    if (!ContentDispositionHeaderValue.TryParse(section.ContentDisposition, out var contentDisposition))
        throw new BadRequestException("No content disposition in multipart defined");

    var fileName = contentDisposition.FileNameStar.ToString();
    if (string.IsNullOrEmpty(fileName))
    {
        fileName = contentDisposition.FileName.ToString();
    }

    if (string.IsNullOrEmpty(fileName))
        throw new BadRequestException("No filename defined.");

    using var fileStream = section.Body;
    await SendFileSomewhere(fileStream);
}

// This should probably not be inside the controller class
private async Task SendFileSomewhere(Stream stream)
{
    using var request = new HttpRequestMessage()
    {
        Method = HttpMethod.Post,
        RequestUri = new Uri("YOUR_DESTINATION_URI"),
        Content = new StreamContent(stream),
    };
    using var response = await _httpClient.SendAsync(request);
    // TODO check response status etc.
}

In this example, we stream the entire file to another service. In some cases, it would be better to save the file temporarily to disk.
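
A minimal sketch of that variant, reusing the section.Body stream from the sample above (the temp location and buffer size are arbitrary choices):

// Write the multipart section to a temporary file instead of forwarding it.
var tempPath = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());

await using (var targetStream = new FileStream(
    tempPath, FileMode.Create, FileAccess.Write, FileShare.None,
    bufferSize: 64 * 1024, useAsync: true))
{
    await section.Body.CopyToAsync(targetStream);
}

// ... process the file at tempPath, then delete it.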

ASP.NET Core web application - How to upload large files

I've implemented a similar large file controller, but using MongoDB GridFS.

In any case, streaming is the way to go for large files, because it is fast and lightweight.
And yes, the best option is to save the files to your server's storage before you send them on.
One suggestion: add some validation to allow only specific extensions and to restrict execution permissions.

Back to your questions:

The entire file is read into an IFormFile, which is a C# representation of the file used to process or save the file.

The resources (disk, memory) used by file uploads depend on the number and size of concurrent file uploads. If an app attempts to buffer too many uploads, the site crashes when it runs out of memory or disk space. If the size or frequency of file uploads is exhausting app resources, use streaming.

source 1

The CopyToAsync method enables you to perform resource-intensive I/O operations without blocking the main thread.

source 2

Here are two examples.

Example 1:

using System.IO;
using Microsoft.AspNetCore.Http;
//...

[HttpPost]
[Authorize]
[DisableRequestSizeLimit]
[RequestFormLimits(ValueLengthLimit = int.MaxValue, MultipartBodyLengthLimit = int.MaxValue)]
[Route("upload")]
public async Task<ActionResult> UploadFileAsync(IFormFile file)
{
    if (file == null)
        return Ok(new { success = false, message = "You have to attach a file" });

    var fileName = file.FileName;
    // var extension = Path.GetExtension(fileName);

    // Add validations here... (consider sanitizing fileName before using it in a path)

    var targetDir = Path.Combine(System.AppContext.BaseDirectory, "myCustomDir");

    // Create the directory if it does not exist
    Directory.CreateDirectory(targetDir);

    var localPath = Path.Combine(targetDir, fileName);

    using (var stream = new FileStream(localPath, FileMode.Create))
    {
        await file.CopyToAsync(stream);
    }

    // db.SomeContext.Add(someData);
    // await db.SaveChangesAsync();

    return Ok(new { success = true, message = "All set", fileName });
}

Example 2 with GridFS:

[HttpPost]
[Authorize]
[DisableRequestSizeLimit]
[RequestFormLimits(ValueLengthLimit = int.MaxValue, MultipartBodyLengthLimit = int.MaxValue)]
[Route("upload")]
public async Task<ActionResult> UploadFileAsync(IFormFile file)
{
    if (file == null)
        return Ok(new { success = false, message = "You have to attach a file" });

    var options = new GridFSUploadOptions
    {
        Metadata = new BsonDocument("contentType", file.ContentType)
    };

    // No need to wrap the stream in a StreamReader; use it directly
    using (var stream = file.OpenReadStream())
    {
        await mongo.GridFs.UploadFromStreamAsync(file.FileName, stream, options);
    }

    return Ok(new { success = true, message = "All set" });
}

Implement Downloading Large Files (2 GB) in ASP.NET Core

Thanks to everyone who tried to help. I found the solution myself after digging through the documentation, rechecking answers, and combining ideas.
You have two solutions:

1. Make a class that implements Stream, a HugeMemoryStream, which you can find here:

class HugeMemoryStream : System.IO.Stream
{
    #region Fields

    private const int PAGE_SIZE = 1024000000; // ~1 GB per page
    private const int ALLOC_STEP = 1024;

    private byte[][] _streamBuffers;

    private int _pageCount = 0;
    private long _allocatedBytes = 0;

    private long _position = 0;
    private long _length = 0;

    #endregion Fields

    #region Internals

    private int GetPageCount(long length)
    {
        int pageCount = (int)(length / PAGE_SIZE) + 1;

        if ((length % PAGE_SIZE) == 0)
            pageCount--;

        return pageCount;
    }

    private void ExtendPages()
    {
        if (_streamBuffers == null)
        {
            _streamBuffers = new byte[ALLOC_STEP][];
        }
        else
        {
            byte[][] streamBuffers = new byte[_streamBuffers.Length + ALLOC_STEP][];

            Array.Copy(_streamBuffers, streamBuffers, _streamBuffers.Length);

            _streamBuffers = streamBuffers;
        }

        _pageCount = _streamBuffers.Length;
    }

    private void AllocSpaceIfNeeded(long value)
    {
        if (value < 0)
            throw new InvalidOperationException("AllocSpaceIfNeeded < 0");

        if (value == 0)
            return;

        int currentPageCount = GetPageCount(_allocatedBytes);
        int neededPageCount = GetPageCount(value);

        while (currentPageCount < neededPageCount)
        {
            if (currentPageCount == _pageCount)
                ExtendPages();

            _streamBuffers[currentPageCount++] = new byte[PAGE_SIZE];
        }

        _allocatedBytes = (long)currentPageCount * PAGE_SIZE;

        value = Math.Max(value, _length);

        if (_position > (_length = value))
            _position = _length;
    }

    #endregion Internals

    #region Stream

    public override bool CanRead => true;
    public override bool CanSeek => true;
    public override bool CanWrite => true;

    public override long Length => _length;

    public override long Position
    {
        get { return _position; }
        set
        {
            if (value > _length)
                throw new InvalidOperationException("Position > Length");
            else if (value < 0)
                throw new InvalidOperationException("Position < 0");
            else
                _position = value;
        }
    }

    public override void Flush() { }

    public override int Read(byte[] buffer, int offset, int count)
    {
        int currentPage = (int)(_position / PAGE_SIZE);
        int currentOffset = (int)(_position % PAGE_SIZE);
        int currentLength = PAGE_SIZE - currentOffset;

        long startPosition = _position;

        if (startPosition + count > _length)
            count = (int)(_length - startPosition);

        while (count != 0 && _position < _length)
        {
            if (currentLength > count)
                currentLength = count;

            Array.Copy(_streamBuffers[currentPage++], currentOffset, buffer, offset, currentLength);

            offset += currentLength;
            _position += currentLength;
            count -= currentLength;

            currentOffset = 0;
            currentLength = PAGE_SIZE;
        }

        return (int)(_position - startPosition);
    }

    public override long Seek(long offset, SeekOrigin origin)
    {
        switch (origin)
        {
            case SeekOrigin.Begin:
                break;

            case SeekOrigin.Current:
                offset += _position;
                break;

            case SeekOrigin.End:
                // Per the Stream contract the new position is Length + offset
                // (offset is usually negative here).
                offset += _length;
                break;

            default:
                throw new ArgumentOutOfRangeException(nameof(origin));
        }

        return Position = offset;
    }

    public override void SetLength(long value)
    {
        if (value < 0)
            throw new InvalidOperationException("SetLength < 0");

        if (value == 0)
        {
            _streamBuffers = null;
            _allocatedBytes = _position = _length = 0;
            _pageCount = 0;
            return;
        }

        int currentPageCount = GetPageCount(_allocatedBytes);
        int neededPageCount = GetPageCount(value);

        // Remove unused buffers if decreasing the stream length
        while (currentPageCount > neededPageCount)
            _streamBuffers[--currentPageCount] = null;

        AllocSpaceIfNeeded(value);

        if (_position > (_length = value))
            _position = _length;
    }

    public override void Write(byte[] buffer, int offset, int count)
    {
        int currentPage = (int)(_position / PAGE_SIZE);
        int currentOffset = (int)(_position % PAGE_SIZE);
        int currentLength = PAGE_SIZE - currentOffset;

        AllocSpaceIfNeeded(_position + count);

        while (count != 0)
        {
            if (currentLength > count)
                currentLength = count;

            Array.Copy(buffer, offset, _streamBuffers[currentPage++], currentOffset, currentLength);

            offset += currentLength;
            _position += currentLength;
            count -= currentLength;

            currentOffset = 0;
            currentLength = PAGE_SIZE;
        }
    }

    #endregion Stream
}
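
For illustration, a hypothetical controller action using this class might look like the sketch below. The action name and file path are placeholders; note that this still buffers the whole file in process memory, which the first answer above advises against.

[HttpGet("download")]
public IActionResult Download()
{
    var hugeStream = new HugeMemoryStream();

    // Copy the source file into the paged in-memory stream.
    using (var source = System.IO.File.OpenRead(@"C:\data\huge.bin"))
        source.CopyTo(hugeStream);

    hugeStream.Position = 0;
    return File(hugeStream, "application/octet-stream", "huge.bin");
}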

2. Or just push the file to the client by simply returning it straight from disk through the controller and to the view (this might be slow; for faster transfers, use faster storage units or serve from RAM), using this specific return type:

return new PhysicalFileResult("FULL_PATH_TO_FILE", "application/octet-stream")
{
    // note: the first argument is the full path to the file itself, not just its directory
    FileDownloadName = "your file name + extension, for example: test.txt or test.zip"
};
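
If you also want clients to be able to resume interrupted downloads, FileResult-based results such as PhysicalFileResult expose an EnableRangeProcessing property that turns on HTTP range request support. A small variation of the snippet above (the file name is illustrative):

return new PhysicalFileResult("FULL_PATH_TO_FILE", "application/octet-stream")
{
    FileDownloadName = "test.zip",
    EnableRangeProcessing = true // allow HTTP range requests (resumable downloads)
};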

It was a pain, but worth it, since no one really answers this question online :)

ASP.NET Core 3.1 streaming large file upload - can't prevent model binding

Firstly, thank you to @Brando Zhang for getting me back on track.

The problem, the 400 (Bad Request), was due to DocumentsController attempting to verify an anti-forgery token that didn't exist - which was down to me unsuccessfully porting the Razor example into MVC.

In the example code, in Startup.cs, the following section is only applicable to Razor Pages, not MVC:

.AddRazorPagesOptions(options =>
{
    options.Conventions.AddPageApplicationModelConvention(
        "/Documents/Create",
        model =>
        {
            model.Filters.Add(new GenerateAntiForgeryTokenCookieAttribute());
            model.Filters.Add(new DisableFormValueModelBindingAttribute());
        });
});

Therefore, that could be removed. However, now no anti-forgery token is being generated, so when a POST is made to Create() in DocumentsController, no token is present and an HTTP 400 is returned.

The solution is a simple one (although it took me long enough to find it!): add [GenerateAntiForgeryTokenCookie] to the GET method that presents the form in DocumentsController:

[HttpGet]
[GenerateAntiForgeryTokenCookie]
public IActionResult Index()
{
    return View("Create");
}

[HttpPost]
[DisableFormValueModelBinding]
[ValidateAntiForgeryToken]
public async Task<IActionResult> Create()
{
    if (!MultipartRequestHelper.IsMultipartContentType(Request.ContentType))
    ...
}

Hope it helps!

Uploading a file larger than 100 MB with .NET Core 3.1 will result in 400 (Bad Request)

There is a limit on the size of files that the multipart form body allows for upload.

In UploadController.cs you need to change that limit as well in order to make it work:

Something like:

[HttpPost]
[RequestFormLimits(MultipartBodyLengthLimit = 1073741824)] // 1 GB
public async Task<IActionResult> Upload(IFormFile file) { /* illustrative action signature */ }

