C# Download Big File from Server With Less Memory Consumption

A better option would be to use FileResult instead of ActionResult:

Using this method means you don't have to load the file's bytes into memory before serving it.

public FileResult Download()
{
    var filePath = "file path on server";
    return new FilePathResult(Server.MapPath(filePath), "application/zip");
}

Edit: For larger files FilePathResult will also fail.

Your best bet is probably Response.TransmitFile() then. I've used this on larger files (multiple GBs) with no issues.

public ActionResult Download()
{
    var filePath = @"file path on server";

    Response.Clear();
    Response.ContentType = "application/octet-stream";
    Response.AppendHeader("Content-Disposition", "attachment; filename=" + Path.GetFileName(filePath));

    // Writes the file directly to the response output stream, without buffering it in memory.
    Response.TransmitFile(filePath);

    Response.End();

    return Index();
}

From MSDN:

Writes the specified file directly to an HTTP response output stream,
without buffering it in memory.

File download with low memory even with big files

You need to use the HttpResponse.TransmitFile method. It will not buffer the file in memory, but you still get all the control you need.
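A minimal sketch of that approach (classic ASP.NET with System.Web; the path and file name below are placeholder assumptions, not from the original answer):

```csharp
// Sketch only: TransmitFile writes the file straight to the response output
// stream without buffering it in memory. Path and name are hypothetical.
public void DownloadLargeFile()
{
    string filePath = Server.MapPath("~/App_Data/large.zip"); // assumed location

    Response.Clear();
    Response.ContentType = "application/octet-stream";
    Response.AddHeader("Content-Disposition", "attachment; filename=large.zip");

    Response.TransmitFile(filePath);
    Response.End();
}
```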

Download large files using a memorystream

MemoryStream is "safe" to use for large files. However, you will be loading the entire file into memory, and it will remain there until the Garbage Collector decides it is a good time to reclaim that memory.

8GB of RAM is plenty for a "medium" load production server. This is, of course, subjective, but if a single low-to-medium-traffic WebApp is using more than 8GB of RAM, then some design decisions should be revisited.

There are two options to avoid loading the entire remote file into memory:

  1. Write it to a local file on disk, and serve that file back to the WebApp client
  2. Do a chunked write directly to the Response stream
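Option 1 could be sketched like this (assuming an HttpWebResponse named `response`, as in the option 2 snippet; the temp-file handling is a simplification):

```csharp
// Sketch of option 1: copy the remote stream to a temporary file on disk,
// then serve that local file without buffering it in memory.
string tempPath = Path.GetTempFileName();

using (Stream responseStream = response.GetResponseStream())
using (FileStream fileStream = File.Create(tempPath))
{
    // CopyTo streams in small chunks; the full file never sits in memory.
    responseStream.CopyTo(fileStream);
}

Response.ContentType = "application/octet-stream";
Response.TransmitFile(tempPath); // serve the local copy
```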

Option 2 could be something like:

//...
using (Stream responseStream = response.GetResponseStream())
{
    Response.BufferOutput = false; // prevent buffering the whole response
    byte[] buffer = new byte[1024];
    int bytesRead = 0;
    while ((bytesRead = responseStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        Response.OutputStream.Write(buffer, 0, bytesRead);
    }
}
//...

Implement Downloading Large files > 2gb in ASP.NET Core

Thanks to everyone who tried to help me; I found the solution myself after digging through the documentation, re-checking answers, and combining ideas.
You have two solutions:

1. Make a class that extends Stream, a HugeMemoryStream, which you can find here:

class HugeMemoryStream : System.IO.Stream
{
    #region Fields

    private const int PAGE_SIZE = 1024000000;
    private const int ALLOC_STEP = 1024;

    private byte[][] _streamBuffers;

    private int _pageCount = 0;
    private long _allocatedBytes = 0;

    private long _position = 0;
    private long _length = 0;

    #endregion Fields

    #region Internals

    private int GetPageCount(long length)
    {
        int pageCount = (int)(length / PAGE_SIZE) + 1;

        if ((length % PAGE_SIZE) == 0)
            pageCount--;

        return pageCount;
    }

    private void ExtendPages()
    {
        if (_streamBuffers == null)
        {
            _streamBuffers = new byte[ALLOC_STEP][];
        }
        else
        {
            byte[][] streamBuffers = new byte[_streamBuffers.Length + ALLOC_STEP][];

            Array.Copy(_streamBuffers, streamBuffers, _streamBuffers.Length);

            _streamBuffers = streamBuffers;
        }

        _pageCount = _streamBuffers.Length;
    }

    private void AllocSpaceIfNeeded(long value)
    {
        if (value < 0)
            throw new InvalidOperationException("AllocSpaceIfNeeded < 0");

        if (value == 0)
            return;

        int currentPageCount = GetPageCount(_allocatedBytes);
        int neededPageCount = GetPageCount(value);

        while (currentPageCount < neededPageCount)
        {
            if (currentPageCount == _pageCount)
                ExtendPages();

            _streamBuffers[currentPageCount++] = new byte[PAGE_SIZE];
        }

        _allocatedBytes = (long)currentPageCount * PAGE_SIZE;

        value = Math.Max(value, _length);

        if (_position > (_length = value))
            _position = _length;
    }

    #endregion Internals

    #region Stream

    public override bool CanRead => true;

    public override bool CanSeek => true;

    public override bool CanWrite => true;

    public override long Length => _length;

    public override long Position
    {
        get { return _position; }
        set
        {
            if (value > _length)
                throw new InvalidOperationException("Position > Length");
            else if (value < 0)
                throw new InvalidOperationException("Position < 0");
            else
                _position = value;
        }
    }

    public override void Flush() { }

    public override int Read(byte[] buffer, int offset, int count)
    {
        int currentPage = (int)(_position / PAGE_SIZE);
        int currentOffset = (int)(_position % PAGE_SIZE);
        int currentLength = PAGE_SIZE - currentOffset;

        long startPosition = _position;

        if (startPosition + count > _length)
            count = (int)(_length - startPosition);

        while (count != 0 && _position < _length)
        {
            if (currentLength > count)
                currentLength = count;

            Array.Copy(_streamBuffers[currentPage++], currentOffset, buffer, offset, currentLength);

            offset += currentLength;
            _position += currentLength;
            count -= currentLength;

            currentOffset = 0;
            currentLength = PAGE_SIZE;
        }

        return (int)(_position - startPosition);
    }

    public override long Seek(long offset, SeekOrigin origin)
    {
        switch (origin)
        {
            case SeekOrigin.Begin:
                break;

            case SeekOrigin.Current:
                offset += _position;
                break;

            case SeekOrigin.End:
                // Per the Stream contract, a negative offset seeks back from the end.
                offset += _length;
                break;

            default:
                throw new ArgumentOutOfRangeException("origin");
        }

        return Position = offset;
    }

    public override void SetLength(long value)
    {
        if (value < 0)
            throw new InvalidOperationException("SetLength < 0");

        if (value == 0)
        {
            _streamBuffers = null;
            _allocatedBytes = _position = _length = 0;
            _pageCount = 0;
            return;
        }

        int currentPageCount = GetPageCount(_allocatedBytes);
        int neededPageCount = GetPageCount(value);

        // Remove unused buffers if decreasing the stream length.
        while (currentPageCount > neededPageCount)
            _streamBuffers[--currentPageCount] = null;

        AllocSpaceIfNeeded(value);

        if (_position > (_length = value))
            _position = _length;
    }

    public override void Write(byte[] buffer, int offset, int count)
    {
        int currentPage = (int)(_position / PAGE_SIZE);
        int currentOffset = (int)(_position % PAGE_SIZE);
        int currentLength = PAGE_SIZE - currentOffset;

        long startPosition = _position;

        AllocSpaceIfNeeded(_position + count);

        while (count != 0)
        {
            if (currentLength > count)
                currentLength = count;

            Array.Copy(buffer, offset, _streamBuffers[currentPage++], currentOffset, currentLength);

            offset += currentLength;
            _position += currentLength;
            count -= currentLength;

            currentOffset = 0;
            currentLength = PAGE_SIZE;
        }
    }

    #endregion Stream
}

2. Or just push the file to the client by returning it straight from disk through the controller to the view (this can be slow; for faster transfers use faster storage), using this specific return type:

return new PhysicalFileResult("full physical path to the file",
    "application/octet-stream")
{ FileDownloadName = "your file name with extension, e.g. test.txt or test.zip" };
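For context, a complete controller action might look like this (a sketch only; the injected `_env`, folder, and file name are assumptions, not from the original answer):

```csharp
// Hypothetical ASP.NET Core action; _env is an injected IWebHostEnvironment.
public IActionResult Download()
{
    var path = Path.Combine(_env.ContentRootPath, "files", "test.zip"); // assumed location

    // PhysicalFileResult streams the file from disk rather than loading it into memory.
    return new PhysicalFileResult(path, "application/octet-stream")
    {
        FileDownloadName = "test.zip"
    };
}
```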

It was a pain, but worth it, since no one really answers this question online :)

OutOfMemoryException when sending a big file (500MB) using FileStream in ASP.NET

I created a download page a few months ago that allows users to download files up to 4 GB (maybe more). Here is my working snippet:

private void TransmitFile(string fullPath, string outFileName)
{
    System.IO.Stream iStream = null;

    // Buffer to read 10K bytes in chunks:
    byte[] buffer = new Byte[10000];

    // Length of the file:
    int length;

    // Total bytes to read:
    long dataToRead;

    // Identify the file to download including its path.
    string filepath = fullPath;

    // Identify the file name.
    string filename = System.IO.Path.GetFileName(filepath);

    try
    {
        // Open the file.
        iStream = new System.IO.FileStream(filepath, System.IO.FileMode.Open,
            System.IO.FileAccess.Read, System.IO.FileShare.Read);

        // Total bytes to read:
        dataToRead = iStream.Length;

        Response.Clear();
        Response.ContentType = "application/octet-stream";
        Response.AddHeader("Content-Disposition", "attachment; filename=" + outFileName);
        Response.AddHeader("Content-Length", iStream.Length.ToString());

        // Read the bytes.
        while (dataToRead > 0)
        {
            // Verify that the client is connected.
            if (Response.IsClientConnected)
            {
                // Read the data into the buffer.
                length = iStream.Read(buffer, 0, 10000);

                // Write the data to the current output stream.
                Response.OutputStream.Write(buffer, 0, length);

                // Flush the data to the output.
                Response.Flush();

                dataToRead = dataToRead - length;
            }
            else
            {
                // Prevent an infinite loop if the user disconnects.
                dataToRead = -1;
            }
        }
    }
    catch (Exception ex)
    {
        throw new ApplicationException(ex.Message);
    }
    finally
    {
        if (iStream != null)
        {
            // Close the file.
            iStream.Close();
        }
        Response.Close();
    }
}

Securing Large Downloads Using C# and IIS 7

You know what? The KB article is poo. Here is my official recommendation:

public void StreamFile(string filePath)
{
    string fileName = Path.GetFileName(filePath);

    using (var fStream = new FileStream(filePath, FileMode.Open, FileAccess.Read, FileShare.Read))
    {
        var contentLength = fStream.Length;

        if (Request.UserAgent.Contains("MSIE"))
        {
            Response.AddHeader("Content-Transfer-Encoding", "binary");
        }

        Response.ContentType = "application/octet-stream";
        Response.AddHeader("Content-Length", contentLength.ToString());

        // Even though "Content-Disposition" should have an upper-case "D", as per http://www.ietf.org/rfc/rfc2183.txt
        // IE fails to recognize the header if the "D" is upper-cased.
        Response.AddHeader("Content-disposition", "attachment; filename=" + fileName);

        var buffer = new byte[8192];

        while (Response.IsClientConnected)
        {
            var count = fStream.Read(buffer, 0, buffer.Length);
            if (count == 0)
            {
                break;
            }

            Response.OutputStream.Write(buffer, 0, count);
            Response.Flush();
        }
    }

    Response.Close();
}

Downloading large files, saved in Database

You can use DATALENGTH to get the size of the VARBINARY column and stream it, for instance with a SqlDataReader and its Read and GetBytes methods.

Have a look at this answer to see an implementation: Best way to stream files in ASP.NET
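A rough sketch of that approach (the table and column names `Files`, `Content`, `Id`, plus `connectionString` and `fileId`, are assumptions for illustration; CommandBehavior.SequentialAccess lets GetBytes stream the blob instead of materializing it):

```csharp
// Stream a VARBINARY(MAX) column to the response in chunks.
// Table/column names and variables here are hypothetical placeholders.
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("SELECT Content FROM Files WHERE Id = @id", connection))
{
    command.Parameters.AddWithValue("@id", fileId);
    connection.Open();

    // SequentialAccess avoids loading the whole column value into memory.
    using (var reader = command.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        if (reader.Read())
        {
            byte[] buffer = new byte[8192];
            long offset = 0;
            long read;

            // GetBytes copies up to buffer.Length bytes; 0 means end of data.
            while ((read = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
            {
                Response.OutputStream.Write(buffer, 0, (int)read);
                offset += read;
            }
        }
    }
}
```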


