Read Big Txt File, Out of Memory Exception

Read Big TXT File, Out of Memory Exception

Just use File.ReadLines, which returns an IEnumerable<string> and doesn't load all the lines into memory at once.

foreach (var line in File.ReadLines(_filePath))
{
    // Don't add "line" to a list or collection.
    // Just process it here.
}

C# Error: OutOfMemoryException - Reading a large text file and replacing from dictionary

If you have very large files you should try MemoryMappedFile, which is designed for this purpose (files > 1 GB) and lets you read "windows" of a file into memory. But it's not easy to use.
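
Below is a minimal sketch of reading one such window with MemoryMappedFile; the file path, offset and 64 KB window size are placeholder assumptions, and the raw bytes still have to be decoded yourself:

using System;
using System.IO;
using System.IO.MemoryMappedFiles;
using System.Text;

string path = @"C:\logs\huge.log";  // placeholder path
long offset = 0;                    // where the window starts in the file
int windowSize = 64 * 1024;         // 64 KB window; assumed smaller than the file

using (var mmf = MemoryMappedFile.CreateFromFile(path, FileMode.Open))
using (var accessor = mmf.CreateViewAccessor(offset, windowSize))
{
    var buffer = new byte[windowSize];
    accessor.ReadArray(0, buffer, 0, buffer.Length);

    // Decode the window; this assumes the file is ASCII/UTF-8 text.
    string text = Encoding.UTF8.GetString(buffer);
    Console.WriteLine(text);
}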

A simpler optimization would be to read and replace line by line:

int lineNumber = 0;
var _replacementInfo = new Dictionary<int, List<string>>();

using (StreamReader sr = new StreamReader(logF))
{
    using (StreamWriter sw = new StreamWriter(logF_Temp))
    {
        while (!sr.EndOfStream)
        {
            string line = sr.ReadLine();
            lineNumber++;
            foreach (var kv in replacements)
            {
                if (line.Contains(kv.Key))
                {
                    List<string> lineReplaceList;
                    if (!_replacementInfo.TryGetValue(lineNumber, out lineReplaceList))
                        lineReplaceList = new List<string>();
                    lineReplaceList.Add(kv.Key);
                    _replacementInfo[lineNumber] = lineReplaceList;

                    line = line.Replace(kv.Key, kv.Value);
                }
            }
            sw.WriteLine(line);
        }
    }
}

At the end you can use File.Copy(logF_Temp, logF, true); if you want to overwrite the old file.
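
If you do, a short sketch of that final step (the File.Delete cleanup is an extra assumption, not something the answer above requires):

// Copy the rewritten temp file over the original log...
File.Copy(logF_Temp, logF, true);
// ...and remove the temp file if it is no longer needed.
File.Delete(logF_Temp);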

System.OutOfMemoryException while reading a large text file using C#

If you alter the default buffer size of your BufferedStream, it should load larger files for you with greater efficiency, e.g.:

using (var bs = new BufferedStream(fileStream, 1024))
{
    // Code here.
}

You may be able to get away with simply using a FileStream and specifying a buffer size there, rather than a BufferedStream. See the MSDN blog post on the subject for further details.
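
A rough sketch of that variant (the path and the 64 KB buffer size are placeholder assumptions, not values from the original answer):

using System;
using System.IO;

// Open the file with a 64 KB read buffer and stream it line by line.
using (var fs = new FileStream(@"C:\logs\huge.log", FileMode.Open, FileAccess.Read, FileShare.Read, 64 * 1024))
using (var reader = new StreamReader(fs))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        // Process each line here instead of collecting them all.
    }
}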

Out-of-memory error while reading very large text file in vb.net

I don't know if this will fix your problem, but don't use Peek; change your loop to the following (this is C#, but you should be able to translate it to VB):

while (_read.ReadLine() != null)
{
    count += 1;
}

If you need to use the line of text inside the loop instead of just counting lines, modify the code to:

string lineoftext;
while ((lineoftext = _read.ReadLine()) != null)
{
    count += 1;
    // Do something with lineoftext
}

Kind of off topic and kind of cheating: if each line really is 1563 characters long (including the line ending) and the file is pure ASCII (so every character takes up one byte), you could just do the following (once again C#, but you should be able to translate):

long bytesPerLine = 1563;
string inputFile = @"C:\Split\BIGFILE.txt"; // The @ symbol means we don't have to escape the '\'
long length;

using (FileStream stream = File.Open(inputFile, FileMode.Open)) // The C# equivalent of a Try/Finally that closes the stream when done.
{
    length = stream.Length;
}

Console.WriteLine("Total Lines in {0}: {1}", inputFile, length / bytesPerLine);

OutOfMemory exception thrown while writing large text file

Use a StreamWriter to write each line you generate directly into the text file. This avoids storing the whole file in memory first.

using (System.IO.StreamWriter sw = new System.IO.StreamWriter("C:\\Somewhere\\whatever.txt"))
{
    // Generate all the single lines and write them directly into the file
    for (int i = 0; i <= 10000; i++)
    {
        sw.WriteLine("This is such a nice line of text. *snort*");
    }
}

Java OutOfMemoryError in reading a large text file

Try to use java.nio.MappedByteBuffer.

http://docs.oracle.com/javase/7/docs/api/java/nio/MappedByteBuffer.html

You can map a file's content into memory without copying it manually. Modern operating systems offer memory mapping, and Java has an API to make use of that feature.

If my understanding is correct, memory mapping does not load a file's entire content into memory (parts are loaded and unloaded as necessary), so I guess a 10 GB file won't eat up your memory.


