Parallel.ForEach Can Cause an "Out of Memory" Exception If Working with an Enumerable of Large Objects

Parallel.ForEach can cause an Out of Memory exception when working with an enumerable whose items are large objects.

The default options for Parallel.ForEach only work well when the task is CPU-bound and scales linearly. When the task is CPU-bound, everything works perfectly. If you have a quad-core and no other processes running, then Parallel.ForEach uses all four processors. If you have a quad-core and some other process on your computer is using one full CPU, then Parallel.ForEach uses roughly three processors.

But if the task is not CPU-bound, then Parallel.ForEach keeps starting tasks, trying hard to keep all CPUs busy. Yet no matter how many tasks are running in parallel, there is always more unused CPU horsepower and so it keeps creating tasks.

How can you tell if your task is CPU-bound? Hopefully just by inspecting it. If you are factoring large numbers, it is obvious. But other cases are not so obvious. The empirical way to tell if your task is CPU-bound is to limit the maximum degree of parallelism with ParallelOptions.MaxDegreeOfParallelism and observe how your program behaves. If your task is CPU-bound then you should see a pattern like this on a quad-core system:

  • ParallelOptions.MaxDegreeOfParallelism = 1: uses one full CPU, or 25% CPU utilization
  • ParallelOptions.MaxDegreeOfParallelism = 2: uses two CPUs, or 50% CPU utilization
  • ParallelOptions.MaxDegreeOfParallelism = 4: uses all CPUs, or 100% CPU utilization

If it behaves like this then you can use the default Parallel.ForEach options and get good results. Linear CPU utilization means good task scheduling.
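One way to run that experiment is a small test harness that executes the same loop under several MaxDegreeOfParallelism values while you watch CPU utilization in Task Manager. The sketch below assumes a hypothetical DoWork placeholder standing in for your real task body:

    using System;
    using System.Diagnostics;
    using System.Threading.Tasks;

    class DopProbe
    {
        static void Main()
        {
            foreach (int dop in new[] { 1, 2, 4 })
            {
                var sw = Stopwatch.StartNew();
                Parallel.For(0, 100000,
                    new ParallelOptions { MaxDegreeOfParallelism = dop },
                    i => DoWork(i));
                Console.WriteLine("DOP {0}: {1} ms", dop, sw.ElapsedMilliseconds);
            }
        }

        // Hypothetical CPU-bound placeholder; replace with the task body you actually care about.
        static void DoWork(int i)
        {
            double x = i;
            for (int k = 0; k < 10000; k++)
                x = Math.Sqrt(x + k);
        }
    }

If the elapsed time roughly halves each time the DOP doubles (and utilization climbs accordingly), the task is CPU-bound; if it barely changes, something else is the bottleneck.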

But if I run your sample application on my Intel i7, I get about 20% CPU utilization no matter what maximum degree of parallelism I set. Why is this? So much memory is being allocated that the garbage collector is blocking threads. The application is resource-bound and the resource is memory.

Likewise, an I/O-bound task that performs long-running queries against a database server will never be able to effectively utilize all the CPU resources available on the local computer, and in cases like that the task scheduler is unable to "know when to stop" starting new tasks.

If your task is not CPU-bound, or the CPU utilization doesn't scale linearly with the maximum degree of parallelism, then you should advise Parallel.ForEach not to start too many tasks at once. The simplest way is to specify a number that permits some parallelism for overlapping I/O-bound tasks, but not so much that you overwhelm the local computer's resources or overtax any remote servers. Some trial and error is involved to get the best results:

    static void Main(string[] args)
    {
        Parallel.ForEach(CreateData(),
            new ParallelOptions { MaxDegreeOfParallelism = 4 },
            (data) =>
            {
                data[0] = 1;
            });
    }

Out of Memory Exception in Parallel.ForEach

Try replacing:

    Parallel.ForEach(flist, (item) =>
    {
        string f1 = item.Split('|')[0];
        string f2 = item.Split('|')[1];
        a = File.ReadAllText(f1);
        b = File.ReadAllText(f2);
        Consume(a, b);
    });

With:

    Parallel.ForEach(flist,
        new ParallelOptions { MaxDegreeOfParallelism = 4 },
        (item) =>
        {
            string f1 = item.Split('|')[0];
            string f2 = item.Split('|')[1];
            a = File.ReadAllText(f1);
            b = File.ReadAllText(f2);
            Consume(a, b);
        });

This will prevent too many threads from being created. You can then experiment with higher numbers and see whether performance improves.

IEnumerable&lt;T&gt;, Parallel.ForEach and Memory Management

I do not fully understand how Parallel.ForEach is pulling items, but I think by default it pulls more than one to save locking overhead. This means that multiple items might be queued internally inside of Parallel.ForEach. This might cause OOM quickly because your items are very big individually.

You could try giving it a Partitioner that returns single items.
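For example, Partitioner.Create combined with EnumerablePartitionerOptions.NoBuffering (available since .NET 4.5) hands items to the worker threads one at a time instead of buffering chunks per thread. A minimal sketch, assuming the same flist and Consume from the snippets above:

    using System.Collections.Concurrent;
    using System.IO;
    using System.Threading.Tasks;

    // NoBuffering: each worker thread takes exactly one item at a time,
    // so large items are not queued up inside Parallel.ForEach.
    var singleItemPartitioner = Partitioner.Create(flist, EnumerablePartitionerOptions.NoBuffering);

    Parallel.ForEach(singleItemPartitioner,
        new ParallelOptions { MaxDegreeOfParallelism = 4 },
        (item) =>
        {
            string f1 = item.Split('|')[0];
            string f2 = item.Split('|')[1];
            string a = File.ReadAllText(f1);
            string b = File.ReadAllText(f2);
            Consume(a, b); // Consume and flist come from the original question's code
        });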

If that does not help, we need to dig deeper. Debugging memory issues with Parallel and PLINQ is nasty. There was a bug in one of those, for example, that caused old items not to be released quickly.

As a workaround, you could clear the list after processing. That will at least allow all items to be reclaimed deterministically after processing has been done.

Regarding the code you posted: It is clean, of high quality and you are adhering to high standards of resource management. I would not suspect a gross memory or resource leak on your part. It is still not impossible. You can test this by commenting out the code inside of the Parallel.ForEach and replacing it with a Thread.Sleep(1000 * 60). If the leak persists, you are not at fault.
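Something along these lines (a sketch of the suggested test, reusing the flist from the earlier snippets) would do: if memory still climbs with the body replaced by a sleep, the leak is not in your processing code.

    Parallel.ForEach(flist, (item) =>
    {
        // Original body commented out; do nothing for one minute per item.
        System.Threading.Thread.Sleep(1000 * 60);
    });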

In my experience, it is easier to get an exact degree of parallelism with PLINQ (because the current version uses the exact DOP you specify, never less, never more). Like this:

    GetRows()
        .AsBatches(10000)
        .AsParallel().WithDegreeOfParallelism(8)
        .Select(TransformItems)                  // generate rows to write
        .AsEnumerable()                          // leave PLINQ
        .SelectMany(x => x)                      // flatten batches
        .AsBatches(1000000)                      // create new batches with a different size
        .AsParallel().WithDegreeOfParallelism(2) // PLINQ with a different DOP
        .ForEach(WriteBatchToDB);                // write to DB

This would give you a simple pipeline that pulls from the DB, does CPU-bound work with a specific DOP optimized for the CPU, and writes to the database with much bigger batches and a lower DOP.

This is quite simple and it should max out CPUs and disks independently with their respective DOP. Play with the DOP numbers.
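Note that AsBatches and ForEach are not built-in LINQ or PLINQ operators; the pipeline above assumes small helper extension methods along these lines (a minimal sketch, not the author's exact code). With a ParallelQuery, PLINQ's built-in ForAll could also be used for the final step.

    using System;
    using System.Collections.Generic;

    static class PipelineExtensions
    {
        // Groups a sequence into lists of at most batchSize items.
        public static IEnumerable<List<T>> AsBatches<T>(this IEnumerable<T> source, int batchSize)
        {
            var batch = new List<T>(batchSize);
            foreach (var item in source)
            {
                batch.Add(item);
                if (batch.Count == batchSize)
                {
                    yield return batch;
                    batch = new List<T>(batchSize);
                }
            }
            if (batch.Count > 0)
                yield return batch;
        }

        // Consumes a sequence, invoking an action for each element.
        public static void ForEach<T>(this IEnumerable<T> source, Action<T> action)
        {
            foreach (var item in source)
                action(item);
        }
    }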

Preventing OutOfMemoryException with GC.Collect()

Limiting the thread count solved the issue:

    if (Multithread)
    {
        ParallelOptions pOptions = new ParallelOptions();
        pOptions.MaxDegreeOfParallelism = Environment.ProcessorCount;
        Parallel.For(0, FileNames.Length, pOptions, i => Solve(FileNames[i]));
    }
    else
    {
        foreach (string s in FileNames)
        {
            Solve(s);
        }
    }

