OutOfMemoryException on declaration of Large Array

Each double is 8 bytes, so you're trying to allocate a single array of just over 5 GB (12000 × 55000 × 8 bytes ≈ 5.28 GB). The CLR has a per-object size limit of around 2 GB, even on a 64-bit CLR. In other words, the problem isn't the total amount of memory available (although obviously you'll have issues if you don't have enough memory), but the per-object size.
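The arithmetic behind that limit, shown in Java for illustration (a double is 8 bytes there too; the dimensions are the ones from the question's code below):

```java
public class ArraySizeCheck {
    public static void main(String[] args) {
        // 12000 x 55000 doubles at 8 bytes each
        long bytes = 12000L * 55000L * 8L;
        System.out.println(bytes); // 5280000000 bytes, roughly 4.9 GiB

        // well past the ~2 GB per-object ceiling
        long perObjectCap = 2L * 1024 * 1024 * 1024;
        System.out.println(bytes > perObjectCap); // true
    }
}
```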

I suggest you split it into smaller arrays, perhaps behind a facade of some description. I don't believe there's any way to work around that limit for a single array.

EDIT: You could go for an array of arrays - aka a jagged array:

double[][] array = new double[12000][];
for (int i = 0; i < array.Length; i++)
{
    array[i] = new double[55000];
}

Would that be acceptable to you?

(You can't use a rectangular array (double[,]) as that would have the same per-object size problem.)
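The "facade" idea can be sketched as a small wrapper that exposes 2-D indexing over the jagged storage. Shown here in Java; the class name `Big2DDoubleArray` and its API are hypothetical, not from any library:

```java
/** A hypothetical facade exposing (row, col) indexing over jagged storage. */
public class Big2DDoubleArray {
    private final double[][] chunks;
    private final int cols;

    public Big2DDoubleArray(int rows, int cols) {
        this.cols = cols;
        this.chunks = new double[rows][];
        for (int i = 0; i < rows; i++) {
            chunks[i] = new double[cols]; // each row is a separate, well-under-2GB object
        }
    }

    public double get(int row, int col) { return chunks[row][col]; }

    public void set(int row, int col, double value) { chunks[row][col] = value; }

    /** Total element count across all rows. */
    public long size() { return (long) chunks.length * cols; }
}
```

Callers index it as if it were one big rectangular array, while no single allocation exceeds the per-object cap.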

Jagged array OutOfMemoryException

If it is a 32-bit application, you are rightly getting an OutOfMemoryException. For these size requirements you need to target x64.

At i = 49000:
Total memory = 49000 × 4600 × 8 = 1,803,200,000 bytes ≈ 1.68 GB.

Now, for 32-bit applications (targeting x86), the total user memory available to an application is 2 GB, unless the application is large-address-aware (see .NET application - Large Address Aware) and the OS also supports it (e.g. Vista). Then there is some CLR overhead, plus application overhead.

At i = 120000, you need:
Total memory = 120000 × 4600 × 8 = 4,416,000,000 bytes ≈ 4.11 GB. (Platform target should be x64.)

Why does creating a new array throw OutOfMemoryException?

Turns out this happens because there is a hardcoded memory limit for any object created inside a managed .NET application:

When you run a 64-bit managed application on a 64-bit Windows
operating system, you can create an object of no more than 2 gigabytes
(GB).


See also

  • Single objects still limited to 2 GB in size in CLR 4.0?

  • Memory limitations in a 64-bit .Net application?

System.OutOfMemoryException is thrown while still having free VM space

there were chunks up to 14 MB free. The operation which was failing was creating an object of less than 1 KB in total

14 MB is definitely close to the danger zone. The way the GC allocates VM has nothing to do with the object size. The GC heap is created from chunks of VM called "segments". The segment size is 2 MB when a program starts out, but the GC dynamically grows the size of newly allocated segments when the program uses a lot of memory. Clearly you use a lot of memory. There isn't anything you can do to affect the VM allocation or to avoid VM address-space fragmentation.

Clearly you are way too close to the VM limit of a 32-bit process. You'll need either to drastically revise your code so you can make do with less, or to put a 64-bit operating system on your list of prerequisites, which can provide 4 GB of VM address space to a 32-bit process. You'll need an extra build step to take advantage of it, described in this answer.

Unexpected OutOfMemoryError when allocating an array larger than the heap

The documentation cited, Understand the OutOfMemoryException,

Exception in thread thread_name: java.lang.OutOfMemoryError: Requested array size exceeds VM limit

Cause: The detail message "Requested array size exceeds VM limit" indicates that the application (or APIs used by that application) attempted to allocate an array that is larger than the heap size. For example, if an application attempts to allocate an array of 512 MB, but the maximum heap size is 256 MB, then OutOfMemoryError will be thrown with the reason "Requested array size exceeds VM limit."

Action: Usually the problem is either a configuration issue (heap size too small) or a bug that results in an application attempting to create a huge array (for example, when the number of elements in the array is computed using an algorithm that computes an incorrect size).

...is incorrect.

What the "Requested array size exceeds VM limit" message really means is that there is a limit imposed by the implementation of the JVM. Exceeding this limit will always cause the failure, no matter how much heap space is available. An OutOfMemoryError with this message occurs if there is an attempt to allocate an array with close to Integer.MAX_VALUE elements. (This is not a fixed limit. See below.)

By contrast, an OutOfMemoryError with the "Java heap space" message means that the request could have been fulfilled if circumstances were different: for example, if less memory were used by other objects, or if the JVM's maximum heap size were increased.

I've repurposed bug JDK-8254804 to fix the documentation in question.

There is a comment that seems relevant in ArraysSupport.java:

/**
* The maximum length of array to allocate (unless necessary).
* Some VMs reserve some header words in an array.
* Attempts to allocate larger arrays may result in
* {@code OutOfMemoryError: Requested array size exceeds VM limit}
*/
public static final int MAX_ARRAY_LENGTH = Integer.MAX_VALUE - 8;

Note that this is in library code, not in the Hotspot JVM code. It's the library code's conservative guess for an array size that doesn't exceed the limit that might be imposed by the JVM. The problem is that the JVM's maximum array size limit might vary depending on different circumstances, such as the garbage collector in use, whether the JVM is running in 32-bit or 64-bit mode, or even the JVM (Hotspot or other) upon which it is running. The library needs to be a bit conservative here because it needs to run on different JVMs in different circumstances, and there's no standardized interface to the JVM that returns the "maximum allowed array size." (Indeed, such a concept is potentially ill-defined, as the maximum array size might differ for different array types, or it might change from one moment to the next.)
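Library code that grows arrays typically clamps its growth to that conservative cap. A minimal sketch of such a clamp, modeled loosely on what growth helpers like `ArraysSupport` do; `newCapacity` is a hypothetical name and the 1.5x growth policy is an assumption, not the JDK's exact algorithm:

```java
public class GrowLimit {
    // Conservative cap mirroring ArraysSupport.MAX_ARRAY_LENGTH
    static final int MAX_ARRAY_LENGTH = Integer.MAX_VALUE - 8;

    /** Returns a capacity of at least minRequired, preferring ~1.5x growth,
     *  and only exceeding the soft cap when the caller truly needs it. */
    static int newCapacity(int oldCapacity, int minRequired) {
        if (minRequired < 0) {
            throw new OutOfMemoryError("required length overflowed int range");
        }
        int grown = oldCapacity + (oldCapacity >> 1); // grow by ~1.5x (may overflow)
        int candidate = Math.max(grown, minRequired);
        if (candidate < 0 || candidate > MAX_ARRAY_LENGTH) {
            // past the soft cap: grant exactly what was asked for, or go all-in
            return minRequired > MAX_ARRAY_LENGTH ? Integer.MAX_VALUE : MAX_ARRAY_LENGTH;
        }
        return candidate;
    }
}
```

The point of the soft cap is exactly the uncertainty described above: staying 8 elements under `Integer.MAX_VALUE` keeps the request below the header-word overhead that some VMs impose, so growth normally never trips the "Requested array size exceeds VM limit" error.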

OutOfMemoryException thrown while serializing list of objects that contain a large byte array

OK; this was some unfortunate timing - basically, it was only checking whether it should flush when the buffer got full, and as a consequence of being in the middle of writing a length-prefixed item, it was never able to flush properly at that point. I've added a tweak so that whenever it reaches a flushable state and there is something worth flushing (currently 1024 bytes), it flushes more aggressively. This has been committed as r597. With that patch, it now works as expected.

In the interim, there is a way of avoiding this glitch without changing version: iterate over the data at the source, serializing each item individually with SerializeWithLengthPrefix, specifying prefix-style base-128 and field-number 1; this is 100% identical in terms of what goes over the wire, but has a separate serialization cycle for each item:

using (var f = File.Create("Data.protobuf"))
{
    foreach (var obj in GenerateData(1000000))
    {
        Serializer.SerializeWithLengthPrefix<DTO>(
            f, obj, PrefixStyle.Base128, Serializer.ListItemTag);
    }
}
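For reference, the base-128 ("varint") prefix style used above encodes an integer 7 bits at a time, low-order group first, with the high bit of each byte marking continuation. A minimal Java sketch of the encoding (not protobuf-net's actual code; `encode` is a hypothetical helper):

```java
import java.io.ByteArrayOutputStream;

public class Varint {
    /** Encodes a non-negative value as a base-128 varint, low 7 bits first. */
    static byte[] encode(long value) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        while ((value & ~0x7FL) != 0) {
            out.write((int) ((value & 0x7F) | 0x80)); // high bit set: more bytes follow
            value >>>= 7;
        }
        out.write((int) value); // final byte, high bit clear
        return out.toByteArray();
    }
}
```

For example, 300 encodes as the two bytes 0xAC 0x02, which is why each length-prefixed item stays cheap on the wire regardless of how many items follow it.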

Thanks for noticing ;p

Large Array - Out of Memory Error

Each device has a maximum per-app RAM cap. If this attribute in the manifest does not alleviate your problem:

android:largeHeap="true"

Then your only other option is to write your code using the NDK. But that's a pretty hefty thing to dive into, so I would try to figure out an alternative first.
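One way to gauge the cap before attempting a huge allocation is to query the runtime's maximum heap (standard Java; on Android, `ActivityManager.getMemoryClass()` reports a comparable figure in megabytes). The 100 MB target below is an arbitrary example, and the headroom estimate is rough:

```java
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long maxHeap = rt.maxMemory(); // hard ceiling for this process's heap

        // rough headroom: free space now, plus room the heap can still grow into
        long headroom = rt.freeMemory() + (maxHeap - rt.totalMemory());

        long wanted = 100L * 1024 * 1024; // e.g. a 100 MB array
        if (wanted > headroom) {
            System.out.println("allocation likely to fail; consider chunking the data");
        } else {
            System.out.println("max heap bytes: " + maxHeap);
        }
    }
}
```

Checking first lets you fall back to chunked or disk-backed storage instead of crashing with an OutOfMemoryError.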



