
What is the max limit of data in a List<string> in C#?

The maximum number of elements that can be stored in the current implementation of List<T> is, theoretically, Int32.MaxValue - just over 2 billion.

In the current Microsoft implementation of the CLR there's a 2GB maximum object size limit. (It's possible that other implementations, for example Mono, don't have this restriction.)

Your particular list contains strings, which are reference types. The size of a reference will be 4 or 8 bytes, depending on whether you're running on a 32-bit or 64-bit system. This means that the practical limit to the number of strings you could store will be roughly 536 million on 32-bit or 268 million on 64-bit.

In practice, you'll most likely run out of allocable memory before you reach those limits, especially if you're running on a 32-bit system.
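
As a rough back-of-the-envelope check (a minimal sketch; exact figures depend on runtime overhead and the object-header cost of the backing array), you can derive those reference-count ceilings from the pointer size at run time:

using System;

class RefLimitEstimate
{
    static void Main()
    {
        // A reference is 4 bytes on 32-bit or 8 bytes on 64-bit.
        int refSize = IntPtr.Size;

        // The CLR's 2GB per-object cap applies to the backing array of references.
        const long maxObjectBytes = 2L * 1024 * 1024 * 1024;

        // Rough ceiling: ~536M references on 32-bit, ~268M on 64-bit.
        Console.WriteLine($"~{maxObjectBytes / refSize:N0} references per array");
    }
}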

List size limitation in C#

A List<int> is backed by an int[]. You will fail as soon as a larger backing array cannot be allocated - and bear in mind that:

  • There's a 2GB per-object limit in the CLR even in 64 bits (EDIT: as of .NET 4.5, this can be avoided for the 64-bit CLR - see <gcAllowVeryLargeObjects>)
  • The list will try to allocate a backing array which is larger than what it immediately requires, in order to accommodate later Add requests without reallocation.
  • During the reallocation, there has to be enough total memory for both the old and the new arrays.

Setting the Capacity to a value which will put the backing array near the theoretical limit may get you a higher cutoff point than the natural growth, but that limit will certainly come.

I would expect a limit of around 2^29 elements (536,870,912) - I'm slightly surprised you haven't managed to get beyond 134,217,728. How much memory do you actually have? What version of .NET are you using, and on what architecture? (It's possible that the per-object limit is 1GB for a 32-bit CLR, I can't remember for sure.)

Note that even if the per-object limit wasn't a problem, as soon as you got above 2^31 elements you'd have problems addressing those elements directly with List<T>, as the indexer takes an int value.

Basically, if you want a collection with more than int.MaxValue elements, you'll need to write your own, probably using multiple backing arrays. You might want to explicitly prohibit removals and arbitrary insertions :)
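
A minimal sketch of that idea (the BigList name, the chunk size, and the append-only design are my own choices, not anything from the framework):

using System;
using System.Collections.Generic;

// Append-only "big list" spread across multiple backing arrays, so no
// single object approaches the 2GB cap and the index can be a long.
public sealed class BigList<T>
{
    private const int ChunkSize = 1 << 20;                // 1M elements per chunk (arbitrary)
    private readonly List<T[]> _chunks = new List<T[]>();
    private long _count;

    public long Count => _count;

    public void Add(T item)
    {
        int offset = (int)(_count % ChunkSize);
        if (offset == 0)
            _chunks.Add(new T[ChunkSize]);                // last chunk is full: start a new one
        _chunks[_chunks.Count - 1][offset] = item;
        _count++;
    }

    // A long indexer sidesteps List<T>'s int-only indexer.
    public T this[long index]
    {
        get => _chunks[(int)(index / ChunkSize)][(int)(index % ChunkSize)];
        set => _chunks[(int)(index / ChunkSize)][(int)(index % ChunkSize)] = value;
    }
}

Removals and arbitrary insertions are deliberately left out, as suggested above - supporting them would mean shuffling elements across chunk boundaries.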

What's the max items in a List<T>?

List<T> will be limited to the max of an array, which is 2GB (even in x64). If that isn't enough, you're using the wrong type of data storage. You can save a lot of overhead by starting it the right size, though - by passing an int to the constructor.
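
For example, a quick sketch of the difference (the element count is just an illustration):

using System.Collections.Generic;

// Grows by doubling: many intermediate backing arrays get allocated and copied.
var grown = new List<int>();

// Allocates the full backing array once, up front; no reallocation
// happens until the capacity is exceeded.
var presized = new List<int>(134_217_728);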

Re your edit - with 134217728 x Int32, that is 512MB. Remember that List<T> uses a doubling algorithm; if you are drip-feeding items via Add (without allocating all the space first) it is going to try to double to 1GB (on top of the 512MB you're already holding, the rest of your app, and of course the CLR runtime and libraries). I'm assuming you're on x86, so you already have a 2GB limit per process, and it is likely that you have fragmented your "large object heap" to death while adding items.

Personally, yes, it sounds about right to start getting an out-of-memory at this point.


Edit: in .NET 4.5, arrays larger than 2GB are allowed if the <gcAllowVeryLargeObjects> switch is enabled. The limit then is 2^31 items. This might be useful for arrays of references (8 bytes each in x64), or an array of large structs.
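
For reference, this is the standard way to enable that switch in the app.config of a 64-bit .NET Framework 4.5+ application:

<configuration>
  <runtime>
    <!-- Allows single objects (arrays) larger than 2GB on 64-bit.
         Per-dimension element-count limits still apply. -->
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>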

What is the Maximum Size that an Array can hold?

System.Int32.MaxValue

Assuming you mean System.Array, i.e. any normally defined array (int[], etc.), this is the maximum number of values the array can hold. The size of each value is limited only by the amount of memory or virtual memory available to hold them.

This limit is enforced because System.Array uses an Int32 as its indexer, hence only valid values for an Int32 can be used. On top of this, only positive values (i.e. >= 0) may be used. This means the absolute upper bound on the size of an array is the absolute upper bound on values for an Int32, which is available as Int32.MaxValue and is 2^31 - 1, or roughly 2 billion.

On a completely different note, if you're worrying about this, it's likely you're using a lot of data, either correctly or incorrectly. In this case, I'd look into using a List<T> instead of an array, so that you only use as much memory as needed. In fact, I'd recommend using a List<T> or another of the generic collection types all the time. This means that only as much memory as you are actually using will be allocated, but you can use it like you would a normal array.

The other collection of note is Dictionary<int, T> which you can use like a normal array too, but will only be populated sparsely. For instance, in the following code, only one element will be created, instead of the 1000 that an array would create:

Dictionary<int, string> foo = new Dictionary<int, string>();
foo[1000] = "Hello world!";      // only this single entry is allocated
Console.WriteLine(foo[1000]);    // prints "Hello world!"

Using a Dictionary also lets you control the type of the indexer, and allows you to use negative values. For the absolute maximal sized sparse array you could use a Dictionary<ulong, T>, which will provide more potential elements than you could possibly think about.
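
A quick illustration of that flexibility (the names here are arbitrary): negative and very large keys are perfectly legal for a Dictionary, where they would be impossible as array indices:

using System;
using System.Collections.Generic;

var sparse = new Dictionary<long, string>();
sparse[-42] = "below zero";              // negative "index" - fine as a key
sparse[long.MaxValue] = "way out there"; // far beyond any array bound
Console.WriteLine(sparse[-42]);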

List vs. Dictionary (Maximum Size, Number of Elements)

Is it specified in the documentation for the class? No? Then it's unspecified.

In terms of current implementations, there's no maximum size in RAM in the classes themselves. If you create a value type that's 2MB in size, push a few thousand into a list, and get an out-of-memory exception, that has nothing to do with List<T>.

Internally, List<T>'s workings would prevent it from ever having more than 2 billion items. It's harder to come to a quick answer with Dictionary<TKey, TValue>, since the way things are positioned within it is more complicated, but really, if I were looking at dealing with a billion items (a 32-bit value each, for example, would mean 4GB), I'd be looking to store them in a database and retrieve them using data-access code.

At the very least, once you're dealing with a single data structure that's 4GB in size, rolling your own custom collection class no longer counts as reinventing the wheel.

Maximum size of string array in C#

Array Class

By default, the maximum size of an Array is 2 gigabytes (GB). In a 64-bit environment, you can avoid the size restriction by setting the enabled attribute of the gcAllowVeryLargeObjects configuration element to true in the run-time environment. However, the array will still be limited to a total of 4 billion elements, and to a maximum index of 0X7FEFFFFF in any given dimension (0X7FFFFFC7 for byte arrays and arrays of single-byte structures).

Very useful comment by Ňuf

But it should be noted that the strings themselves do not count towards the 2GB size limit, because the array contains only references to them. So the maximum number of elements in a string array is approx. 500M in a 32-bit process and 2G in a 64-bit process. Also, this limit only applies to the .NET CLR; other implementations may have different limits (e.g. Mono on 64-bit supports even larger arrays with the –enable-big-arrays option).

.NET object size limit

.NET limits any object to max 2 GB even on 64 bit platforms. You can create your own data type, that uses multiple objects to store more data, thus getting around the 2 GB limit of a single object. For instance a List<float[]> would allow you to store more than 2 GB, but you would have to write the necessary plumbing code to make it behave similar to a single, large array.
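
A rough sketch of that plumbing (the BigFloatArray name and the chunk size are illustrative, not a definitive implementation):

using System;
using System.Collections.Generic;

// Fixed-length "big float array" built on a List<float[]>. Each chunk stays
// well under the 2GB object cap; a long index is split into (chunk, offset).
public sealed class BigFloatArray
{
    private const int ChunkSize = 64 * 1024 * 1024;   // 64M floats = 256MB per chunk
    private readonly List<float[]> _chunks = new List<float[]>();

    public BigFloatArray(long length)
    {
        Length = length;
        for (long remaining = length; remaining > 0; remaining -= ChunkSize)
            _chunks.Add(new float[(int)Math.Min(remaining, ChunkSize)]);
    }

    public long Length { get; }

    public float this[long i]
    {
        get => _chunks[(int)(i / ChunkSize)][(int)(i % ChunkSize)];
        set => _chunks[(int)(i / ChunkSize)][(int)(i % ChunkSize)] = value;
    }
}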

You may also want to check this question.

Is there a limit of elements that could be stored in a List?

The current implementation of List<T> uses Int32 everywhere - to construct its backing array, for its Count property, as an indexer and for all its internal operations - so there's a current theoretical maximum of Int32.MaxValue items (2^31-1 or 2147483647).

But the .NET framework also has a maximum object size limit of 2GB, so you'll only get anywhere near the items limit with lists of single-byte items such as List<byte> or List<bool>.

In practice you'll probably run out of contiguous memory before you hit either of those limits.

What is the maximum length of an array in .NET on 64-bit Windows?

An array could theoretically have at most 2,147,483,647 elements, since it uses an int for indexing. The actual limit is slightly lower than this, depending on the type contained within the array.

However, there is a 2GB maximum single object restriction in the .NET CLR, even in 64bit. This was done by design.

You can easily make an IList<T> implementation that, internally, keeps multiple arrays, and allows you to grow beyond the 2GB single object limit, but there is not one in the framework itself.

Typically, however, this is not a real problem. Most of the time, you'll have arrays pointing to large classes - so the array is just holding references. This would mean your array can effectively point to many, many GBs of memory - but the array itself cannot be >2GB.


Note that, as of .NET 4.5, there is a new option available where 64-bit applications can opt in: gcAllowVeryLargeObjects. With this option set, it is possible to get UInt32.MaxValue (4,294,967,295) elements in a multi-dimensional array, though a single-dimensional array is still limited to 2,146,435,071 elements (2,147,483,591 for single-byte arrays or arrays of a struct containing nothing but a byte).

The new rules with this option (illustrated in the sketch after this list) are:

  • The maximum number of elements in an array is UInt32.MaxValue.
  • The maximum index in any single dimension is 2,147,483,591 (0x7FFFFFC7) for byte arrays and arrays of single-byte structures, and 2,146,435,071 (0X7FEFFFFF) for other types.
  • The maximum size for strings and other non-array objects is unchanged.
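
As a concrete illustration of those rules (this assumes the switch is enabled and the 64-bit process actually has roughly 10GB of memory available; the sizes come straight from the limits above):

using System;

// Requires <gcAllowVeryLargeObjects enabled="true"> and a 64-bit process.
var bytes = new byte[2_147_483_591];   // max length for byte / single-byte struct arrays
var ints  = new int[2_146_435_071];    // max length for other element types (~8GB here)
Console.WriteLine($"{bytes.LongLength:N0} and {ints.LongLength:N0} elements allocated");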

Maximum capacity of Collection<T> different than expected for x86

The underlying storage for a List<T> class is a T[] array. A hard requirement for an array is that the process must be able to allocate a contiguous chunk of memory to store the array.

That's a problem in a 32-bit process. Virtual memory is used for code and data, and you allocate from the holes that are left between them. And while a 32-bit process will have 2 gigabytes of address space, you'll never get anywhere near a hole of that size. The biggest hole in the address space you can get, right after you start the program, is around 500 or 600 megabytes, give or take; it depends a lot on what DLLs get loaded into the process. Not just the CLR, the jitter and the native images of the framework assemblies, but also the kind that have nothing to do with managed code, like anti-malware and the raft of "helpful" utilities that worm themselves into every process, like Dropbox and shell extensions. A poorly based DLL can cut one nice big hole into two small ones.

These holes will also get smaller as the program has been allocating and releasing memory for a while - a general problem called address space fragmentation. A long-running process can fail on a 90 MB allocation even though there is lots of unused memory lying around.

You can use SysInternals' VMMap utility to get more insight. A copy of Russinovich's book Windows Internals is typically necessary as well to make sense of what you see.


