R memory management / cannot allocate vector of size n Mb
Consider whether you really need all of this data explicitly, or whether the matrix can be sparse. R has good support for sparse matrices (see, e.g., the Matrix package).
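As a minimal sketch of the sparse approach (assuming the Matrix package is installed; the dimensions and values here are purely illustrative):

```r
library(Matrix)

# A 10,000 x 10,000 dense double matrix would need ~800 MB;
# a sparse matrix stores only the nonzero entries.
m <- sparseMatrix(i = c(1, 5000, 10000),
                  j = c(1, 5000, 10000),
                  x = c(1.5, 2.5, 3.5),
                  dims = c(10000, 10000))
object.size(m)   # a few KB instead of hundreds of MB
```

Most linear-algebra operations (%*%, crossprod, solve) work on these objects directly, staying sparse where the result allows it.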
Keep all other processes and objects in R to a minimum when you need to make objects of this size. Use gc() to clear now-unused memory or, better, create only the object you need in a single session.
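For example, a small sketch of dropping a large intermediate before allocating the next one (the object sizes are illustrative):

```r
big <- rnorm(1e7)    # ~80 MB of doubles
res <- sum(big)      # keep only the summary you actually need
rm(big)              # drop the reference to the large object ...
gc()                 # ... and prompt R to release the memory sooner
```

Note that gc() only frees memory that is no longer referenced; rm() (or letting objects go out of scope) is what makes it reclaimable.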
If the above cannot help, get a 64-bit machine with as much RAM as you can afford, and install 64-bit R.
If you cannot do that, there are many online services for remote computing.
If you cannot do that, memory-mapping tools such as the ff package (or bigmemory, as Sascha mentions) will help you build a new solution. In my limited experience ff is the more advanced package, but you should read the High Performance Computing topic on the CRAN Task Views.
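As a rough sketch of the memory-mapping idea with ff (assuming the package is installed; the length is illustrative), the vector lives in a file on disk and only small chunks are paged into RAM:

```r
library(ff)

# A 100-million-element double vector backed by a temporary file,
# so it does not need 800 MB of RAM up front.
x <- ff(vmode = "double", length = 1e8)

# Indexing works much like an ordinary vector; each access
# pulls only the needed chunk into memory.
x[1:5] <- 1:5
x[1:5]
```

bigmemory offers a similar file-backed big.matrix type; which fits better depends on whether your workflow is vector- or matrix-shaped.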
Memory Allocation Error: cannot allocate vector of size 75.1 Mb
R has gotten to the point where the OS cannot allocate it another 75.1 Mb chunk of RAM. That is the size of memory chunk required to do the next sub-operation. It is not a statement about the amount of contiguous RAM required to complete the entire process. By this point, all your available RAM is exhausted but you need more memory to continue and the OS is unable to make more RAM available to R.
Potential solutions to this are manifold. The obvious one is get hold of a 64-bit machine with more RAM. I forget the details but IIRC on 32-bit Windows, any single process can only use a limited amount of RAM (2GB?) and regardless Windows will retain a chunk of memory for itself, so the RAM available to R will be somewhat less than the 3.4Gb you have. On 64-bit Windows R will be able to use more RAM and the maximum amount of RAM you can fit/install will be increased.
If that is not possible, consider an alternative approach: perhaps do your simulations in batches, with the n per batch much smaller than N. That way you can draw a much smaller number of simulations, do whatever you wanted, collect the results, and then repeat the process until you have done sufficient simulations. You don't show what N is, but I suspect it is big, so try a smaller n a number of times to give you N overall.
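The batching idea can be sketched as follows (the values of N and batch, and the use of rnorm as the simulation, are placeholders for your own setup):

```r
N     <- 1e7          # total simulations wanted (illustrative)
batch <- 1e5          # draws per batch; keeps each allocation small
total <- 0
n     <- 0
while (n < N) {
  x     <- rnorm(batch)     # draw one small batch
  total <- total + sum(x)   # accumulate only the summary, not the draws
  n     <- n + batch
}
total / n                   # overall mean across all N draws
```

Each iteration allocates only a batch-sized vector, so peak memory stays roughly constant no matter how large N is.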