R Cannot Allocate Memory Though Memory Seems to Be Available

I'm posting this as an answer only because it's too long to fit in a comment. Since you haven't included any code, it's hard to give specific advice, but here is some code you might think about.

wd <- getwd()
assign(".First", function() {
  require("plyr")                # and whatever other packages you're using
  file.remove(".RData")          # it has already been loaded, so delete the file
  rm(".Last", pos = .GlobalEnv)  # otherwise you won't be able to quit R without it restarting
  setwd(wd)
}, pos = .GlobalEnv)
assign(".Last", function() {
  system("R --no-site-file --no-init-file --quiet")
}, pos = .GlobalEnv)
save.image()  # or save only the things you want reloaded
q("no")

The idea is that you save the things you need in a file called .RData. You create a .Last function that runs when you quit R; it starts a new R session. And you create a .First function that runs as soon as R restarts; it loads the packages you need and cleans up.

Now you can quit R, and it will restart, loading the things you need.

(q("no") means don't save, but you already saved everything you need in .RData which will be loaded when it restarts)

R memory management / cannot allocate vector of size n Mb

Consider whether you really need all this data explicitly, or whether the matrix can be sparse. R has good support for sparse matrices (see, e.g., the Matrix package).
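
For example, a minimal sketch with the Matrix package (the dimensions and values here are made up purely for illustration):

library(Matrix)

# A dense 10,000 x 10,000 numeric matrix needs roughly 800 MB;
# a sparse matrix stores only the non-zero entries.
m <- sparseMatrix(i = c(1, 5000, 10000),
                  j = c(1, 5000, 10000),
                  x = c(1.5, 2.5, 3.5),
                  dims = c(10000, 10000))
print(object.size(m), units = "Kb")  # a few Kb instead of ~800 MB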

Keep all other processes and objects in R to a minimum when you need to make objects of this size. Use gc() to clear now-unused memory, or, better, create only the object you need in a single session.
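
For example (my_big_intermediate is a hypothetical stand-in for whatever object you no longer need):

rm(my_big_intermediate)  # drop objects you no longer need
gc()                     # run the garbage collector so the freed memory can be reused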

If the above cannot help, get a 64-bit machine with as much RAM as you can afford, and install 64-bit R.

If you cannot do that, there are many online services for remote computing.

If you cannot do that, memory-mapping tools like the ff package (or bigmemory, as Sascha mentions) will help you build a new solution. In my limited experience, ff is the more advanced package, but you should read the High-Performance Computing task view on CRAN.
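
If you go the ff route, here is a rough sketch of what a file-backed object looks like (the length is arbitrary, for illustration only):

library(ff)

# A file-backed double vector: the data lives on disk, and only the
# chunks you actually touch are pulled into RAM.
x <- ff(vmode = "double", length = 1e8)
x[1:10] <- rnorm(10)
x[1:10]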

foreach execution: cannot allocate memory, while RAM still available

Verify that the R version you're running is 64-bit. If it isn't, RAM allocation is limited to about 4 GB, if I remember right. In 64-bit R, allocation is limited only by the memory available.
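
You can also check from within R itself; 8-byte pointers mean a 64-bit build:

.Machine$sizeof.pointer  # 8 on 64-bit R, 4 on 32-bit R
R.version$arch           # e.g. "x86_64" on a 64-bit build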

In RStudio:
Tools > Global Options > General > R version

I've had issues in the past with my machine (24 cores, 128 GB RAM, 64-bit) defaulting to 32-bit R for an unknown reason at seemingly random times. In Global Options I had to manually set the R version to 64-bit to make sure it stopped happening.

R multicore mcfork(): Unable to fork: Cannot allocate memory

The issue might be exactly what the error message suggests: there isn't enough memory to fork and create parallel processes.

R essentially needs a copy of everything that's in memory for each individual process (strictly speaking, fork() uses copy-on-write, but the operating system still has to be prepared to commit that much memory, and any pages that are modified get duplicated). If you are already using 51% of your RAM with a single process, then you don't have enough headroom to create a second process, since that could require 102% of your RAM in total.

Try:

  1. Using fewer cores - If you were trying to use 4 cores, it's possible you have enough RAM to support 3 parallel workers, but not 4. registerDoMC(2), for example, will set the number of parallel workers to 2 (if you are using the doMC parallel backend); see the sketch after this list.
  2. Using less memory - Without seeing the rest of your code, it's hard to suggest ways to accomplish this. One thing that might help is figuring out which R objects are taking up all the memory (Determining memory usage of objects?) and then removing from memory any objects you don't need (rm(my_big_object) followed by gc()).
  3. Adding more RAM - if all else fails, throw hardware at it so you have more capacity.
  4. Sticking to a single process - Parallel processing in R is a tradeoff between CPU and memory. It sounds like in this case you may not have enough memory to support the CPU power you have, so the best course of action might be to stick to a single core.
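
For item 1, a minimal sketch with the doMC backend (the loop body is just a placeholder):

library(doMC)
library(foreach)

registerDoMC(2)  # cap the number of forked workers at 2

# A trivial foreach loop run across the 2 registered workers
results <- foreach(i = 1:10) %dopar% sqrt(i)

And for item 2, one base-R way to see which objects in the global environment are largest:

# Sizes (in bytes) of everything in the global environment, largest first
sort(sapply(ls(), function(x) object.size(get(x))), decreasing = TRUE)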

