rmarkdown::render() in a loop - cannot allocate vector of size


After digging around in the rmarkdown::render code for a while, I've found a solution.

Adding knitr::knit_meta(class = NULL, clean = TRUE) before rmarkdown::render(input = file, ...) seems to do the trick.
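
A minimal sketch of what that looks like in a loop; the input directory and file list here are placeholders, not from the original question:

    library(rmarkdown)

    # Hypothetical: render every .Rmd file found under "reports/"
    rmd_files <- list.files("reports", pattern = "\\.Rmd$", full.names = TRUE)

    for (file in rmd_files) {
      # Reset the metadata knitr accumulates across documents before each render
      knitr::knit_meta(class = NULL, clean = TRUE)
      rmarkdown::render(input = file)
    }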

R memory management / cannot allocate vector of size n Mb

Consider whether you really need all of this data explicitly, or whether the matrix can be sparse. R has good support for sparse matrices (see, e.g., the Matrix package).
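
As a rough illustration (the dimensions here are made up), a mostly empty matrix stored with Matrix takes only kilobytes, while its dense equivalent would need hundreds of megabytes:

    library(Matrix)

    # A 10,000 x 10,000 matrix with only three non-zero entries
    m <- sparseMatrix(i = c(1, 5000, 10000),
                      j = c(2, 5000, 9999),
                      x = c(1.5, -2, 3),
                      dims = c(10000, 10000))

    object.size(m)   # a few kilobytes; the dense version would be ~800 MB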

Keep all other processes and objects in R to a minimum when you need to create objects of this size. Use gc() to clear memory that is no longer used or, better, create only the object you need in a single session.
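
For example (the object names here are hypothetical):

    rm(old_intermediate)   # drop a hypothetical object you no longer need
    gc()                   # run the garbage collector so the freed memory can be reused
    x <- matrix(0, nrow = 20000, ncol = 20000)   # ~3.2 GB of doubles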

If the above does not help, get a 64-bit machine with as much RAM as you can afford, and install 64-bit R.

If you cannot do that, there are many online services for remote computing.

If you cannot do that, memory-mapping tools like the ff package (or bigmemory, as Sascha mentions) will help you build a new solution. In my limited experience ff is the more advanced package, but you should read the High Performance Computing topic on CRAN Task Views.
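
A rough sketch with ff (the file name and dimensions are assumptions): the data live in a memory-mapped file on disk rather than in RAM, but can be indexed like an ordinary matrix.

    library(ff)

    # ~800 MB of doubles backed by a file on disk instead of RAM
    big <- ff(vmode = "double", dim = c(1e5, 1000), filename = "big_matrix.ff")

    big[1, 1] <- 42     # read/write with ordinary matrix indexing
    mean(big[, 1])      # pull only the columns you need into memory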

Pandoc error 1033 when rendering multiple Rmarkdown reports

Here is the process that solved the error. The idea comes from @tarleb's comment.
Error 1033 is not a Pandoc error; it is not referenced here: https://github.com/jgm/pandoc/blob/master/MANUAL.txt#L1384.

Nor is it the memory-usage problem that can be solved with knitr::knit_meta(class = NULL, clean = TRUE).

As suggested here, updating Pandoc can be the solution.

  1. Use rmarkdown::pandoc_version() to check the Pandoc version currently used to generate the R Markdown reports (see the snippet after this list).
  2. Go to https://pandoc.org/releases.html and check whether the version you are using is the latest one.
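
For reference, one way to do that check from R (the version number here is just the one that was current at the time):

    rmarkdown::pandoc_version()                     # version rmarkdown will use
    rmarkdown::pandoc_available(version = "2.7.3")  # TRUE if at least v2.7.3 is found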

For me it was not the case (I was using Pandoc v2.6 and the latest was v2.7.3), so I followed the tutorial on this page: https://pandoc.org/installing.html.

Then I checked that rmarkdown::pandoc_version() returned the latest version number and reran my R script. That solved the problem.

Memory Allocation Error: cannot allocate vector of size 75.1 Mb

R has gotten to the point where the OS cannot allocate it another 75.1 Mb chunk of RAM. That is the size of the memory chunk required to do the next sub-operation; it is not a statement about the amount of contiguous RAM required to complete the entire process. By this point, all your available RAM is exhausted, you need more memory to continue, and the OS is unable to make more RAM available to R.

Potential solutions to this are manifold. The obvious one is to get hold of a 64-bit machine with more RAM. I forget the details, but IIRC on 32-bit Windows any single process can only use a limited amount of RAM (2 GB?), and regardless, Windows retains a chunk of memory for itself, so the RAM available to R will be somewhat less than the 3.4 GB you have. On 64-bit Windows, R will be able to use more RAM, and the maximum amount of RAM you can fit/install is also higher.

If that is not possible, consider an alternative approach: do your simulations in batches, with the n per batch much smaller than N. That way you draw a much smaller number of simulations at a time, do whatever you wanted with them, collect the results, and repeat until you have done sufficient simulations. You don't show what N is, but I suspect it is big, so run a smaller n a number of times to reach N overall, roughly as sketched below.
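
A rough sketch of that batching idea; N, the batch size, and the run_sim() / summarise_batch() helpers are placeholders for whatever the actual simulation does:

    N          <- 1e6      # total simulations wanted (placeholder)
    batch_size <- 1e4      # small enough to fit comfortably in RAM
    n_batches  <- ceiling(N / batch_size)

    results <- vector("list", n_batches)
    for (b in seq_len(n_batches)) {
      sims <- run_sim(batch_size)            # hypothetical: draw one small batch
      results[[b]] <- summarise_batch(sims)  # keep only the summaries you need
      rm(sims)                               # drop the raw draws before the next batch
      gc()
    }
    final <- do.call(rbind, results)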


