Monitor Memory Usage in R


R provides memory profiling support; see Section 3.3 of the Writing R Extensions manual:

3.3 Profiling R code for memory use


Measuring memory use in R code is useful either when the code takes
more memory than is conveniently available or when memory allocation
and copying of objects is responsible for slow code. There are three
ways to profile memory use over time in R code. All three require R to
have been compiled with `--enable-memory-profiling', which is not the
default, but is currently used for the Mac OS X and Windows binary
distributions. All can be misleading, for different reasons.

In understanding the memory profiles it is useful to know a little
more about R's memory allocation. Looking at the results of `gc()'
shows a division of memory into `Vcells' used to store the contents of
vectors and `Ncells' used to store everything else, including all the
administrative overhead for vectors such as type and length
information. In fact the vector contents are divided into two pools.
Memory for small vectors (by default 128 bytes or less) is obtained in
large chunks and then parcelled out by R; memory for larger vectors is
obtained directly from the operating system.

The manual then describes each of the three approaches in the sections that follow.
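The Ncells/Vcells split the manual describes can be seen directly in the output of gc(). A minimal sketch in base R (the exact counts will differ from session to session):

```r
# gc() reports memory in two pools: Ncells (cons cells, i.e. administrative
# overhead and language objects) and Vcells (vector contents).
before <- gc()

x <- numeric(1e7)   # a large vector (~76 MB), allocated directly from the OS

after <- gc()

# The Vcells "used" count grows by roughly 1e7 cells (8 bytes each):
after["Vcells", "used"] - before["Vcells", "used"]

rm(x)
invisible(gc())     # the cells become reclaimable again
```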

Measure peak memory usage in R

I found what I was looking for in the package peakRAM. From the documentation:

This package makes it easy to monitor the total and peak RAM used so that developers can quickly identify and eliminate RAM hungry code.

library(peakRAM)

mem <- peakRAM({
  for (i in 1:5) {
    mean(rnorm(1e7))
  }
})
mem$Peak_RAM_Used_MiB # 10000486MiB

mem <- peakRAM({
  for (i in 1:5) {
    mean(rnorm(1e7))
  }
})
mem$Peak_RAM_Used_MiB # 10005266MiB <-- almost the same across runs!
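peakRAM() also accepts several expressions at once and returns one row per expression, which makes side-by-side comparisons easy. A sketch, assuming the peakRAM package is installed from CRAN:

```r
library(peakRAM)

# Returns a data frame with one row per expression and the columns
# Function_Call, Elapsed_Time_sec, Total_RAM_Used_MiB and Peak_RAM_Used_MiB.
peakRAM(
  mean(rnorm(1e7)),
  cumsum(rnorm(1e7))
)
```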

Memory usage in R while running code

gc() will tell you the maximum memory usage. So if you start a new R session, run your code, and then call gc(), you should find what you need. Alternatives include the profiling functions Rprof and Rprofmem, as referenced in @James's comment above.
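Of those, Rprofmem() is the one aimed squarely at allocations: it writes every allocation above a size threshold to a log file. A minimal sketch (requires R built with --enable-memory-profiling, which the binary distributions have; check capabilities("profmem") first):

```r
if (capabilities("profmem")) {
  logfile <- tempfile()
  Rprofmem(logfile, threshold = 1e6)  # log allocations larger than ~1 MB
  x <- numeric(2e6)                   # one ~16 MB allocation
  Rprofmem(NULL)                      # stop profiling
  readLines(logfile)                  # each line: bytes allocated + call stack
}
```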

RAM usage in RStudio before a crash?

Call gc() to check how much memory is being used.

By default, R will use all the memory available. On Windows you can also cap it with memory.limit(size=), although this function is Windows-only and was made defunct in R 4.2.0.
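Because memory.limit() is Windows-only and defunct from R 4.2.0, any use of it should be guarded. A hedged sketch:

```r
# memory.limit() only ever worked on Windows, and is defunct from R 4.2.0,
# so check both the platform and the R version before calling it.
if (.Platform$OS.type == "windows" && getRversion() < "4.2.0") {
  memory.limit()              # current limit in MB
  memory.limit(size = 16000)  # raise the limit to ~16 GB
}
```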

Moreover, I would recommend using Microsoft R Open. It speeds up calculations by introducing some parallel processing via multithreaded math libraries. Check out the other Microsoft client and server R offerings as well.

Memory Usage in R

Memory for deleted objects is not released immediately. R uses a technique called "garbage collection" to reclaim memory for deleted objects. Periodically, it cycles through the list of accessible objects (basically, those that have names and have not been deleted and can therefore be accessed by the user), and "tags" them for retention. The memory for any untagged objects is returned to the operating system after the garbage-collection sweep.

Garbage collection happens automatically, and you don't have any direct control over this process. But you can force a sweep by calling the command gc() from the command line.

Even then, on some operating systems garbage collection might not reclaim memory (as reported by the OS). Older versions of Windows, for example, could increase but not decrease the memory footprint of R. Garbage collection would only make space for new objects in the future, but would not reduce the memory use of R.
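Related to the gc() discussion above: besides current usage, gc() tracks the maximum memory used since the last reset, which is often what you want after running a chunk of code. A sketch in base R:

```r
gc(reset = TRUE)   # zero the "max used" counters
x <- rnorm(1e7)    # do some allocation-heavy work
rm(x)
gc()               # the "max used" columns now show the peak, even though
                   # the memory itself has already been reclaimed
```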

How to check the amount of RAM

Given the platform-dependency warnings above, you could for example parse /proc/meminfo on Linux:

$ grep MemFree /proc/meminfo 
MemFree: 573660 kB
$ awk '/MemFree/ {print $2}' /proc/meminfo
565464

You could try the second approach via system(..., intern=TRUE), or even via a pipe connection.

Edit, some five-plus years later: in R, following what the previous paragraph hinted at:

R> memfree <- as.numeric(system("awk '/MemFree/ {print $2}' /proc/meminfo", 
+ intern=TRUE))
R> memfree
[1] 3342480
R>
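The pipe-connection variant mentioned above looks like this (Linux-only, since it reads /proc/meminfo):

```r
# Same value as the system(..., intern = TRUE) approach, read via a pipe:
con <- pipe("awk '/MemFree/ {print $2}' /proc/meminfo")
memfree_kb <- scan(con, quiet = TRUE)
close(con)
memfree_kb   # free memory in kB, as a numeric
```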

How would one check the system memory available using R on a Windows machine?

Maybe one of the below will help (I am also on Windows Server 2012 R2):

Maybe this would be the most useful:

> system('systeminfo')
# the output is too big to show, but you can save it into a vector and pick the rows you want

Or just use one of the commands below, which are specific to memory:

> system('wmic MemoryChip get BankLabel, Capacity, MemoryType, TypeDetail, Speed')
BankLabel Capacity MemoryType Speed TypeDetail
RAM slot #0 8589934592 2 512
RAM slot #1 4294967296 2 512

Free available memory:

> system('wmic OS get FreePhysicalMemory /Value')
FreePhysicalMemory=8044340

Total visible memory:

> system('wmic OS get TotalVisibleMemorySize /Value')
TotalVisibleMemorySize=12582456

Basically, you can run any other cmd command you think could help through the system() function. R will show the output on screen, and you can then save it into a data.frame and use it as you want.
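Putting that into practice, here is a hedged sketch of capturing one of the wmic queries above and parsing the number out of it (Windows-only; wmic is deprecated on recent Windows but present on Server 2012 R2):

```r
if (.Platform$OS.type == "windows") {
  out <- system("wmic OS get FreePhysicalMemory /Value", intern = TRUE)
  # Keep the "Name=Value" line and strip everything up to the "=":
  kb <- as.numeric(sub(".*=", "", grep("=", out, value = TRUE)))
  kb / 1024   # free physical memory in MiB
}
```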


