Alternative to R's memory.size() in Linux

Alternative to R's `memory.size()` in Linux?

Using the pryr package:

library("pryr")

mem_used()
# 27.9 MB

x <- mem_used()
x
# 27.9 MB
class(x)
# [1] "bytes"

The result is the same as in @RHertel's answer, but with pryr we can assign the result to a variable.
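pryr also provides mem_change(), which reports the net change in memory from evaluating an expression; a quick sketch (the printed figure will vary by session):

mem_change(y <- rnorm(1e6))
# 8 MB

A numeric vector of 1e6 doubles takes roughly 8 MB, so the reported change should be in that ballpark.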

system('grep MemTotal /proc/meminfo')
# MemTotal: 263844272 kB

To assign the output of a system call to a variable, use intern = TRUE:

x <- system('grep MemTotal /proc/meminfo', intern = TRUE)
x
# [1] "MemTotal: 263844272 kB"
class(x)
# [1] "character"

Limiting memory usage in R under Linux

There's unix::rlimit_as(), which allows setting memory limits for a running R process using the same mechanism that ulimit uses in the shell. Windows and macOS are not supported.

In my .Rprofile I have

unix::rlimit_as(1e12, 1e12)

to cap memory usage at 1e12 bytes (~1 TB). Note that the arguments are in bytes, so a ~12 GB cap would be 1.2e10.
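A short interactive sketch (the 1.2e10 value is just an example, not from the original answer):

unix::rlimit_as(1.2e10)   # set a soft limit of ~12 GB for this session
unix::rlimit_as()         # called without arguments, returns the current limits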

Before that...

I had created a small R package, ulimit, with similar functionality.

Install it from GitHub using

devtools::install_github("krlmlr/ulimit")

To limit the memory available to R to 2000 MiB, call:

ulimit::memory_limit(2000)

Now:

> rep(0L, 1e9)
Error: cannot allocate vector of size 3.7 Gb

Increasing (or decreasing) the memory available to R processes

From:

http://gking.harvard.edu/zelig/docs/How_do_I2.html (mirror)

Windows users may get the error that R has run out of memory.

If you have R already installed and subsequently install more RAM, you may have to reinstall R in order to take advantage of the additional capacity.

You may also set the amount of available memory manually. Close R, then right-click on your R program icon (the icon on your desktop or in your programs directory). Select "Properties", and then select the "Shortcut" tab. Look for the "Target" field and, after the closing quotes around the location of the R executable, add

--max-mem-size=500M

You may increase this value up to 2GB or the maximum amount of physical RAM you have installed.

If you get the error that R cannot allocate a vector of length x, close out of R and add the following line to the "Target" field:

--max-vsize=500M

or as appropriate. You can always check how much memory R has available by typing at the R prompt

memory.limit()

which gives you the amount of available memory in MB. In previous versions of R you needed to use round(memory.limit()/2^20, 2).
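For example, in an R session on Windows older than 4.2.0 (a sketch; the 8000 is illustrative):

memory.limit()             # current limit in MB
memory.limit(size = 8000)  # raise the limit to 8000 MB, if the OS permits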

Tricks to manage the available memory in an R session

To further illustrate the common strategy of frequent restarts, we can use littler, which allows us to run simple expressions directly from the command line. Here is an example I sometimes use to time different BLAS implementations for a simple crossprod.

 r -e'N<-3*10^3; M<-matrix(rnorm(N*N),ncol=N); print(system.time(crossprod(M)))'

Likewise,

 r -lMatrix -e'example(spMatrix)'

loads the Matrix package (via the --packages | -l switch) and runs the examples of the spMatrix function. As r always starts 'fresh', this method is also a good test during package development.

Last but not least, r also works great for automated batch mode in scripts using the '#!/usr/bin/r' shebang header. Rscript is an alternative where littler is unavailable (e.g. on Windows).
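For instance, the crossprod timing above translates directly to Rscript (quoting shown for a POSIX shell):

 Rscript -e'N<-3*10^3; M<-matrix(rnorm(N*N),ncol=N); print(system.time(crossprod(M)))'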

memory.limit() bug: memory.limit() is no longer supported. Increasing memory

It seems this isn't a bug. The documentation shipped with R version 4.2.0 (2022-04-22) says:

  • R on Windows now uses the system memory allocator. Doug Lea's allocator was used since R 1.2.0 to mitigate performance limitations seen with system allocators on earlier versions of Windows.
  • memory.limit() and memory.size() are now stubs on Windows (as on Unix-alikes).

So, I've been told, this means it's just no longer supported...
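A sketch of what the stubs do now (the exact warning wording may differ between versions):

memory.limit()
# [1] Inf
# Warning message:
# 'memory.limit()' is no longer supported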

memory.limit vs memory.size in R

In a nutshell:

memory.size is the memory in use by R.

memory.limit is the total amount of memory available to R. If only a set amount is allocated, one can use the size argument of memory.limit() to increase the limit (if possible given system/OS constraints).

To be clear, these functions only work on Windows, and as of R 4.2.0 they are stubs there too (see above).
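On a pre-4.2.0 Windows session, a quick comparison sketch (values are illustrative):

memory.size()            # MB currently in use by R
memory.size(max = TRUE)  # maximum MB obtained from the OS in this session
memory.limit()           # the current cap in MB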

Which operating system gives R the most memory?

I'm going to talk about Windows in this answer. I know nothing substantial about other operating systems, so I won't embarrass myself by talking about them. I'm not making any judgements about which of Windows or Linux is better or worse than the other.

Nowadays, you always want to be using the 64 bit version of Windows rather than the 32 bit version. The 64 bit version of Windows runs 32 bit programs perfectly (sometimes even a little faster than the 32 bit version of Windows manages). When you run a 32 bit program under 64 bit Windows, it gets access to a 4GB address space; under 32 bit Windows it only gets a 2GB address space. So even if you stick to 32 bit R, you'll have more head-room under 64 bit Windows.

However, there is nowadays a 64 bit version of R. This is less mature than the 32 bit version, but it will most likely meet your needs. You may need to check for package compatibility, since some packages may not yet support 64 bit R on Windows.

You may actually experience a drop-off in performance with the 64 bit version of R because pointer width doubles and so the memory footprint is bigger. However, I'd be surprised if this was at all significant.

You can install both 32 and 64 bit versions side-by-side, but if you can get away with just one version I'd always recommend doing so - it makes maintenance so much easier.

As your machine only has 4GB of memory anyway, there's probably not that much to be gained from using the 64 bit version of R. The 64 bit version of Windows will make a big difference, but if you go to the 64 bit version of R as well, and actually use 4GB of memory then you are likely to see lots of disk thrashing and your calculations will take forever.

Finally, when faced with memory resource shortages it is often possible to find alternative ways to organise your code so that you simply avoid the issue altogether. Since we don't have the details of your R code, we can't tell if that is the case here, but it may be worth thinking about.

Out of memory on R using linux but not on Windows

With the help of a member from another forum (https://community.rstudio.com/t/out-of-memory-on-r-using-linux-but-not-on-windows/106549), I found the solution. The crash was a result of the memory limitation in the swap partition, as speculated earlier. I increased my swap from 2 GB to 16 GB, and now R/RStudio is able to complete the whole script. It is a quite demanding task, since all of my physical memory is exhausted and nearly 15 GB of the swap is eaten.
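To confirm the change from within R, the same /proc/meminfo trick from the first answer works for swap (output illustrative for a 16 GB swap):

system('grep SwapTotal /proc/meminfo')
# SwapTotal: 16777212 kB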


