R on macOS Error: vector memory exhausted (limit reached?)
For those using RStudio: I've found that setting Sys.setenv('R_MAX_VSIZE' = 32000000000), as has been suggested in multiple StackOverflow posts, only works on the command line, and that setting that parameter while using RStudio does not prevent this error:
Error: vector memory exhausted (limit reached?)
After doing some more reading, I found this thread, which clarifies the problem with RStudio and identifies a solution, shown below:
Step 1: Open a terminal.
Step 2: Run
cd ~
touch .Renviron
open .Renviron
Step 3: Save the following as the first line of .Renviron:
R_MAX_VSIZE=100Gb
Note: This limit includes both physical and virtual memory; so setting R_MAX_VSIZE=16Gb on a machine with 16Gb of physical memory may not prevent this error. You may have to play with this parameter, depending on the specs of your machine.
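The three steps above can also be collapsed into a single command that appends the setting to ~/.Renviron without opening an editor (100Gb is the value used in this post; tune it for your machine):

```shell
# Append the vector heap limit to ~/.Renviron so every new R session,
# including RStudio sessions, picks it up at startup
echo 'R_MAX_VSIZE=100Gb' >> ~/.Renviron
```

Restart R (or RStudio) afterwards; .Renviron is only read when a session starts.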
Error: vector memory exhausted (limit reached?)
This change was necessary to deal with operating system memory over-commit issues on macOS. From the NEWS file:
The environment variable R_MAX_VSIZE can now be used to specify the maximal vector heap size. On macOS, unless specified by this environment variable, the maximal vector heap size is set to the maximum of 16GB and the available physical memory. This is to avoid having the R process killed when macOS over-commits memory.
Set the environment variable R_MAX_VSIZE to an appropriate value for your system before starting R and you should be able to read your file.
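For a one-off session, the variable can also be set on the command line when launching R. A minimal sketch (printenv stands in for R here, just to show that a prefixed variable reaches the child process):

```shell
# Set R_MAX_VSIZE for a single session by prefixing the launch command;
# the variable is inherited by the child process
R_MAX_VSIZE=100Gb printenv R_MAX_VSIZE   # prints 100Gb
# To start R itself the same way:
#   R_MAX_VSIZE=100Gb R
```

Unlike the .Renviron approach, this affects only the one session it launches.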
R vector memory exhausted for calculating tensor products
I'm guessing you intend to collapse the 2nd and 3rd dimensions in that tensor product. Perhaps you want something like this:
library(tensor)
trainxtensor_a <- tensor(trainxr, trainxg, c(2,3), c(2,3))
Although you should first try a smaller dataset to check that it does what you expect:
trainxtensor_a <- tensor(trainxr[1:5,,], trainxg[1:5,,], c(2,3), c(2,3))
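As a sanity check with small, made-up arrays (the 5 x 4 x 3 shapes here are assumptions standing in for trainxr and trainxg), contracting dimensions 2 and 3 of two n x p x q arrays should yield an n x n matrix:

```r
library(tensor)  # provides tensor() for generalized array products

# Hypothetical small arrays standing in for trainxr / trainxg
a <- array(rnorm(5 * 4 * 3), dim = c(5, 4, 3))
b <- array(rnorm(5 * 4 * 3), dim = c(5, 4, 3))

# Collapse dimensions 2 and 3 of both arrays; each (i, j) entry is the
# sum over the contracted dimensions of a[i,,] * b[j,,]
res <- tensor(a, b, c(2, 3), c(2, 3))
dim(res)  # returns c(5, 5)
```

If the result shape here matches what you expect, the same call on the full-size arrays is doing the contraction you intend; if memory is still exhausted, the issue is the size of the output and intermediates, not the call itself.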