R on macOS Error: vector memory exhausted (limit reached?)
For those using RStudio, I've found that setting Sys.setenv('R_MAX_VSIZE'=32000000000), as suggested in multiple StackOverflow posts, only works on the command line; setting that parameter from within RStudio does not prevent this error:
Error: vector memory exhausted (limit reached?)
After doing some more reading, I found this thread, which clarifies the problem with RStudio and identifies a solution, shown below:
Step 1: Open Terminal.
Step 2: Run the following:
cd ~
touch .Renviron
open .Renviron
Step 3: Save the following as the first line of .Renviron:
R_MAX_VSIZE=100Gb
Note: This limit includes both physical and virtual memory, so setting R_MAX_VSIZE=16Gb on a machine with 16Gb of physical memory may not prevent this error. You may have to experiment with this parameter, depending on the specs of your machine.
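The three terminal steps above can be collapsed into a single command (the 100Gb value is the same illustrative limit used above; adjust it for your machine):

```shell
# Append the vector-heap limit to ~/.Renviron, creating the file if it
# does not exist. R reads ~/.Renviron at startup, so both command-line R
# and RStudio will pick this up on their next launch.
echo 'R_MAX_VSIZE=100Gb' >> ~/.Renviron
```

Restart R (or RStudio) afterwards; the setting only takes effect for newly started R processes.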
This change was necessary to deal with operating-system memory over-commit issues on macOS. From the NEWS file:
The environment variable R_MAX_VSIZE can now be used to specify the maximal vector heap size. On macOS, unless specified by this environment variable, the maximal vector heap size is set to the maximum of 16GB and the available physical memory. This is to avoid having the R process killed when macOS over-commits memory.
Set the environment variable R_MAX_VSIZE to an appropriate value for your system before starting R, and you should be able to read your file.
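If you only need the larger limit for a single session, one alternative to editing ~/.Renviron is to set the variable in the shell that will launch R (a sketch; any way of starting R from that shell works):

```shell
# Set the limit in the environment of the shell that launches R.
# It must exist before the R process starts; Sys.setenv() inside an
# already-running session comes too late to change the heap limit.
export R_MAX_VSIZE=100Gb
# ...then start R or RStudio from this same shell session.
echo "$R_MAX_VSIZE"   # prints 100Gb
```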
R vector memory exhausted for calculating tensor products
I'm guessing you intend to collapse the 2nd and 3rd dimensions in that tensor product. Perhaps you want something like this:
library(tensor)
trainxtensor_a <- tensor(trainxr, trainxg, c(2,3), c(2,3))
Although you should first try a smaller dataset to check that it does what you expect:
trainxtensor_a <- tensor(trainxr[1:5,,], trainxg[1:5,,], c(2,3), c(2,3))
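To see what that call computes, here is a self-contained sketch with small toy arrays standing in for trainxr and trainxg (the dimensions are made up for illustration; it assumes the tensor package is installed):

```r
library(tensor)

# Toy stand-ins: 5 and 6 "samples", each a 4 x 3 slice
a <- array(rnorm(5 * 4 * 3), dim = c(5, 4, 3))
b <- array(rnorm(6 * 4 * 3), dim = c(6, 4, 3))

# Contract (sum over) dimensions 2 and 3 of each array.
# The surviving dimensions are dim 1 of a and dim 1 of b,
# so the result is a 5 x 6 matrix.
res <- tensor(a, b, c(2, 3), c(2, 3))
dim(res)
```

Checking the result's dimensions on a small input like this confirms which axes are being collapsed before running the full-size computation, which is where the vector-memory error tends to appear.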