Setting Environment Variable to a Large Value -> "Argument List Too Long"

Setting environment variable to a large value - Argument list too long

Command-line arguments and environment variables both come out of the same pool of space. Make your environment variables too large, and you no longer have space for command-line arguments -- and even xargs, which breaks command-line invocations into smaller groups to fit inside the pool where possible, can't operate when that pool is completely full.
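
For instance, a minimal reproduction on a typical Linux system (where a single argument or environment string is capped at roughly 128 KiB; the sizes here are illustrative):

big=$(head -c 200000 /dev/zero | tr '\0' x)  # ~200 kB of "x"
export big                                   # now part of every child process's environment
/bin/true      # fails: "bash: /bin/true: Argument list too long"
unset big      # external commands work again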

So: Don't do that. For instance, you might store your data in a file, and export the path to that file in the environment.
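
A hedged sketch of that pattern (the generate_huge_value command and the file path are purely illustrative):

generate_huge_value > /tmp/lg.data   # hypothetical producer of the large data
export LG_FILE=/tmp/lg.data          # only the short path lives in the environment
# ...and wherever the data is actually needed:
lg=$(cat "$LG_FILE")                 # ordinary shell variable: not exported, not on any argv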


By the way -- the reason echo works is that it's built into your shell. Thus,

echo "$LG"

...doesn't need to start an external process, so the limits on argument list length and environment size at process startup time don't apply.

On the other hand, if you ran

/bin/echo "$LG"

...then you'd see the problem again.
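
To see the difference concretely (a rough demonstration; the 200 kB size is arbitrary, but comfortably above the per-string limit on typical Linux systems):

LG=$(head -c 200000 /dev/zero | tr '\0' x)   # large value, deliberately NOT exported
echo "$LG" | wc -c        # builtin echo: prints 200001
/bin/echo "$LG" | wc -c   # external echo: "Argument list too long"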


Given the explanation edited into the question as to what you're actually trying to accomplish, let me suggest an approach which requires neither environment space nor command-line space:

#!/bin/bash
# ^-- also consider ksh, which is faster than bash and also supports <()
# /bin/sh is not usable here, as POSIX sh does not specify <().

lg=... ## DO NOT USE export HERE!
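# <(...) is process substitution: the sed script is generated by the printf
# builtin and handed to sed as a /dev/fd path, so the large value never
# appears in any external command's argv or environment.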
sed -f <(printf '%s\n' "s/A/$lg/g")
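
For instance, if the script above were saved as replace.sh (an illustrative name), every occurrence of A on standard input would be replaced with the large value:

printf 'one A two\n' | ./replace.sh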

Does argument list too long restriction apply to shell builtins?

In bash, the OS-enforced limitation on command-line length which causes the "Argument list too long" error is not applied to shell builtins.

This error is triggered when the execve() syscall returns the error code E2BIG. There is no execve() call involved when invoking a builtin, so the error cannot take place.

Thus, both of your proposed operations are safe: cmd <<< "$string" writes $string to a temporary file, so it never needs to be passed as an argv element (or as an environment variable, which is stored in the same pool of reserved space); and printf '%s\n' "$cmd" is handled entirely inside the shell, unless the shell's configuration has been modified (as with enable -n printf) to use an external printf implementation.
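
A quick way to convince yourself of this in bash (the 10 MB size is arbitrary and far beyond any argv limit):

big=$(head -c 10000000 /dev/zero | tr '\0' x)   # ~10 MB string
type printf                 # "printf is a shell builtin"
wc -c <<< "$big"            # here-string: prints 10000001 (contents plus trailing newline)
printf '%s' "$big" | wc -c  # builtin printf feeding a pipe: prints 10000000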

Argument list too long error when I run any command in Linux shell

Thank you all. I used set, as chepner suggested, and saw what was happening: the problem was caused by my PATH environment variable, which had grown far too long. After clearing PATH and setting it again, everything works.
Thank you
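
For anyone hitting the same thing, a rough way to spot the oversized variable (the awk output is approximate, since variables with multi-line values are counted per line):

env | awk -F= '{ print length($0), $1 }' | sort -n | tail -n 5   # largest environment strings
echo "${#PATH}"                                                  # length of PATH by itself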

Bash shell - "argument too long" message shows after for-loop echo command

Your code is setting too many (or too-large) environment variables.

Environment variables live in the same per-process space as command-line arguments, so exporting variables to the environment reduces the space available for command-line arguments. (If you aren't explicitly exporting content inside your loop, check whether the output of echo $- inside your script contains a, a flag which tells the shell to automatically export every variable that gets set; if it does, find the place where that option is being turned on and disable it.)
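
One quick check along those lines (a sketch; where you disable the option depends on where your script turned it on):

case $- in
  *a*) echo "allexport (set -a) is enabled: every assignment is being exported" ;;
esac
set +a   # stop exporting automatically from this point on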

To demonstrate this (with GNU xargs), you can run:

xargs --show-limits </dev/null

Output should look something like:

Your environment variables take up 2615 bytes
POSIX upper limit on argument length (this system): 257481
POSIX smallest allowable upper limit on argument length (all systems): 4096
Maximum length of command we could actually use: 254866
Size of command buffer we are actually using: 131072
Maximum parallelism (--max-procs must be no greater): 2147483647

Subtract the number on the first line from the number on the second line, and that's (roughly) your maximum command-line length. Number on the first line too big? Look through your environment, and either unset large variables, or don't export them in the first place.
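
You can also pull the raw numbers without xargs (both figures are approximate; the kernel reserves some headroom, and env measures the strings with newlines rather than NUL terminators):

getconf ARG_MAX   # total bytes shared by argv and the environment
env | wc -c       # rough count of bytes already consumed by the environment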


