How to Indirectly Assign a Variable in Bash to Take Multi-Line Data from Standard In, a File, or the Output of Execution

This is close to optimal -- but drop the eval.

executeToVar() { local varName=$1; shift; printf -v "$varName" %s "$("$@")"; }

The one problem this formulation still has is that $() strips trailing newlines. If you want to preserve them, append a sentinel character of your own inside the command substitution and strip it off afterwards.

executeToVar() {
    local varName=$1; shift
    local val="$(printf %s x; "$@"; printf %s x)"; val=${val#x}
    printf -v "$varName" %s "${val%x}"
}
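To see why the sentinel is needed, compare a plain $() capture with the guarded version (a quick sketch; the variable names are just for illustration):

```shell
# Plain command substitution strips ALL trailing newlines:
plain=$(printf 'data\n\n')
echo "${#plain}"    # 4 -- "data" only, both newlines gone

# Appending a sentinel 'x' protects them; strip it off afterwards:
guarded=$(printf 'data\n\n'; printf x)
guarded=${guarded%x}
echo "${#guarded}"  # 6 -- "data" plus the two newlines
```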

If you want to read all content from stdin into a variable, this is particularly easy:

# This requires bash 4.1 for automatic fd allocation
readToVar() {
    if [[ $2 && $2 != "-" ]]; then
        exec {read_in_fd}<"$2"  # copy from named file
    else
        exec {read_in_fd}<&0    # copy from stdin
    fi
    IFS= read -r -d '' "$1" <&"$read_in_fd"  # read from the FD
    exec {read_in_fd}<&-        # close that FD
}

...used as:

readToVar var < <( : "run something here to read its output byte-for-byte" )

...or...

readToVar var filename

Testing these:

bash3-3.2$ executeToVar var printf '\n\n123\n456\n789\n\n'
bash3-3.2$ declare -p var
declare -- var="

123
456
789

"

...and...

bash4-4.3$ readToVar var2 < <(printf '\n\n123\n456\n789\n\n')
bash4-4.3$ declare -p var2
declare -- var2="

123
456
789

"

How to read variables from file, with multiple variables per line?

I think all you're looking for is to read multiple variables per line: the read command can assign words to variables by itself.

while read -r first second third; do
    do_stuff_with "$first"
    do_stuff_with "$second"
    do_stuff_with "$third"
done < ./test.txt
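One detail worth knowing (a small sketch with made-up variable names): when a line contains more words than you gave variables, read packs everything left over into the last variable, with the internal whitespace preserved:

```shell
# Three variables, four words: 'rest' receives the remainder
read -r first second rest <<< "alpha beta gamma delta"
echo "$first"   # alpha
echo "$second"  # beta
echo "$rest"    # gamma delta
```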

Bash - Get the VALUE of 'nested' variable into another variable [Edit: Indirect Variable Expansion]

You can use eval, variable indirection ${!...}, or reference variables declare -n.

In the following, I will use lowercase variable names, since uppercase variable names are special by convention. Overwriting $USER is especially bad, because that variable normally already contains your user name. For the following code fragments, assume these variables:

user1_dir=./user1/stuff
user=user1

Eval

eval "echo \${${user}_dir}"
# prints `./user1/stuff`

Eval is a bash built-in that executes its arguments as if they were entered in bash itself. Here, eval is called with the argument echo "${user1_dir}".

Using eval is considered bad practice, see this question.

Variable Indirection

When storing the name of variable var1 inside another variable var2, you can use the indirection ${!var2} to get the value of var1.

userdir="${user}_dir"
echo "${!userdir}"
# prints `./user1/stuff`

Reference Variables

Instead of using indirection every time, you also can declare a reference variable in bash:

declare -n myref="${user}_dir"

The reference can be used similar to variable indirection, but without having to write the !.

echo "$myref"
# prints `./user1/stuff`
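A nameref is not just read-only sugar: assignments made through it also land in the referenced variable. A minimal sketch reusing the variables from above (requires bash 4.3):

```shell
user1_dir=./user1/stuff
user=user1
declare -n myref="${user}_dir"

echo "$myref"        # reads through the reference: ./user1/stuff
myref=./user1/other  # writes through it as well
echo "$user1_dir"    # ./user1/other
```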

Alternatives

Your script may become simpler if you use (associative) arrays. Arrays are variables that store multiple values; a single value is accessed via an index. Normal arrays use natural numbers as indices. Associative arrays use arbitrary strings as indices.

(Normal) Arrays

# Create an array with three entries
myarray=(./user1/stuff ./user2/stuff ./user3/stuff)

# Get the first entry
echo "${myarray[0]}"

# Get the *n*-th entry
n=2
echo "${myarray[$n]}"
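Two more operations you will usually want with normal arrays, sketched here with the same example data: the element count, and a safely quoted loop over all entries:

```shell
myarray=(./user1/stuff ./user2/stuff ./user3/stuff)

# Number of entries
echo "${#myarray[@]}"    # 3

# Iterate over all entries; the quotes keep each entry whole
for dir in "${myarray[@]}"; do
    printf '%s\n' "$dir"
done
```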

Associative Arrays


# Create an associative array with three entries
declare -A myarray
myarray[user1]=./user1/stuff
myarray[user2]=./user2/stuff
myarray[user3]=./user3/stuff

# Get a fixed entry
echo "${myarray[user1]}"

# Get a variable entry
user=user1
echo "${myarray[$user]}"
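With associative arrays you often need the set of keys as well: ${!myarray[@]} expands to all keys, and [[ -v ... ]] tests whether a key exists (bash 4.3+ for -v on array elements). A sketch with the same data:

```shell
declare -A myarray
myarray[user1]=./user1/stuff
myarray[user2]=./user2/stuff

# Iterate over the keys (their order is unspecified)
for key in "${!myarray[@]}"; do
    printf '%s -> %s\n' "$key" "${myarray[$key]}"
done

# Test whether a key exists (bash 4.3+)
if [[ -v myarray[user1] ]]; then
    echo "user1 is set"
fi
```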

Dynamic variable names in Bash

Use an associative array, with command names as keys.

# Requires bash 4, though
declare -A magic_variable=()

grep_search() {
    magic_variable[$1]=$(ls | tail -1)
    echo "${magic_variable[$1]}"
}

If you can't use associative arrays (e.g., you must support bash 3), you can use declare to create dynamic variable names:

declare "magic_variable_$1=$(ls | tail -1)"

and use indirect parameter expansion to access the value.

var="magic_variable_$1"
echo "${!var}"

See BashFAQ: Indirection - Evaluating indirect/reference variables.

Build a string in Bash with newlines

Using ANSI C quoting:

var="$var"$'\n'"in a box"

You could put the $'\n' in a variable:

newline=$'\n'
var="$var${newline}in a box"

By the way, in this case, it's better to use the concatenation operator:

var+="${newline}in a box"

If you don't like ANSI C quoting, you can use printf with its -v option:

printf -v var '%s\n%s' "$var" "in a box"

Then, to print the content of the variable var, don't forget quotes!

echo "$var"

or, better yet,

printf '%s\n' "$var"

Remark. Don't use upper-case variable names in Bash: one day they will clash with an already existing environment or shell variable!


You could also make a function to append a newline and a string to a variable using indirect expansion (have a look in the Shell Parameter Expansion section of the manual) as so:

append_with_newline() { printf -v "$1" '%s\n%s' "${!1}" "$2"; }

Then:

$ var="The "
$ var+="cat wears a mask"
$ append_with_newline var "in a box"
$ printf '%s\n' "$var"
The cat wears a mask
in a box
$ # There's no cheating. Look at the content of 'var':
$ declare -p var
declare -- var="The cat wears a mask
in a box"

Just for fun, here's a generalized version of the append_with_newline function. It takes n+1 arguments (n≥1): the first is the name of the destination variable, and the remaining n are strings that are joined with newlines and appended to it:

concatenate_with_newlines() { local IFS=$'\n'; printf -v "$1" '%s\n%s' "${!1}" "${*:2}"; }

Look how well it works:

$ var="hello"
$ concatenate_with_newlines var "a gorilla" "a banana" "and foobar"
$ printf '%s\n' "$var"
hello
a gorilla
a banana
and foobar
$ # :)

It's a funny trickery with IFS and "$*".
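The same IFS/"$*" trick works on its own, outside of printf -v, whenever you just need a newline-joined string. A sketch with a hypothetical join_lines helper:

```shell
# "$*" joins the positional parameters with the FIRST character
# of IFS, so a function-local IFS=$'\n' makes it a newline join:
join_lines() { local IFS=$'\n'; printf '%s' "$*"; }

joined=$(join_lines a b c)
printf '%s\n' "$joined"
# a
# b
# c
```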

Bash: Capture output of command run in background

Bash has indeed a feature called Process Substitution to accomplish this.

$ echo <(yes)
/dev/fd/63

Here, the expression <(yes) is replaced with a pathname of a (pseudo device) file that is connected to the standard output of an asynchronous job yes (which prints the string y in an endless loop).

Now let's try to read from it:

$ cat /dev/fd/63
cat: /dev/fd/63: No such file or directory

The problem here is that the yes process terminated in the meantime because it received a SIGPIPE (it had no readers on stdout).

The solution is the following construct

$ exec 3< <(yes)  # Save stdout of the 'yes' job as (input) fd 3.

This opens the file as input fd 3 before the background job is started.

You can now read from the background job whenever you prefer. For a stupid example

$ for i in 1 2 3; do read -r line <&3; echo "$line"; done
y
y
y

Note that this has slightly different semantics than having the background job write to a drive-backed file: the background job will block when the pipe buffer is full (you empty the buffer by reading from the fd). By contrast, writing to a drive-backed file only blocks when the disk doesn't respond.

Process substitution is not a POSIX sh feature.

Here's a quick hack to give an asynchronous job drive backing (almost) without assigning a filename to it:

$ yes > backingfile &  # Start job in background writing to a new file. Do also look at `mktemp(3)` and the `sh` option `set -o noclobber`
$ exec 3< backingfile # open the file for reading in the current shell, as fd 3
$ rm backingfile # remove the file. It will disappear from the filesystem, but there is still a reader and a writer attached to it which both can use it.

$ for i in 1 2 3; do read -r line <&3; echo "$line"; done
y
y
y

Linux also recently gained the O_TMPFILE open(2) flag, which makes such hacks possible without the file ever being visible at all. I don't know whether bash already supports it.

UPDATE:

@rthur, if you want to capture the whole output from fd 3, then use

output=$(cat <&3)

But note that you can't capture binary data in general: It's only a defined operation if the output is text in the POSIX sense. The implementations I know simply filter out all NUL bytes. Furthermore POSIX specifies that all trailing newlines must be removed.

(Please note also that capturing the output will run out of memory if the writer never stops, and yes never stops. Naturally, the same problem affects read if the line separator is never written.)

Redirect all output to file in Bash

That part is written to stderr; use 2> to redirect it. For example:

foo > stdout.txt 2> stderr.txt

or if you want both in the same file:

foo > allout.txt 2>&1

Note: this works in (ba)sh; check your shell for the proper syntax.
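Two related forms worth knowing, sketched below with a hypothetical foo that writes to both streams: bash's &> shorthand, and appending instead of truncating:

```shell
# A stand-in command that writes to both streams:
foo() { echo "to stdout"; echo "to stderr" >&2; }

# Both streams to separate files:
foo > stdout.txt 2> stderr.txt

# Bash-only shorthand for '> file 2>&1' (not POSIX sh):
foo &> allout.txt

# Append both streams instead of truncating the file:
foo >> allout.txt 2>&1
```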

How to return a string value from a Bash function

There is no better way that I know of: Bash knows only status codes (integers) and strings written to stdout.
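In practice, two idioms cover most needs (a sketch; the function names are made up): write the string to stdout and let the caller capture it with $(), or let the caller pass a destination variable name and assign it with printf -v, which avoids a subshell and keeps trailing newlines intact:

```shell
# Idiom 1: print the result, let the caller capture it
get_greeting() { printf '%s' "Hello, $1"; }
msg=$(get_greeting world)
echo "$msg"    # Hello, world

# Idiom 2: assign directly into a caller-named variable
get_greeting_into() { printf -v "$1" '%s' "Hello, $2"; }
get_greeting_into msg2 world
echo "$msg2"   # Hello, world
```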


