Read Values into a Shell Variable from a Pipe

Read values into a shell variable from a pipe

Use

IFS= read var << EOF
$(foo)
EOF
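
For example, a minimal sketch with a real command (date) standing in for the placeholder foo, and -r added so backslashes are kept literal:

IFS= read -r today << EOF
$(date)
EOF
echo "$today"      # the whole line produced by date, leading spaces preserved thanks to IFS=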

You can trick read into accepting input from a pipe like this:

echo "hello world" | { read test; echo test=$test; }

or even write a function like this:

read_from_pipe() { read "$@" <&0; }

But there's no point: your variable assignments may not last! In most shells each pipeline segment runs in a subshell, which inherits the environment by value, not by reference, so whatever read assigns there is lost when the subshell exits. This is why reading from a pipe into a variable is unreliable: whether the last pipeline segment runs in the current shell is left to the shell implementation.
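
A quick way to see the problem in bash with default settings (zsh behaves differently, because it runs the last pipeline segment in the current shell):

echo "hello world" | read test
echo "test=$test"      # prints "test=" - read ran in a subshell, so the assignment is gone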

FYI, http://www.etalabs.net/sh_tricks.html is a nifty collection of the cruft necessary to fight the oddities and incompatibilities of Bourne shells (sh).

Why does piping input to read only work when fed into a while read ... construct?

How to do a loop against stdin and get result stored in a variable

Under bash (and other shells too), when you pipe something to another command via |, you implicitly create a fork: a subshell that is a child of the current session. The subshell cannot affect the current session's environment.

So this:

TOTAL=0
printf "%s %s\n" 9 4 3 1 77 2 25 12 226 664 |
    while read A B; do
        ((TOTAL+=A-B))
        printf "%3d - %3d = %4d -> TOTAL= %4d\n" $A $B $((A-B)) $TOTAL
    done
echo final total: $TOTAL

won't give the expected result:

  9 -   4 =    5 -> TOTAL=    5
  3 -   1 =    2 -> TOTAL=    7
 77 -   2 =   75 -> TOTAL=   82
 25 -  12 =   13 -> TOTAL=   95
226 - 664 = -438 -> TOTAL= -343
echo final total: $TOTAL
final total: 0

The computed TOTAL cannot be reused in the main script.

Inverting the fork

By using bash Process Substitution, Here Documents or Here Strings, you can invert the fork:

Here strings

read A B <<<"first second"
echo $A
first

echo $B
second
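
Here strings also combine nicely with command substitution when the values come from another command; a small sketch (uname -sn is just an illustration, it prints the kernel name followed by the hostname, so the output will vary by machine):

read OS HOST <<< "$(uname -sn)"
echo $OS       # e.g. Linux
echo $HOST     # e.g. myhost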

Here Documents

while read A B; do
    echo $A-$B
    C=$A-$B
done << eodoc
first second
third fourth
eodoc
first-second
third-fourth

outside of the loop:

echo : $C
: third-fourth

Here commands (process substitution)

TOTAL=0
while read A B; do
    ((TOTAL+=A-B))
    printf "%3d - %3d = %4d -> TOTAL= %4d\n" $A $B $((A-B)) $TOTAL
done < <(
    printf "%s %s\n" 9 4 3 1 77 2 25 12 226 664
)
  9 -   4 =    5 -> TOTAL=    5
  3 -   1 =    2 -> TOTAL=    7
 77 -   2 =   75 -> TOTAL=   82
 25 -  12 =   13 -> TOTAL=   95
226 - 664 = -438 -> TOTAL= -343

# and finally out of loop:
echo $TOTAL
-343

Now you can use $TOTAL in your main script.

Piping to a command list

If you only need to work against stdin, you can put the whole block of work inside the fork:

printf "%s %s\n" 9 4 3 1 77 2 25 12 226 664 | {
TOTAL=0
while read A B;do
((TOTAL+=A-B))
printf "%3d - %3d = %4d -> TOTAL= %4d\n" $A $B $[A-B] $TOTAL
done
echo "Out of the loop total:" $TOTAL
}

This will give:

  9 -   4 =    5 -> TOTAL=    5
  3 -   1 =    2 -> TOTAL=    7
 77 -   2 =   75 -> TOTAL=   82
 25 -  12 =   13 -> TOTAL=   95
226 - 664 = -438 -> TOTAL= -343
Out of the loop total: -343

Note: $TOTAL cannot be used in the main script (after the last closing curly bracket }).
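
If you do need the result back in the main script while keeping this layout, one workaround (a sketch, not part of the original answer) is to have the command group print the total and capture the whole pipeline with command substitution:

TOTAL=$(
    printf "%s %s\n" 9 4 3 1 77 2 25 12 226 664 | {
        TOTAL=0
        while read A B; do
            ((TOTAL+=A-B))
        done
        echo "$TOTAL"      # the only thing written to stdout, so it becomes the captured value
    }
)
echo $TOTAL
-343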

Using the lastpipe bash option

As @CharlesDuffy correctly pointed out, there is a bash option that changes this behaviour. But for it to work in an interactive shell, job control has to be disabled first:

shopt -s lastpipe            # set the lastpipe option
set +m                       # disable job control
TOTAL=0
printf "%s %s\n" 9 4 3 1 77 2 25 12 226 664 |
    while read A B; do
        ((TOTAL+=A-B))
        printf "%3d - %3d = %4d -> TOTAL= %4d\n" $A $B $((A-B)) $TOTAL
    done

  9 -   4 =    5 -> TOTAL=    5
  3 -   1 =    2 -> TOTAL=    7
 77 -   2 =   75 -> TOTAL=   82
 25 -  12 =   13 -> TOTAL=   95
226 - 664 = -438 -> TOTAL= -343

echo final total: $TOTAL
final total: -343

This works, but I (personally) don't like it: it is not standard and it doesn't make the script more readable. Also, disabling job control seems an expensive price to pay for this behaviour.

Note: Job control is enabled by default only in interactive sessions. So set +m is not required in normal scripts.

So a forgotten set +m would produce different behaviour depending on whether the code is run in an interactive console or in a script, which does not make it easy to understand or debug...
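
In a plain (non-interactive) script, lastpipe alone is therefore enough; a minimal sketch:

#!/usr/bin/env bash
shopt -s lastpipe            # job control is already off in scripts, so no set +m needed
TOTAL=0
printf "%s %s\n" 9 4 3 1 77 2 25 12 226 664 |
    while read A B; do
        ((TOTAL+=A-B))
    done
echo "final total: $TOTAL"   # prints: final total: -343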

Bash read command with cat and pipe

read reads from standard input by default. When you use the pipe, standard input is the pipe, not the terminal.

If you want to always read from the terminal, redirect the read input to /dev/tty.

#!/usr/bin/env bash
set -x
while true; do
    read -p "Hello, what's your name? " name </dev/tty
    echo $name
done

But you could solve the problem by giving the script as an argument to bash instead of piping it in.

bash ./install.sh

When using curl to get the script, you can use process substitution:

bash <(curl -sL https://www.conteso.com/install.sh)
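
Another option along the same lines (a sketch, reusing the same example URL and assuming -L is wanted to follow redirects) is to hand the downloaded text to bash -c, which also leaves the script's stdin connected to your terminal:

bash -c "$(curl -sL https://www.conteso.com/install.sh)"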

bash: Assign variable from pipe?

To complement Charles Duffy's helpful answer with a focus on making it work in bash:

By default (and invariably in Bash 4.1 and earlier), any variable creations/modifications in a (multi-segment) pipeline happen in a subshell, so the result will not be visible to the calling shell.

In Bash v4.2+, you can set option lastpipe to make the last pipeline segment run in the current shell, so that variable creations/modifications made there are visible to it.

For that to work in an interactive shell, you must additionally turn off job control with set +m.

Here's a complete example (Bash v4.2+):

$ unset x; shopt -s lastpipe; set +m; seq 3 | x=$(cat); echo "$x"
1
2
3

That said,

x=$(seq 3)

(the modern equivalent of the legacy x=`seq 3`) is much simpler: it is POSIX-compliant and therefore works in older Bash versions too, and it requires no fiddling with global options.
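
And if you want the lines as an array rather than one string, bash's mapfile builtin (bash 4+) combined with process substitution also avoids the subshell problem; a small sketch:

mapfile -t lines < <(seq 3)
echo "${lines[1]}"     # 2 (arrays are zero-indexed)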

How to read hidden input from terminal and pipe it to another command

Is this what you wanted to achieve?

$ read -s       # I type `secret`
$ echo $REPLY
secret
$ printf %s $REPLY | wc -c
6
$ unset REPLY
$ echo $REPLY
# empty now

Or do you want a one-liner like this:

{ read -s -p "Input a secret: "; printf %s $REPLY; } | wc -c

If you define an alias:

alias readp='{ read -s -p "Input a secret: "; printf %s $REPLY; }'

then you can do readp | wc -c
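
Since aliases are not expanded in non-interactive shells by default, a shell function is the safer equivalent inside scripts (sketch):

readp() { read -s -p "Input a secret: "; printf %s "$REPLY"; }
readp | wc -c

read -p writes the prompt to stderr, so only the secret itself reaches wc -c.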

Bash: how to write/read to/from a named pipe without aborting after the first send

It's because echo ... > fifo opens and then closes the fifo. As a workaround you can do it like this:

# open for writing
exec 20> fifo
echo foo >&20
echo bar >&20
...
# to close it
exec 20>&-

A bit of explanation:

  • exec 20> fifo opens fifo for writing with FD (file descriptor) 20.
  • command >&20 redirects the output to FD 20.
  • exec 20>&- closes FD 20.

The following are excerpts from man bash:

  • exec [-cl] [-a name] [command [arguments]]

    [...] If command is not specified, any redirections take effect in the current shell, and the return status is 0. If there is a redirection error, the return status is 1.

  • [n]>word

    Redirection of output causes the file whose name results from the expansion of word to be opened for writing on file descriptor n, or the standard output (file descriptor 1) if n is not specified. If the file does not exist it is created; if it does exist it is truncated to zero size. [...]

  • [n]>&word

    [...] If word evaluates to -, file descriptor n is closed. [...]
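
Putting the pieces together, a minimal end-to-end sketch (the fifo name and messages are just examples):

mkfifo fifo

# reader: runs until the writer closes its end of the fifo
while IFS= read -r line; do
    echo "got: $line"
done < fifo &

# writer: keep the fifo open on FD 20 across several echoes
exec 20> fifo
echo foo >&20
echo bar >&20
exec 20>&-     # closing FD 20 gives the reader EOF and ends its loop

wait           # let the background reader finish
rm fifo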

Bash - pipe to variable and file

The read has to execute in the current shell; you need to invert your pipeline.

read S < <(echo anything | tee >(gzip - > S.gz))

or, in bash 4.2 or later, use the lastpipe option. (Note that job control must be inactive for lastpipe to take effect. It is off by default in non-interactive shells, and can be turned off in interactive shells with set +m.)

shopt -s lastpipe
echo anything | tee >(gzip - > S.gz) | read S

Pipe input into a script

Commands inherit their standard input from the process that starts them. In your case, your script provides its standard input for each command that it runs. A simple example script:

#!/bin/bash
cat > foo.txt

Piping data into your shell script causes cat to read that data, since cat inherits its standard input from your script.

$ echo "Hello world" | myscript.sh
$ cat foo.txt
Hello world

The shell provides the read command for reading text from standard input into a shell variable, for the case where you don't have another command to read or process your script's standard input.

#!/bin/bash

read foo
echo "You entered '$foo'"

$ echo bob | myscript.sh
You entered 'bob'
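
If the script should handle any number of piped lines rather than just one, the usual pattern is a while read loop (a sketch, extending the script above):

#!/bin/bash
while IFS= read -r line; do
    echo "You entered '$line'"
done

$ printf 'bob\nalice\n' | myscript.sh
You entered 'bob'
You entered 'alice'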

