Pipe Bash command output to stdout and to a variable
Copying To A TTY (Not Stdout!)
Pipeline components run in subshells, so even if they did assign shell variables (and the syntax for that was wrong), those variables would be unset as soon as the pipeline exits, since the subshells only live as long as the pipeline does.
Thus, you need to capture the output of the entire pipeline into your variable:
var=$(find "$filename" -perm "$i" | tee /dev/tty | wc -l)
Personally, by the way, I'd be `tee`ing to `/dev/stderr` or `/dev/fd/2` to avoid making behavior dependent on whether a TTY is available.
Actually Piping To Stdout
With bash 4.1, automatic file descriptor allocation lets you do the following:
exec {stdout_copy}>&1 # make the FD named in "$stdout_copy" a copy of FD 1
# tee over to "/dev/fd/$stdout_copy"
var=$(find "$filename" -perm "$i" | tee /dev/fd/"$stdout_copy" | wc -l)
exec {stdout_copy}>&- # close that copy previously created
echo "Captured value of var: $var"
With an older version of bash, you'd need to allocate a FD yourself; in the example below, I'm choosing file descriptor number 3 (0, 1 and 2 are reserved for stdin, stdout and stderr, respectively):
exec 3>&1 # make copy of stdout
# tee to that copy with FD 1 going to wc in the pipe
var=$(find "$filename" -perm "$i" | tee /dev/fd/3 | wc -l)
exec 3>&- # close copy of stdout
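The same pattern can be tried end to end with a stand-in command; here `seq 3` plays the role of the `find` call, and the FD number 3 is the same arbitrary choice as above:

```shell
#!/usr/bin/env bash
exec 3>&1                               # make FD 3 a copy of the current stdout
var=$(seq 3 | tee /dev/fd/3 | wc -l)    # lines reach the screen via FD 3; wc counts them
exec 3>&-                               # close the copy
echo "line count: $var"
```

The numbers 1 to 3 appear on the terminal while only the count lands in `var`.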
pipe output to stdout and then to command then to variable
Your code has a lot of dependencies. I will illustrate what I think that you need without using anything beyond standard unix tools.
This runs a command, `seq 4`, and sends all of its output to stdout while also sending all of its output to another command, `sed 's/3/3-processed/'`, the output of which is captured in a variable, `var`:
$ exec 3>&1
$ var=$(seq 4 | tee >(cat >&3) | sed 's/3/3-processed/')
1
2
3
4
To illustrate that we successfully captured the output of the `sed` command:
$ echo "$var"
1
2
3-processed
4
Explanation: `var=$(...)` captures the output of file handle 1 (stdout) and assigns it to `var`. To make the output also appear on stdout, we need to duplicate stdout to another file handle before `$(...)` redirects it, so we use `exec` to duplicate stdout as file handle 3. In this way, `tee >(cat >&3)` sends the output of the command both to the original stdout (now reachable as file handle 3) and to file handle 1, which is passed on to the next stage in the pipeline.
So, using your toolchain, try:
exec 5>&1
dsym=$(xcodebuild -scheme "<myscheme>" archive | tee >(cat >&5) | php -r "$code")
How can I pipe output, from a command in an if statement, to a function?
The `Print` function doesn't read standard input, so there's no point piping data to it. One possible way to do what you want with the current implementation of `Print` is:
if ! occ_output=$(sudo -u "$web_user" "$nextcloud_dir/occ" files:scan --all 2>&1); then
Print "Error: Failed to scan files. Are you in maintenance mode?"
fi
Print "'occ' output: $occ_output"
Since there is only one line in the body of the `if` statement, you could use `||` instead:
occ_output=$(sudo -u "$web_user" "$nextcloud_dir/occ" files:scan --all 2>&1) \
|| Print "Error: Failed to scan files. Are you in maintenance mode?"
Print "'occ' output: $occ_output"
The `2>&1` causes both the standard output and the error output of `occ` to be captured in `occ_output`.
Note that the body of the `Print` function could be simplified to:
[[ $quiet_mode == No ]] && printf '%s\n' "$1"
(( logging )) && printf '%s\n' "$1" >> "$log_file"
See the accepted, and excellent, answer to "Why is printf better than echo?" for an explanation of why I replaced `echo "$1"` with `printf '%s\n' "$1"`.
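For context, a minimal `Print` function along these lines might look as follows; the configuration variables `quiet_mode`, `logging`, and `log_file` are taken from the snippet above, and their values here are made up for the demonstration:

```shell
#!/usr/bin/env bash
# Assumed configuration (normally set elsewhere in the script):
quiet_mode=No
logging=1
log_file=$(mktemp)

Print() {
    [[ $quiet_mode == No ]] && printf '%s\n' "$1"
    (( logging )) && printf '%s\n' "$1" >> "$log_file"
}

Print "hello"    # goes to stdout and is appended to the log file
```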
How to output bash command to stdout and pipe to another command at the same time?
Use `tee`:
ps -up `nvidia-smi |tee /dev/stderr |tail -n +16 | head -n -1 | sed 's/\s\s*/ /g' | cut -d' ' -f3`
Since stdout itself feeds the pipe, a copy written to stdout would just go down the pipe too, so I picked stderr to show the output.
If `/dev/stderr` is not available, use `/proc/self/fd/2`.
Capture stdout to variable and get the exit statuses of foreground pipe
You can:

- Use a temporary file to pass PIPESTATUS:

  tmp=$(mktemp)
  out=$(pipeline; echo "${PIPESTATUS[@]}" > "$tmp")
  pipestatus=($(<"$tmp")) # PIPESTATUS itself is overwritten by every command, so use your own variable
  rm "$tmp"

- Use a temporary file to pass out:

  tmp=$(mktemp)
  pipeline > "$tmp"
  out=$(<"$tmp")
  rm "$tmp"

- Interleave the output with the pipe statuses: for example, reserve the part from the last newline character to the end for PIPESTATUS. To preserve the original return status, some temporary variables are needed:

  out=$(pipeline; tmp=("${PIPESTATUS[@]}") ret=$?; echo $'\n' "${tmp[@]}"; exit "$ret")
  pipestatus=(${out##*$'\n'})
  out="${out%$'\n'*}"
  out="${out%%$'\n'}" # remove trailing newlines, like command substitution does

  Tested with:

  out=$(false | true | false | echo 123; echo $'\n' "${PIPESTATUS[@]}")
  pipestatus=(${out##*$'\n'})
  out="${out%$'\n'*}"; out="${out%%$'\n'}"
  echo out="$out" PIPESTATUS="${pipestatus[@]}"
  # out=123 PIPESTATUS=1 0 1 0
Notes:
- By convention, upper-case variable names should be reserved for exported variables.
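As a runnable sketch of the first option, with a concrete pipeline (`false | echo 123` is made up purely for the demonstration) standing in for `pipeline`:

```shell
#!/usr/bin/env bash
tmp=$(mktemp)
# Save the statuses inside the command substitution, before they vanish with the subshell.
out=$(false | echo 123; echo "${PIPESTATUS[@]}" > "$tmp")
pipestatus=($(<"$tmp"))    # read them back in the parent shell
rm "$tmp"
echo "out=$out pipestatus=${pipestatus[*]}"
```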
How do I redirect output to a variable in shell?
Use the `$(...)` construct:
hash=$(genhash --use-ssl -s "$IP" -p 443 --url "$URL" | grep MD5 | grep -c "$MD5")
How to capture the output of a bash command into a variable when using pipes and apostrophe?
To capture the output of a command in shell, use command substitution: `$(...)`. Thus:
pid=$(ps -ef | grep -v color=auto | grep raspivid | awk '{print $2}')
Notes
When making an assignment in shell, there must be no spaces around the equal sign.
When defining shell variables for local use, it is best practice to use lower case or mixed case. Variables that are important to the system are defined in upper case and you don't want to accidentally overwrite one of them.
Simplification
If the goal is to get the PID of the `raspivid` process, then the `grep` and `awk` can be combined into a single process:
pid=$(ps -ef | awk '/[r]aspivid/{print $2}')
Note the simple trick that excludes the searching process itself from the output: instead of searching for `raspivid`, we search for `[r]aspivid`. The regular expression `[r]aspivid` matches the string `raspivid`, but it does not match the literal string `[r]aspivid` that appears in the `awk` process's own command line. Hence the current process is removed from the output.
The Flexibility of awk
For the purpose of showing how `awk` can replace multiple calls to `grep`, consider this scenario: suppose that we want to find lines that contain `raspivid` but do not contain `color=auto`. With `awk`, both conditions can be combined logically:
pid=$(ps -ef | awk '/raspivid/ && !/color=auto/{print $2}')
Here, `/raspivid/` requires a match with `raspivid`. The `&&` symbol means logical "and", and the `!` before the regex `/color=auto/` means logical "not". Thus, `/raspivid/ && !/color=auto/` matches only those lines that contain `raspivid` but not `color=auto`.
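To see the combined conditions in action without running `ps`, feed `awk` a few hand-made lines (the input text below is made up for illustration):

```shell
#!/usr/bin/env bash
# Fake ps-style lines: only the first contains raspivid without color=auto.
pid=$(printf '%s\n' \
        'pi 1234 raspivid -t 0' \
        'pi 5678 grep raspivid color=auto' \
        'pi 9999 bash' |
      awk '/raspivid/ && !/color=auto/{print $2}')
echo "$pid"
```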
How to assign a piped output as a variable while pipe is continuous
The problem is that you only run `read` (and then update the status) once, so it reads a single line (and updates the status once). You need a loop, so it'll repeat the read-and-update process over and over. You can use a `while` loop to do this. If it should exit when there's no more input to process, make `read` the `while` condition:
aria2c $url --summary-interval=5 2>&1 |
tee output.log |
grep -oP "(\d+(\.\d+)?(?=%))" |
while read -r text; do
curl -s "https://api.legram.org/bot${tg_token}/editMessageText" --data "message_id=${msg_id}&text=DOWNLOADED-${text}&chat_id=${ch_id}&parse_mode=HTML&disable_web_page_preview=True"
done
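The loop structure is easier to test with a stand-in producer: below, `printf` replaces `aria2c` and a plain `echo` replaces the `curl` API call (the tokens and URL from the original are assumed to exist in the real script):

```shell
#!/usr/bin/env bash
# A fake progress stream stands in for aria2c's output.
results=$(printf '%s\n' 'progress 12.5%' 'progress 99%' |
          grep -oP '(\d+(\.\d+)?(?=%))' |
          while read -r text; do
              echo "DOWNLOADED-${text}"    # stand-in for the curl call
          done)
printf '%s\n' "$results"
```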
Bash: Pipe stdout and catch stderr into a variable
You could do something like:
errors=$(mysqldump database 2>&1 > >(gzip > database.sql))
Here, I'm using process substitution to get `gzip` to use `mysqldump`'s output as stdin. Given the order of redirections (`2>&1` before `>`), `mysqldump`'s stderr should now be used for the command substitution.
Testing it out:
$ a=$(sh -c 'echo foo >&2; echo bar' 2>&1 > >(gzip > foo))
$ gunzip < foo
bar
$ echo $a
foo