Print Stdout/Stderr and Write Them to a File in Bash

How to redirect and append both standard output and standard error to a file with Bash

cmd >>file.txt 2>&1

Bash executes the redirects from left to right as follows:

  1. >>file.txt: Open file.txt in append mode and redirect stdout there.
  2. 2>&1: Redirect stderr to "where stdout is currently going". In this case, that is a file opened in append mode. In other words, the &1 reuses the file descriptor that stdout currently uses. Note that the order of the two redirections matters, as the example below shows.
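If 2>&1 comes first, stderr is duplicated onto the terminal (where stdout points at that moment) before stdout is sent to the file. A minimal sketch, assuming cmd writes to both streams:

cmd >>file.txt 2>&1   # stdout and stderr both end up in file.txt
cmd 2>&1 >>file.txt   # only stdout goes to file.txt; stderr stays on the terminal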

Print STDOUT/STDERR and write them to a file in Bash?

This will print both STDOUT and STDERR to the terminal while writing them to the same file:

some_command 2>&1 | tee file.log

Example

$ touch foo; ls foo asfdsafsadf 2>&1 | tee file.log
ls: asfdsafsadf: No such file or directory
foo
$ cat file.log
ls: asfdsafsadf: No such file or directory
foo
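One caveat: with this pipe, the exit status you get back is tee's, not some_command's. If the original status matters, Bash's PIPESTATUS array (or set -o pipefail) can recover it; a minimal sketch:

some_command 2>&1 | tee file.log
status=${PIPESTATUS[0]}   # exit status of some_command, not of tee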

How to redirect output to a file and stdout

The command you want is named tee:

foo | tee output.file

For example, if you only care about stdout:

ls -a | tee output.file

If you want to include stderr, do:

program [arguments...] 2>&1 | tee outfile

2>&1 redirects channel 2 (stderr/standard error) into channel 1 (stdout/standard output), so that both are written to stdout. tee then duplicates that combined stream to the given output file and to the screen.

Furthermore, if you want to append to the log file instead of overwriting it, use tee -a:

program [arguments...] 2>&1 | tee -a outfile
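As a side note, in Bash 4 and later, |& is shorthand for 2>&1 |, so the following should behave the same way:

program [arguments...] |& tee -a outfile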

How to print shell script stdout/stderr to files and console

I think the easiest approach is to simply pass multiple files as arguments to tee, like this:

% python3 -c 'import sys; print("to stdout"); print("to stderr", file=sys.stderr)' 2>&1 | tee -a /tmp/file.txt /tmp/file_sec.txt
to stdout
to stderr
% cat /tmp/file.txt
to stdout
to stderr
% cat /tmp/file_sec.txt
to stdout
to stderr

Your script would then look like this:

#!/bin/bash

file=/tmp/file.txt
sec_file=/tmp/sec_file.txt

exec > >(tee -a "$file" "$sec_file") 2>&1

echo "hello world , we are very happy to stay here "

Output stdout and stderr to file and screen and stderr to file in a limited environment

Finally reaching the goal. I want to say that I was inspired by @WilliamPursell's answer.

{ "$0" "${mainArgs[@]}" 2>&1 1>&3 | tee -a "$logPath/$logFileName.err" 1>&3 & } 3>&1 | tee -a "$logPath/$logFileName.log" &

Explanation

  • Relaunch the script with...
  • stderr redirected to stdout (2>&1) and...
  • stdout redirected to a new file descriptor (1>&3).
  • Pipe it into tee, which now receives only stderr; it duplicates the errors into the .err file and onto file descriptor 1, which is in turn...
  • redirected to the new file descriptor (1>&3)...
  • with & so the pipeline does not block.
  • Then group the previous commands using curly brackets.
  • Redirect the group's file descriptor 3 back to stdout (3>&1).
  • Pipe it into a second tee, which receives the combined errors and normal output, writes it to the .log file, and displays it on screen.
  • Again with & so nothing blocks.

Here is the full code of my activateLogs function, for those interested. I also included its dependencies, even though they could be inlined into the activateLogs function.

m=0
declare -a mainArgs
if [ ! "$#" = "0" ]; then
for arg in "$@"; do
mainArgs[$m]=$arg
m=$(($m + 1))
done
fi

function containsElement()
# $1 string to find
# $2 array to search in
# return 0 if there is a match, otherwise 1
{
    local e match="$1"
    shift
    for e; do [[ "$e" == "$match" ]] && return 0; done
    return 1
}

function hasMainArg()
# $1 string to find
# return 0 if there is a match, otherwise 1
{
    local match="$1"
    containsElement "$1" "${mainArgs[@]}"
    return $?
}

function activateLogs()
# $1 = logOutput: What is the output for logs: SCREEN, DISK, BOTH. Default is DISK. Optional parameter.
{
    local logOutput=$1
    if [ "$logOutput" != "SCREEN" ] && [ "$logOutput" != "BOTH" ]; then
        logOutput="DISK"
    fi

    if [ "$logOutput" = "SCREEN" ]; then
        echo "Logs will only be output to screen"
        return
    fi

    hasMainArg "--force-log"
    local forceLog=$?

    local isFileDescriptor3Exist=$(command 2>/dev/null >&3 && echo "Y")

    if [ "$isFileDescriptor3Exist" = "Y" ]; then
        echo "Logs are configured"
    elif [ "$forceLog" = "1" ] && ([ ! -t 1 ] || [ ! -t 2 ]); then
        # Use externally configured file descriptors when output is already redirected, unless "--force-log" was given
        echo "Logs are configured externally"
    else
        echo "Relaunching with log files"
        local logPath="logs"
        if [ ! -d $logPath ]; then mkdir $logPath; fi

        local logFileName=$(basename "$0")"."$(date +%Y-%m-%d.%k-%M-%S)

        exec 4<> "$logPath/$logFileName.log" # File descriptor created only to get the underlying file in any output option
        if [ "$logOutput" = "DISK" ]; then
            # FROM: https://stackoverflow.com/a/45426547/214898
            exec 3<> "$logPath/$logFileName.log"
            "$0" "${mainArgs[@]}" 2>&1 1>&3 | tee -a "$logPath/$logFileName.err" 1>&3 &
        else
            # FROM: https://stackoverflow.com/a/70790574/214898
            { "$0" "${mainArgs[@]}" 2>&1 1>&3 | tee -a "$logPath/$logFileName.err" 1>&3 & } 3>&1 | tee -a "$logPath/$logFileName.log" &
        fi

        exit
    fi
}

#activateLogs "DISK"
#activateLogs "SCREEN"
activateLogs "BOTH"

echo "FIRST"
echo "ERROR" >&2
echo "LAST"
echo "LAST2"

Writing screen output (stderr) and output (stdout) of a Linux command to separate files

Sounds like the additional output comes from stderr, which you can capture with 2>:

mycommand > outfile 2> stderr
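A quick check that the two streams really do end up in separate files (the exact wording of the ls error message varies by platform):

$ touch foo; ls foo asfdsafsadf > outfile 2> stderr
$ cat outfile
foo
$ cat stderr
ls: asfdsafsadf: No such file or directory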

Redirect to a file STDOUT first and then STDERR

There is no good way* to tell awk or the shell that it must buffer stderr until the tool finishes executing. Keep it simple and just do this:

awk -f script.awk file > out 2>tmp; cat tmp >> out && rm -f tmp

Otherwise you could buffer the stderr output yourself and print it at the end (but this will only work for stderr messages you print manually, not for messages gawk generates itself):

{
    for (i=0; i<5; i++) {
        print $0
        errs = errs $0 OFS i ORS
    }
}
END {
    printf "%s", errs > "/dev/stderr"
}

and then call it as:

awk -f script.awk file > out 2>&1

Of course, you don't actually need to use stderr at all if that's all you're doing with it; just print to stdout.

*There may be some arcane incantation you can use to make this happen if the planets align a certain way and/or you have certain tools or a certain shell but just keep it simple as shown above.

Redirect stderr and stdout in Bash

It should be:

yourcommand &> filename

It redirects both standard output and standard error to the file filename.
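If you want to append instead of overwrite, Bash 4 and later also accept &>> as the appending form:

yourcommand &>> filename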


