Pipe output to bash function
To answer your actual question: when a shell function is on the receiving end of a pipe, standard input is inherited by all commands in the function, but only commands that actually read from their standard input consume any data. For commands that run one after the other, later commands can only see what wasn't consumed by previous commands. When two commands run in parallel, which commands see which data depends on how the OS schedules them.
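This sharing of standard input is easy to see with a small demo (the function name is illustrative):

```shell
# Both reads share the function's stdin; each consumes one line,
# so the second read only sees what the first left behind.
consume_two() {
    read -r first    # consumes line 1
    read -r second   # consumes line 2
    echo "first=$first second=$second"
}

printf 'alpha\nbeta\n' | consume_two   # prints: first=alpha second=beta
```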
Since printf is the first and only command in your function, standard input is effectively ignored. There are several ways around that, including using the read built-in to read standard input into a variable which can then be passed to printf:
jc_hms () {
    read -r foo
    hr=$(($foo / 3600))
    min=$((($foo % 3600) / 60))
    sec=$(($foo % 60))
    printf "%d:%02d:%02d" "$hr" "$min" "$sec"
}
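For example, piping a number of seconds into the read-based version (the function is repeated here, with the minutes taken modulo the hour, so the snippet runs on its own):

```shell
jc_hms () {
    read -r foo
    hr=$(($foo / 3600))
    min=$((($foo % 3600) / 60))
    sec=$(($foo % 60))
    printf "%d:%02d:%02d" "$hr" "$min" "$sec"
}

echo 3754 | jc_hms   # 3754 s = 1 h, 2 min, 34 s, so this prints 1:02:34
```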
However, since your need for a pipeline seems to stem from your perceived need to use awk, let me suggest the following alternative:
printstring=$( jc_hms $songtime )
Since songtime consists of a space-separated pair of numbers, the shell performs word-splitting on the unquoted expansion of songtime, and jc_hms sees two separate parameters. This requires no change to the definition of jc_hms, and no need to pipe anything into it via standard input.
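The word-splitting behaviour is easy to see with a small illustrative helper (the value of songtime here is made up):

```shell
# Illustrative helper: reports how many parameters it received.
count_args() { echo "$# argument(s): $1|$2"; }

songtime='225 30'       # a space-separated pair, as in the question

count_args $songtime    # unquoted: split into two parameters -> 2 argument(s): 225|30
count_args "$songtime"  # quoted: passed through as one parameter -> 1 argument(s): 225 30|
```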
If you still have a different reason for jc_hms to read standard input, please let us know.
Bash | pipe to bash function
If bash is responsible for the core dump, that certainly indicates a bug in bash that should be reported. However, your function can be written more simply as
example () {
    local S
    if (( "$#" == 0 )); then
        IFS= read -r S
        set -- "$S"
    fi
    echo "${1// /_}"
}
which may, at least, avoid the bug.
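With that definition (repeated here so the snippet runs standalone), the function behaves the same whether its text arrives as an argument or through a pipe:

```shell
example () {
    local S
    if (( "$#" == 0 )); then
        IFS= read -r S   # no arguments: take one line from stdin instead
        set -- "$S"
    fi
    echo "${1// /_}"     # replace every space with an underscore
}

example 'a b c'          # argument form: prints a_b_c
echo 'a b c' | example   # pipe form: also prints a_b_c
```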
How can I pipe output, from a command in an if statement, to a function?
The Print function doesn't read standard input, so there's no point piping data to it. One possible way to do what you want with the current implementation of Print is:
if ! occ_output=$(sudo -u "$web_user" "$nextcloud_dir/occ" files:scan --all 2>&1); then
Print "Error: Failed to scan files. Are you in maintenance mode?"
fi
Print "'occ' output: $occ_output"
Since there is only one line in the body of the if statement, you could use || instead:
occ_output=$(sudo -u "$web_user" "$nextcloud_dir/occ" files:scan --all 2>&1) \
|| Print "Error: Failed to scan files. Are you in maintenance mode?"
Print "'occ' output: $occ_output"
The 2>&1 causes both the standard output and the standard error of occ to be captured in occ_output.
Note that the body of the Print function could be simplified to:
[[ $quiet_mode == No ]] && printf '%s\n' "$1"
(( logging )) && printf '%s\n' "$1" >> "$log_file"
See the accepted, and excellent, answer to Why is printf better than echo? for an explanation of why I replaced echo "$1" with printf '%s\n' "$1".
Passing a 'command with a pipe' as a bash function parameter
It's difficult to do what you want in a safe and robust way. Greg's Bash Wiki says it best:
Variables hold data. Functions hold code. Don't put code inside variables!
I wouldn't have it execute the command directly. Instead, have the function return success/failure. Then you can do whatever you like at the calling end with && or if.
is_user_input_enabled() {
    [[ $current_user_input = yes ]]
}
is_user_input_enabled && brew update
is_user_input_enabled && curl -L https://get.rvm.io | bash -s stable
Notice how you don't need any extra quotes or escaping to make this work.
See also:
- I'm trying to put a command in a variable, but the complex cases always fail!
- How can we run a command stored in a variable?
- Why should eval be avoided in Bash, and what should I use instead?
Can I make a shell function in a pipeline conditionally disappear, without using cat?
Yes, you can do this -- by making your function a wrapper that conditionally injects a pipeline element, instead of being an unconditional pipeline element itself. For example:
maybe_checked() {
    if [[ $CHECK_OUTPUT != "--check" ]]; then
        "$@" # just run our arguments as a command, as if we weren't here
    else
        # run our arguments in a process substitution, reading from stdout of same.
        # ...some changes from the original code:
        #   IFS= stops leading or trailing whitespace from being stripped
        #   read -r prevents backslashes from being processed
        local line # avoid modifying $line outside our function
        while IFS= read -r line; do
            [[ -e "/$line" ]] || { echo "Error: /$line does not exist" >&2; return 1; }
            printf '%s\n' "$line" # see https://unix.stackexchange.com/questions/65803
        done < <("$@")
    fi
}
ls /usr | maybe_checked grep '^b'
Caveat of the above code: if the pipefail option is set, you'll want to check the exit status of the process substitution to have complete parity with the behavior that would otherwise be the case. In bash version 4.4 or later, $! is set by a process substitution to the relevant PID, which can be waited for to retrieve its exit status.
That said, this is also a use case wherein using cat is acceptable, and I'm saying this as a card-carrying member of the UUOC crowd. :)
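For comparison, the cat-based version that remark refers to looks like this: the function is an unconditional pipeline element, and cat is the "do nothing" branch (the sort flag variable follows the convention used below):

```shell
# Pipeline-element style: the function itself reads stdin and writes stdout,
# and `cat` is the pass-through branch.
maybe_sort() {
    if (( sort )); then
        sort
    else
        cat   # pass the input through untouched
    fi
}

sort=1; printf 'b\na\n' | maybe_sort   # prints a, then b
sort=0; printf 'b\na\n' | maybe_sort   # prints b, then a (passed through)
```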
Adopting the examples from John Kugelman's answers on the linked question:
maybe_sort() {
    if (( sort )); then
        "$@" | sort
    else
        "$@"
    fi
}

maybe_limit() {
    if [[ -n $limit ]]; then
        "$@" | head -n "$limit"
    else
        "$@"
    fi
}
printf '%s\n' "${haikus[@]}" | maybe_limit maybe_sort sed -e 's/^[ \t]*//'
Piping output of bash function
The reason this isn't working is that in

echo $files;

the variable $files is subject to shell expansion (i.e., it is expanded into individual arguments to echo), and the resulting tokens are printed by echo delimited by single spaces. This means the output is a single line, which grep handles accordingly.
The least invasive fix is to use
echo "$files";
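The effect is easy to demonstrate with a variable holding two lines (the filenames are made up):

```shell
files=$'one.txt\ntwo.txt'   # a two-line value

echo $files | wc -l         # unquoted: words joined with spaces, 1 line
echo "$files" | wc -l       # quoted: the newline survives, 2 lines
```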
Bash redirect stdout to function
You are writing to a file named "log_stream". The shell is not looking for a program/function name there. Try this:
bgcommand --a --b 2>&1 | log_stream &
#......................^
Or, redirect to a process substitution (ordering the redirections so that both standard output and standard error reach log_stream):
bgcommand --a --b > >(log_stream) 2>&1 &
You might want to look into a utility named ts whose purpose is to add timestamps to its input.
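For illustration, a minimal log_stream along those lines might look like this (the name comes from the question; the original function isn't shown, so the timestamp format is an assumption):

```shell
# Hypothetical log_stream: prefixes each input line with a timestamp,
# similar in spirit to the `ts` utility.
log_stream() {
    while IFS= read -r line; do
        printf '%s %s\n' "$(date '+%F %T')" "$line"
    done
}

echo 'service started' | log_stream   # e.g. 2024-05-01 12:00:00 service started
```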
How to make a bash function which can read from standard input?
You can use <<< (a here-string) to get this behaviour: for example, read var <<< "some text" feeds the string to read on its standard input.
Test with readly (I prefer not to shadow the read built-in):

function readly()
{
    read -r input
    echo "$input"
    echo "this was a test"
}

$ readly <<< "hello"
hello
this was a test
With pipes, based on this answer to "Bash script, read values from stdin pipe":
$ echo "hello bye" | { read a; echo $a; echo "this was a test"; }
hello bye
this was a test
Using a pipe inside a bash function
You're missing a semi-colon in the first attempt.
mdf() { mdfind -name "$1" | grep -Ev 'Library|VMWare|symf|larav' | sort; }
Just a quirk of shell syntax that you need it there. If you put the command on its own line then you don't need one.
mdf() {
    mdfind -name "$1" | grep -Ev 'Library|VMWare|symf|larav' | sort
}
(I've removed the function keyword. For compatibility's sake you should write either func() or function func, but not combine them.)
Give shellcheck.net a try the next time you're stuck. It's a syntax checker for shell scripts. A real godsend.