Bash (Or Other Shell): Wrap All Commands with Function/Script

Run this bash script:

#!/bin/bash
while IFS= read -r -e line
do
    wrapper "$line"
done

In its simplest form, wrapper could consist of eval "$line". You mentioned wanting timings, so maybe instead use time eval "$line". You wanted to capture the exit status, so follow that with save=$?. And you wanted to capture the first few lines of stdout, so some redirection is in order. And so on.

MORE: Jo So suggests that handling for multi-line bash commands be included. In its simplest form, if eval returns with "syntax error: unexpected end of file", you want to prompt for another line of input before proceeding. Better yet, to check for a complete bash command, run bash -n <<<"$line" before you do the eval. If bash -n reports the end-of-file error, prompt for more input to append to $line. And so on.
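Putting those pieces together, a wrapper along these lines might look like the sketch below. The three-line stdout cap and the use of $SECONDS for timing are illustrative choices, not from the original answer:

```shell
#!/bin/bash
# Sketch of a wrapper: validates the line, times it, saves its exit
# status, and keeps only the first few lines of stdout.
wrapper() {
    local line=$1
    local start status output

    # Reject incomplete or invalid bash before eval'ing it
    if ! bash -n <<<"$line" 2>/dev/null; then
        echo "incomplete or invalid command" >&2
        return 2
    fi

    start=$SECONDS
    output=$(eval "$line")          # run the command, capturing stdout
    status=$?                       # exit status of the eval'd command
    echo "exit status: $status"
    echo "elapsed: $((SECONDS - start))s"
    head -n 3 <<<"$output"          # first few lines of stdout only
    return "$status"
}
```

For example, `wrapper 'printf "a\nb\nc\nd\n"'` would report exit status 0 and print only the first three lines.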

Propagate all arguments in a Bash shell script

Use "$@" instead of plain $@ if you actually wish your parameters to be passed along unchanged.

Observe:

$ cat no_quotes.sh
#!/bin/bash
./echo_args.sh $@

$ cat quotes.sh
#!/bin/bash
./echo_args.sh "$@"

$ cat echo_args.sh
#!/bin/bash
echo "Received: $1"
echo "Received: $2"
echo "Received: $3"
echo "Received: $4"

$ ./no_quotes.sh first second
Received: first
Received: second
Received:
Received:

$ ./no_quotes.sh "one quoted arg"
Received: one
Received: quoted
Received: arg
Received:

$ ./quotes.sh first second
Received: first
Received: second
Received:
Received:

$ ./quotes.sh "one quoted arg"
Received: one quoted arg
Received:
Received:
Received:

How to write bash function to print and run command when the command has arguments with spaces or things to be expanded

If you've got Bash version 4.4 or later, this function may do what you want:

function print_and_run_cmd
{
    local PS4='Running cmd: '
    local -
    set -o xtrace

    "$@"
}

For example, running

print_and_run_cmd echo 'Hello World!'

outputs

Running cmd: echo 'Hello World!'
Hello World!
  • local PS4='Running cmd: ' sets a prefix for commands printed by the shell when the xtrace option is on. The default is + . Localizing it means that the previous value of PS4 is automatically restored when the function returns.

  • local - causes any changes to shell options to be reverted automatically when the function returns. In particular, it causes the set -o xtrace on the next line to be automatically undone when the function returns. Support for local - was added in Bash 4.4.

    From man bash, under the local [option] [name[=value] ... | - ] section (emphasis added):

    If name is -, the set of shell options is made local to the function in which local is invoked: shell options changed using the set builtin inside the function are restored to their original values when the function returns.

  • set -o xtrace (which is equivalent to set -x) causes the shell to print commands, preceded by the expanded value of PS4, before running them.

    See help set.

How to pass all arguments passed to my Bash script to a function of mine?

The $@ variable expands to all command-line parameters separated by spaces. Here is an example.

abc "$@"

When using $@, you should (almost) always put it in double-quotes to avoid misparsing of arguments containing spaces or wildcards (see below). This works for multiple arguments. It is also portable to all POSIX-compliant shells.

It is also worth noting that $0 (generally the script's name or path) is not in $@.

The Bash Reference Manual Special Parameters Section says that $@ expands to the positional parameters starting from one. When the expansion occurs within double quotes, each parameter expands to a separate word. That is "$@" is equivalent to "$1" "$2" "$3"....

Passing some arguments:

If you want to pass all but the first arguments, you can first use shift to "consume" the first argument and then pass "$@" to pass the remaining arguments to another command. In Bash (and zsh and ksh, but not in plain POSIX shells like dash), you can do this without messing with the argument list using a variant of array slicing: "${@:3}" will get you the arguments starting with "$3". "${@:3:4}" will get you up to four arguments starting at "$3" (i.e. "$3" "$4" "$5" "$6"), if that many arguments were passed.
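As a small sketch of both techniques (the function name `demo` is made up for illustration):

```shell
#!/bin/bash
# Sketch of shift and "${@:offset:length}" slicing.
demo() {
    local first=$1
    shift                       # "consume" $1; "$@" is now the rest
    local rest=("$@")
    local slice=("${@:2:2}")    # up to two args starting at the new $2
    echo "first: $first"
    echo "rest: ${rest[*]}"
    echo "slice: ${slice[*]}"
}

demo a b c d e
```

Running it prints `first: a`, `rest: b c d e`, and `slice: c d`.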

Things you probably don't want to do:

"$*" gives all of the arguments stuck together into a single string (separated by spaces, or whatever the first character of $IFS is). This loses the distinction between spaces within arguments and spaces between arguments, so it is generally a bad idea. It might be OK for printing the arguments, e.g. echo "$*", provided you don't care about preserving the within/between distinction.

Assigning the arguments to a regular variable (as in args="$@") mashes all the arguments together like "$*" does. If you want to store the arguments in a variable, use an array with args=("$@") (the parentheses make it an array), and then reference them as e.g. "${args[0]}" etc. Note that in Bash and ksh, array indexes start at 0, so $1 will be in args[0], etc. zsh, on the other hand, starts array indexes at 1, so $1 will be in args[1]. And more basic shells like dash don't have arrays at all.
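A quick sketch of the difference (here `set --` simulates two script arguments):

```shell
#!/bin/bash
# Store positional parameters in an array, not a flat string.
set -- "one two" three          # simulate two script arguments

flat="$*"                       # mashed into one string
args=("$@")                     # one array element per argument

echo "flat: $flat"
echo "element count: ${#args[@]}"
echo "first element: ${args[0]}"   # index 0 in Bash/ksh
```

The array keeps two elements, with the embedded space of the first argument intact, while the flat string cannot tell you where one argument ended and the next began.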

Leaving off the double-quotes, with either $@ or $*, will try to split each argument up into separate words (based on whitespace or whatever's in $IFS), and also try to expand anything that looks like a filename wildcard into a list of matching filenames. This can have really weird effects, and should almost always be avoided. (Except in zsh, where this expansion doesn't take place by default.)
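The word-splitting part is easy to demonstrate (glob expansion is left out here so the result doesn't depend on the files in the current directory):

```shell
#!/bin/bash
# Unquoted $@ word-splits each argument; "$@" does not.
set -- "two words" "more words here"

unquoted=0
for a in $@; do                 # splits on whitespace: 5 words
    unquoted=$((unquoted + 1))
done

quoted=0
for a in "$@"; do               # preserves the original 2 arguments
    quoted=$((quoted + 1))
done

echo "unquoted loop iterations: $unquoted"
echo "quoted loop iterations: $quoted"
```

The unquoted loop runs five times, the quoted loop twice.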

How do I wrap a long sed command embedded within a bash/shell script across multiple lines?

Can you put that specific string in a variable like:

for i in a b {d..z}
do
    SED_PATTERN="s|^.\+/\([A-Z]\+\)/\(20..\)/\([0-9]\+\)-\([0-9]"
    SED_PATTERN+="\+\)-\([0-9]\+\)\.bfc\ \([0-9]\+\)"
    SED_PATTERN+="|$i,\1,\2-\3-\4,\2,\3,\4,\5|"

    find /cygdrive/$i -type f -name "*.ext" -printf "%p %k\n" |
        sed -e "$SED_PATTERN"
done >> "$output"

I think that would be the cleanest way.

Wrap all commands entered within a Bash-Shell with a Python script

The perfect way to wrap every command that is typed into a Bash shell is to change the variable PROMPT_COMMAND inside the .bashrc. For example, if I want to do some Python stuff before every prompt, as asked in my question:

.bashrc:

# ...
PROMPT_COMMAND="python mycoolscript.py; $PROMPT_COMMAND"
export PROMPT_COMMAND
# ...

Now mycoolscript.py runs before each prompt is displayed, i.e. after every command.
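As a self-contained sketch, a shell function can stand in for the Python script (the function name and log path are made up for illustration):

```shell
# In ~/.bashrc: run a hook before each prompt is drawn.
# `my_hook` stands in here for `python mycoolscript.py`.
my_hook() {
    echo "hook ran at $(date +%s)" >> /tmp/prompt_hook.log
}
PROMPT_COMMAND="my_hook; $PROMPT_COMMAND"
export PROMPT_COMMAND
```

After sourcing this in an interactive shell, each prompt appends a line to the log file.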

Passing bash commands with parameters to a function

It fails because your function expects multiple command arguments (the way sudo, env, xargs, and find do), but you instead pass it a single string containing a shell command (the way su, ssh, and sh -c take one).

Have your function expect a single shell command string instead:

function RUN()
{
    if $debug; then
        echo "Executing: $1" >&2
        # Output to screen
        eval "$1"
    else
        # Suppress screen output, but capture all output in logs
        eval "$1" &>> "$logs"
    fi
}

You will have to correctly escape all commands:

# Fails because $2 and $1 are not being escaped:
RUN "env | awk -F= '{ print $2, $1 }'"

# Succeeds because command is correctly escaped:
RUN "env | awk -F= '{ print \$2, \$1 }'"

A different option to consider is just writing your entire script in debug mode, and then optionally squash output, so that you don't have to add any additional escaping:

# Save stdout&stderr in case we need it
exec 5>&1 6>&2

# Optionally squash output
if ! "${debug:-false}"
then
exec &>> "$logs"
fi

# This only shows output if debug is on
BINARY PARAM1 PARAM2 -PARAM3 | awk '{ print $2,$1 }' | xargs -n1 BINARY PARAM1 PARAM4

# This always writes to stderr, whether debug is enabled or not
echo "Done" >&6

