How to Pass Command Output as Multiple Arguments to Another Command

You can use xargs:

grep 'pattern' input | xargs -I% cp "%" "%.bac"
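For illustration, here is a self-contained sketch of that pattern with made-up file names; note that xargs -I% runs one cp per matching line, and it assumes the grep output consists of plain file paths:

```shell
# Sketch: the files and the list below are fabricated for demonstration.
cd "$(mktemp -d)"
printf 'data\n' > a.log
printf 'data\n' > b.log
printf 'a.log\nb.log\nc.txt\n' > input   # list of candidate paths

# Each line matching 'log' is substituted for % in its own cp call
grep 'log' input | xargs -I% cp "%" "%.bac"

ls    # a.log  a.log.bac  b.log  b.log.bac  input
```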

How can you pass a multi-line output from one command as n arguments to a single command, not n commands with a single argument?

You're trying too hard.

find . -type f -name "*.swift" -print0 | xargs -0 xcrun -sdk macosx swiftc -o "$OUTPUT_FILE"
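The -print0/-0 pairing is what keeps awkward file names intact. The sketch below substitutes wc for swiftc (which is not assumed to be installed) and uses fabricated file names, one containing a space:

```shell
cd "$(mktemp -d)"
printf 'let x = 1\n' > 'main file.swift'   # name contains a space
printf 'let y = 2\n' > other.swift

# NUL delimiters keep "main file.swift" as a single argument;
# all matched paths land in one wc invocation.
find . -type f -name '*.swift' -print0 | xargs -0 wc -l
```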

Passing output from one command as argument to another

As a side note, tail displays the last 10 lines of a file by default.

A possible solution would be to grep this way:

for i in access.log*; do grep "$(tail "$i" | awk '{print $4}' | cut -d: -f 1 | sed 's/\[/\\[/')" "$i" > "$i.output"; done
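With a fabricated access.log, the inner pipeline can be traced step by step (tail -n 1 is used here so the extracted timestamp is a single line):

```shell
cd "$(mktemp -d)"
cat > access.log <<'EOF'
1.2.3.4 - - [09/Oct/2023:23:59:59 +0000] "GET / HTTP/1.1" 200
5.6.7.8 - - [10/Oct/2023:00:00:01 +0000] "GET /x HTTP/1.1" 200
EOF

# $4 is the bracketed timestamp; cut keeps the date part; sed escapes the [
pattern=$(tail -n 1 access.log | awk '{print $4}' | cut -d: -f1 | sed 's/\[/\\[/')
echo "$pattern"                              # \[10/Oct/2023

grep "$pattern" access.log > access.log.output
```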

How to pass output as command line argument in bash?

Use command substitution:

myprogram testfile $(wc -l < testfile) testfile.out
                   ^^^^^^^^^^^^^^^^^^^

This way, wc -l < testfile is evaluated first, and its output is substituted into the command line before the program is called, so the two commands are combined into one invocation.

Note that wc -l < file prints just the number, so you don't have to use cut or anything else to clean up the output.
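A runnable sketch of the same idea, with echo standing in for the hypothetical myprogram:

```shell
cd "$(mktemp -d)"
printf 'one\ntwo\nthree\n' > testfile

# The substitution runs first; its output (3) becomes the middle argument
echo testfile $(wc -l < testfile) testfile.out
```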

How to pass arguments to bash command

You could try xargs:

cat asnlookups_domains_ip.txt | awk '{print $1}' | xargs -n1 amass intel -whois -d

xargs appends its standard input as arguments to your command; -n1 runs the command once per argument.
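The per-argument behaviour of -n1 can be seen with a fabricated input file; echo stands in for amass, which is not assumed to be installed:

```shell
cd "$(mktemp -d)"
cat > asnlookups_domains_ip.txt <<'EOF'
example.com 93.184.216.34
example.org 93.184.216.34
EOF

# -n1: one invocation per extracted domain (echo stands in for amass)
awk '{print $1}' asnlookups_domains_ip.txt | xargs -n1 echo amass intel -whois -d
```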

how do I pass the output of a bash command as parameters to another?

To pass the output of one program to another as arguments, you can use xargs, for example

     find . -name "myfile*" -print | xargs grep "myword"

This one searches for files named myfile* and looks for the word inside them.
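A quick demonstration with fabricated files shows the matched file names becoming grep's arguments:

```shell
cd "$(mktemp -d)"
printf 'myword is here\n' > myfile1.txt
printf 'nothing here\n'   > myfile2.txt

# Both file names become arguments to a single grep call,
# so grep prefixes each match with the file it came from
find . -name "myfile*" -print | xargs grep "myword"
```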

How to pass command with its arguments as a single argument to script?

First point: you don't want to pass the entire command as a single argument -- this is the classic pitfall described in BashFAQ/050, "I'm trying to put a command in a variable, but the complex cases always fail!"

This is the main error: cmd=$($1) -- the $() syntax is command substitution, which executes the first argument and captures its output. Just use cmd=$1 to store the first parameter in the "cmd" variable.

Additional errors:

  • if ($status != 0); then -- that syntax is incorrect: use if ((status != 0)); then for proper arithmetic evaluation.
  • $* -- to execute the positional parameters correctly, use "$@" (with the quotes) -- that form will maintain any arguments that contain whitespace as intended.
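Putting those fixes together, a minimal wrapper script might look like this (run.sh is a made-up name, and the test command at the end is arbitrary):

```shell
cd "$(mktemp -d)"
cat > run.sh <<'EOF'
#!/bin/bash
cmd=$1            # plain assignment: no $(), so nothing executes yet
shift             # the remaining positional parameters are the arguments
"$cmd" "$@"       # quoted "$@" keeps whitespace-containing args intact
status=$?
if ((status != 0)); then
    echo "command exited with status $status" >&2
fi
exit "$status"
EOF
chmod +x run.sh

./run.sh printf '%s\n' "two words" end
```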

How to replicate bash's escaping on commands passed as arguments to other commands

I am struggling to find a way to replicate the escaping bash does behind the scenes

Typically, you use printf "%q" to escape such strings.

printf "%q" "$(kubectl get configmap ....)"

This uses printf as the bash builtin command. It differs from coreutils printf, although the newest coreutils versions also support %q, with a different quoting style:

/usr/bin/printf "%q" "$(kubectl get configmap ....)"

Modern bash also has quoting expansion:

var="$(kubectl get configmap ....)"
echo "${var@Q}"

And there is also the quoting style output by set -x.
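The two bash mechanisms can be compared on the same awkward string (a literal stands in for the kubectl output here); either escaped form can be eval'd back to the original value:

```shell
var='two words "and quotes"'

printf '%q\n' "$var"    # builtin printf, backslash-escaped style
echo "${var@Q}"         # ${var@Q} expansion (bash 4.4+), single-quote style
```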


I would suggest using a file:

kubectl get configmap ... > /tmp/tempfile
python parse_ips.py "$(cat /tmp/tempfile)"

With xclip you can copy command output straight to the X server clipboard, which is handy:

printf "%q" "$(kubectl get configmap ...)" | xclip -selection clipboard

# then in another window:
python parse_ips.py <right mouse click><select paste>

Passing multiple arguments from input file, to a command multiple times (Bash)

This sounds like the perfect job for xargs. For your iptables example it works like this:

xargs -L 1 sudo ip6tables < input.txt

xargs reads command arguments from stdin and executes the provided command with the arguments added to the command line. Here the arguments are piped in to stdin from the input.txt file. -L 1 limits the arguments to one line per execution, i.e. one line is added to the command line, the resulting command gets executed, continue with the next line etc.
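The line-by-line behaviour can be previewed safely with echo standing in for sudo ip6tables, using a made-up input.txt:

```shell
cd "$(mktemp -d)"
cat > input.txt <<'EOF'
-A INPUT -p tcp --dport 22 -j ACCEPT
-A INPUT -p tcp --dport 80 -j ACCEPT
EOF

# One "ip6tables" invocation per input line (echo stands in for sudo ip6tables)
xargs -L 1 echo ip6tables < input.txt
```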


