bash - surround all array elements or arguments with quotes
Just like $@, you need to quote the array expansion.
func () {
  argumentsArray=( "$@" )
  cp "${argumentsArray[@]}"
}
However, the array serves no purpose here; you can use $@ directly:
func () {
  cp "$@"
}
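To see the difference quoting makes, here is a small demo (show_args is a stand-in function invented for this sketch, and the file name is made up):

```shell
# Count the arguments a function receives, with and without quoting.
show_args() { echo "$#"; }

set -- "file with spaces.txt" dest/   # two intended arguments
quoted=$(show_args "$@")              # 2: each argument stays intact
unquoted=$(show_args $@)              # 4: word splitting breaks the first
```

Without the quotes, the argument containing spaces is split into three words, so cp would look for files named "file", "with", and "spaces.txt".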
Bash quoted array expansion
Your problem is with echo. It receives the correct number of parameters, some of them containing spaces, but its output loses the distinction between spaces separating parameters and spaces within parameters.
Instead, you can use printf(1) to output the parameters and always include quotes, making use of printf's feature of applying the format string successively when there are more parameters than format specifiers:
echo "Failed: foo:" $(printf "'%s' " "${mycmd[@]}")
That will put single quotes around each argument, even if it is not needed:
Failed: foo: 'command.ext' 'arg1 with space' 'arg2' 'thing' 'etc'
I've used single quotes to ensure that other shell metacharacters are not mishandled. This will work for all characters except single quote itself - i.e. if you have a parameter containing a single quote, the output from the above command will not cut and paste correctly. This is likely the closest you will get without getting messy.
Edit: Almost 5 years after I answered this question, bash 4.4 was released. It has the "${var@Q}" expansion, which quotes the variable such that it can be parsed back by bash.
This simplifies this answer to:
echo "Failed: foo: " "${mycmd[@]@Q}"
This will correctly handle single quotes in an argument, which my earlier version did not.
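For example (the array contents here are made up), the @Q form round-trips cleanly through eval, even with an embedded single quote:

```shell
# Requires bash 4.4+. Each element is quoted so bash can re-parse it.
mycmd=( command.ext "arg1 with space" "it's tricky" )
echo "Failed: foo: " "${mycmd[@]@Q}"

# Round trip: the quoted form parses back into an identical array.
eval "roundtrip=( ${mycmd[*]@Q} )"
```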
How to expand array in Bash adding double quotes to elements?
To preserve the parameters with proper quoting, you have to make two changes: quote the array expansion and use all parameters in the function instead of just $1.
my_func() {
  command par1 par2 "$@"
}
my_arr=("1 a" "2 b" "3 c")
my_func "${my_arr[@]}"
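To illustrate the $1 versus "$@" distinction, here is a sketch with two toy functions (first_only and all_args are invented for this demo):

```shell
first_only() { printf '%s\n' "$1"; }   # prints only the first parameter
all_args()   { printf '%s\n' "$@"; }   # prints one line per parameter

my_arr=("1 a" "2 b" "3 c")
n_first=$(first_only "${my_arr[@]}" | wc -l)   # 1 line: just "1 a"
n_all=$(all_args "${my_arr[@]}" | wc -l)       # 3 lines: one per element
```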
Honoring quotes while reading shell arguments from a file
There's no one good solution here, but you can choose between bad ones.
This answer requires changing the file format:
Using a NUL-delimited stream for the file is the safest approach; literally any C string (thus, any string bash can store as an array element) can be written and read in this manner.
# write file as a NUL-delimited stream
printf '%s\0' abc 'hello world' >junk
# read file as an array
foo=( )
while IFS= read -r -d '' entry; do
  foo+=( "$entry" )
done <junk
If valid arguments can't contain newlines, you may wish to leave out the -d '' on the reading side and change the \0 on the writing side to \n, to use newlines instead of NULs. Note that UNIX filenames can contain newlines, so if your possible arguments include filenames, this approach would be unwise.
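On bash 4.4 or newer, the read loop can also be replaced with the mapfile builtin; a sketch (junk is the same scratch file as above):

```shell
# Write two elements as a NUL-delimited stream, then read them back.
printf '%s\0' abc 'hello world' >junk
mapfile -t -d '' foo <junk   # -d '' splits on NULs; -t drops the delimiter
rm junk
```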
This answer almost implements shell-like parsing semantics:
foo=( )
while IFS= read -r -d '' entry; do
  foo+=( "$entry" )
done < <(xargs printf '%s\0' <junk)
xargs has some corner cases surrounding multi-line strings where its parsing isn't quite identical to a shell's. It's a 99% solution, however.
This answer requires a Python interpreter:
The Python standard library shlex module supports POSIX-compliant string tokenization that is truer to the standard than the parsing implemented by xargs. Note that bash/ksh extensions such as $'foo' are not honored.
shlex_split() {
  python -c '
import shlex, sys
for item in shlex.split(sys.stdin.read()):
    sys.stdout.write(item + "\0")
'
}
foo=( )
while IFS= read -r -d '' entry; do
  foo+=( "$entry" )
done < <(shlex_split <junk)
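Putting it together, here is a self-contained run (this variant assumes python3 is on PATH; the file contents are made up):

```shell
shlex_split() {
  python3 -c '
import shlex, sys
for item in shlex.split(sys.stdin.read()):
    sys.stdout.write(item + "\0")
'
}

# A line using shell-style quoting, as a user might write it.
printf '%s\n' "abc 'hello world'" >junk
foo=( )
while IFS= read -r -d '' entry; do
  foo+=( "$entry" )
done < <(shlex_split <junk)
rm junk
```

The quoted 'hello world' survives as a single array element.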
These answers pose a security risk:
...specifically, if the contents of junk can be written to contain shell-sensitive code (like $(rm -rf /)), you don't want to use either of them:
# use declare
declare "foo=($(cat junk))"
# ...or use eval directly
eval "foo=( $(cat junk) )"
If you want to be sure that foo is written in a way that's safe to read back this way, and you control the code that writes it, consider:
# write foo array to junk in an eval-safe way, if it contains at least one element
{ printf '%q ' "${foo[@]}" && printf '\n'; } >junk;
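A full round trip looks like this (the array contents are arbitrary examples chosen to include shell metacharacters):

```shell
foo=( "a b" '$(danger)' '*' )            # metacharacters on purpose
{ printf '%q ' "${foo[@]}" && printf '\n'; } >junk
eval "bar=( $(cat junk) )"               # safe only because %q escaped it
rm junk
```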
Alternately, you could use:
# write a command which, when evaluated, will recreate the variable foo
declare -p foo >junk
and:
# run all commands in the file junk
source junk
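A sketch of the declare -p round trip (array contents invented for the demo):

```shell
foo=( "one arg" 'has $(metachars) ; *' )
declare -p foo >junk    # writes a declare -a command that rebuilds foo
unset foo
source junk             # recreates the array exactly, nothing evaluated
rm junk
```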
Call bash function with array of arguments including multiline string
Change my_func ${ARGS[@]} to my_func "${ARGS[@]}". Without the enclosing double quotes, the expansion undergoes word splitting, and the shell treats the newlines inside the string as word separators just like spaces, so they are lost.
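To see the effect, a minimal sketch (count_args and the ARGS contents are invented for this demo):

```shell
count_args() { echo "$#"; }
ARGS=( $'line one\nline two' "second" )
with_quotes=$(count_args "${ARGS[@]}")   # 2: the newline survives
without=$(count_args ${ARGS[@]})         # 5: split on spaces and newlines
```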
Array of arguments to call m4 from bash
Always quote your variable and array expansions unless you have a specific reason not to (which is rarely the case):
m4 "${args[@]}" sample.m4
The unquoted expansion caused the words in the array to be split, so the m4 command ended up receiving a different number of arguments than intended.
Expansion of variables inside single quotes in a command in Bash
Inside single quotes everything is preserved literally, without exception.
That means you have to close the quotes, insert something, and then re-enter again.
'before'"$variable"'after'
'before'"'"'after'
'before'\''after'
Word concatenation is simply done by juxtaposition. As you can verify, each of the above lines is a single word to the shell. Quotes (single or double, depending on the situation) don't isolate words; they are only used to disable interpretation of various special characters, like whitespace, $, ;, and so on. For a good tutorial on quoting, see Mark Reed's answer. Also relevant: Which characters need to be escaped in bash?
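The three patterns above can be checked directly (the variable name and value are arbitrary):

```shell
variable="WORLD"
s1='before'"$variable"'after'   # expansion between quoted pieces
s2='before'"'"'after'           # a literal single quote in the middle
s3='before'\''after'            # same result via a backslash escape
```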
Do not concatenate strings interpreted by a shell
You should absolutely avoid building shell commands by concatenating variables. This is a bad idea similar to concatenation of SQL fragments (SQL injection!).
Usually it is possible to have placeholders in the command and to supply the command together with the variables, so that the callee can receive them from its invocation argument list.
For example, the following is very unsafe. DON'T DO THIS
script="echo \"Argument 1 is: $myvar\""
/bin/sh -c "$script"
If the contents of $myvar are untrusted, here is an exploit:
myvar='foo"; echo "you were hacked'
Instead of the above invocation, use positional arguments. The following invocation is better -- it's not exploitable:
script='echo "arg 1 is: $1"'
/bin/sh -c "$script" -- "$myvar"
Note the use of single quotes in the assignment to script, which means that it's taken literally, without variable expansion or any other form of interpretation.
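The same pattern scales to several untrusted values, since "$@" forwards them all; a sketch extending the example above with a second argument (printf reuses its format string for each one):

```shell
myvar='foo"; echo "you were hacked'
script='printf "arg: %s\n" "$@"'   # $@ expands inside sh, not here
out=$(/bin/sh -c "$script" -- "$myvar" "second value")
```

The injection attempt is printed verbatim as data, never executed.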