How to Pass a Wildcard Parameter to a Bash File

How do I pass a wildcard parameter to a bash file

The parent shell, the one invoking bash show_files.sh *, expands the * for you.

In your script, you need to use:

for dir in "$@"
do
echo "$dir"
done

The double quotes ensure that file names containing spaces (even multiple consecutive spaces) and other special characters are handled correctly.
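
For instance, assuming the script above is saved as show_files.sh and the directory contains a couple of files (the file names below are made up for illustration), the difference between an unquoted and a quoted wildcard looks like this:

$ ls
a.txt  b.txt  show_files.sh
$ bash show_files.sh *          # the calling shell expands * before the script runs
a.txt
b.txt
show_files.sh
$ bash show_files.sh '*'        # quoted: the script receives the literal string *
*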

See also How to iterate over arguments in a bash shell script.


Potentially confusing addendum

If you're truly sure you want to get the script to expand the *, you have to make sure that * is passed to the script (enclosed in quotes, as in the other answers), and then make sure it is expanded at the right point in the processing (which is not trivial). At that point, I'd use an array.

names=( $@ )
for file in "${names[@]}"
do
echo "$file"
done

I don't often use $@ without the double quotes, but this is one time when it is more or less the correct thing to do. The tricky part is that it won't handle wildcards with spaces in them very well.

Consider:

$ > "double  space.c"
$ > "double space.h"
$ echo double\ \ space.?
double space.c double space.h
$

That works fine. But try passing that as a wildcard to the script and ... well, let's just say it gets to be tricky at that point.
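
Concretely, here is a hypothetical run of the array-based script above (saved under a made-up name) against the two files just created. The unquoted $@ in names=( $@ ) word-splits the pattern on its embedded spaces before globbing, so neither piece matches anything and both are printed literally:

$ ./expand_args.sh 'double  space.?'
double
space.?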

If you want to extract $2 separately, then you can use:

names=( $1 )
for file in "${names[@]}"
do
echo "$file"
done
# ... use $2 ...
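
A hypothetical invocation of that version (the destination path is made up) would quote the pattern so it reaches $1 intact, with the second argument arriving separately in $2:

$ ./show_files.sh '*.c' /some/dest    # $1 holds the literal pattern, $2 the destination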

How to pass a wildcard parameter to a bashrc function

When you call chkerrors /var/log/messages*, the glob will be expanded by bash before it calls the function, e.g. the actual function call is chkerrors /var/log/messages1 /var/log/messages2 /var/log/messages3.

That means the function receives multiple parameters, but you only handle the first one, $1. You will instead want to handle all of its parameters using "$@":

chkerrors () { egrep -i 'page allocation failure|oom-killer|soft lockup|blocked for more' "$@"; }

"$@" is special in that it doesn't expand to a single word as the quotes generally imply, but rather to a list of quoted words, so each file matched by the glob will be treated as an additional parameter of your egrep command, and files containing character of the IFS will correctly be treated as a single parameter rather than splitted in two.

Bash : Taking a wildcard as argument of a shell script and being able to expand it

Write your function to accept multiple arguments instead of just one. This is how most other tools work (cp/mv/ls/grep/sed to name a few):

multiple_files() {
first="$1"
last="${@: -1}"
files=( "${@:2:$#-2}" )

echo "The first thing was $first"
echo "The last thing was $last"
for file in "${files[@]}"
do
echo "One of the files is $file"
done
}

multiple_files . probe_*.txt dir/

This results in:

The first thing was .
The last thing was dir/
One of the files is probe_GCph.txt
One of the files is probe_GCsp.txt
One of the files is probe_GCsp_WL_GCph.txt
One of the files is probe_GCsp_XC.txt

If you actually need the pattern itself, or if you want to accept multiple patterns and keep them separate, you'll probably need to quote the glob as described in the other answers.
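
If you do go down the quoted-pattern route, a minimal sketch (the function name and the nullglob handling are my own additions, not from the answer above) could look like this:

expand_pattern() {
    local pattern=$1
    local files
    shopt -s nullglob             # expand to nothing rather than the literal pattern on no match
    files=( $pattern )            # deliberately unquoted so the glob expands here, inside the function
    shopt -u nullglob
    for file in "${files[@]}"; do
        echo "One of the files is $file"
    done
}

expand_pattern 'probe_*.txt'      # quoted, so the calling shell does not expand it first

Note that this still breaks down if the pattern itself contains spaces (it gets word-split before globbing), which is the same caveat raised in the addendum near the top of this page.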

How to access literal wildcard argument before it's expanded to matching files?

You can't do this, since the shell expands any wildcards on the command line before your script even starts. When you enter ./myscript file.conf *.data dest_folder in the shell, this is effectively just shorthand for ./myscript file.conf this.data that.data so.data dest_folder, not a different command.

If you need the wildcard passed into the command as an actual argument, you need to quote or escape it: something like ./myscript file.conf '*.data' dest_folder or ./myscript file.conf \*.data dest_folder. Alternatively, make the last argument mandatory, or turn it into an option (-d dest_folder) so the pre-expanded file list isn't a problem.

To put it another way: it would be very convenient if you didn't have to quote or escape wildcards in your grep patterns; but you have to, because there's no way for the grep command to get at its arguments in unexpanded form. And if the authors of grep couldn't figure out how to make their command more convenient, there's no way you're going to be able to do it either.
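
As a rough sketch of the option-based variant suggested above (the flag name and variable names are my own, not from the question):

#!/usr/bin/env bash
# Sketch: take the destination as an option so the expanded file list
# can safely occupy the remaining positional parameters.
dest=
while getopts "d:" opt; do
    case $opt in
        d) dest=$OPTARG ;;
        *) echo "usage: $0 -d dest_folder file.conf files..." >&2; exit 1 ;;
    esac
done
shift $((OPTIND - 1))

conf=$1
shift
echo "config: $conf, destination: $dest"
for file in "$@"; do
    echo "data file: $file"
done

It would be invoked as ./myscript -d dest_folder file.conf *.data, and the shell's expansion of *.data is then exactly what you want.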

BASH parameters with wildcard

When you're checking for multiple files with -f or -e it can get nasty. I recommend kenfallon's blog. This is something like what he would recommend:

#!/bin/bash

ls -l /home/user/bashTest/"$1"*.jpg > /dev/null 2>&1
if [ "$?" = "0" ]
then
    cp /home/user/bashTest/"$1"*.jpg /home/user/bashTest/final/
fi

Not sure how the $@ would play in here, or if it's required.
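
For what it's worth, a variant of the same idea that avoids inspecting ls's exit status (a sketch only; the paths are copied from the snippet above) would enable nullglob and test whether the glob matched anything:

#!/bin/bash

shopt -s nullglob
files=( /home/user/bashTest/"$1"*.jpg )
if (( ${#files[@]} > 0 ))
then
    cp "${files[@]}" /home/user/bashTest/final/
fi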

passing wildcard arguments from bash into python

You have two options, both of which involve a loop.

To pass the files one by one, use a shell loop:

for file in *; do python script.py "$file"; done

This will invoke your script once for every file matching the glob *.

To process multiple files in your script, use a loop there instead:

from sys import argv

for filename in argv[1:]:
    # rest of script: process each filename here
    ...

Then call your script from bash like python script.py * to pass all the files as arguments. argv[1:] is a slice, which returns a list containing the elements of argv from position 1 (skipping the script name at position 0) to the end.

I would suggest the latter approach as it means that you are only invoking one instance of your script.

How to surround find's -name parameter with wildcards before and after a variable?

Use double quotes to prevent the asterisk from being expanded by the shell instead of being passed through to find.

-name "*$line*"

Thus:

while read -r line; do
    line=${line%$'\r'}    # strip trailing CRs if input file is in DOS format
    find "$SEARCH_DIR" -name "*$line*"
done <"$INPUT" >>"$OUTPUT"

...or, better:

#!/usr/bin/env bash

## use lower-case variable names
input=$1
output=$2

args=( -false ) # for our future find command line, start with -false
while read -r line; do
    line=${line%$'\r'}              # strip trailing CR if present
    [[ $line ]] || continue         # skip empty lines
    args+=( -o -name "*$line*" )    # add an OR clause matching if this line's substring exists
done <"$input"

# since our last command is find, use "exec" to let it replace the shell in memory
exec find "$SEARCH_DIR" '(' "${args[@]}" ')' -print >"$output"

Note:

  • The shebang specifying bash ensures that extended syntax, such as arrays, is available.
  • See BashFAQ #50 for a discussion of why an array is the correct structure to use to collect a list of command-line arguments.
  • See the fourth paragraph of http://pubs.opengroup.org/onlinepubs/9699919799/basedefs/V1_chap08.html for the relevant POSIX specification on environment and shell variable naming conventions: All-caps names are used for variables with meaning to the shell itself, or to POSIX-specified tools; lowercase names are reserved for application use. That script you're writing? For purposes of the spec, it's an application.

Using a Wildcard Match as a Command Line Argument in Bash

Does this do what you need?

dest='/desktop/'
for ARG in "$@"; do
    /some/other/script "$ARG" "$dest$ARG.new"
done

EDIT: To remove the path from ARG:

dest='/desktop/'
for ARG in "$@"; do
    /some/other/script "$ARG" "$dest$(basename "$ARG").new"
done
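
Assuming the loop above is saved as a script (the wrapper name and paths below are made up), the calling shell expands the glob and each match is handed to the loop one at a time:

$ ./run_all.sh /data/reports/*.txt
# roughly equivalent to running, for each match:
#   /some/other/script /data/reports/jan.txt /desktop/jan.txt.new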

