How to Pass Shell Variables as Command Line Argument to a Shell Script

Bash scripts take arguments after the script name, not before it, so you need to call the script like this:

./StatCollection_DBServer.sh DD 50

Inside the script, you can access the arguments as $1 and $2, so the script could look like this:

#!/bin/bash
LOG_DIRECTORY="${1}_${2}users"
mkdir -m 777 "${LOG_DIRECTORY}"

I hope this helps...

Edit:
Just a small explanation of what happened in your approach:

prodName='DD' users=50 ./StatCollection_DBServer.sh

In this case, you set the environment variables prodName and users before calling the script. That is why you were able to use these variables inside your code.
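For comparison, here is a minimal sketch of what that environment-variable style would look like inside the script. This is a hypothetical variant of StatCollection_DBServer.sh that reads prodName and users from the environment instead of from $1 and $2:

```shell
#!/bin/bash
# Hypothetical variant of StatCollection_DBServer.sh: reads the prodName
# and users environment variables instead of the positional parameters,
# matching the prodName='DD' users=50 invocation style.
LOG_DIRECTORY="${prodName}_${users}users"
echo "$LOG_DIRECTORY"
```

Invoked as `prodName='DD' users=50 ./StatCollection_DBServer.sh`, this prints `DD_50users`.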

Assign command line argument to variable in shell script

The problem is this line: dirname=$dir_no

You're clobbering your directory name with the directory number. This does not get you the value of $1, $2, etc.; it gets you the number itself (1, 2, etc.).

If you want the values of $1, $2, etc., you need indirect expansion (${!var}):

dirname=${!dir_no}
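Here is a minimal sketch of the loop this usually sits in; the loop bounds and the echo are illustrative only:

```shell
#!/bin/bash
# Sketch: walk every positional parameter by number using indirect expansion.
for (( dir_no=1; dir_no<=$#; dir_no++ ))
do
    dirname=${!dir_no}    # the value of $1, $2, ... -- not the number itself
    echo "would create: $dirname"
done
```

Running `./script.sh alpha beta` prints `would create: alpha` and `would create: beta`.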

How to pass shell variable as a command line arguments in C program

"../programming/ctest/arg $i"

You need to remove the quotes, good sir:

../programming/ctest/arg $i

To elaborate, without removing the quotes Bash will interpret the entire string as a command, instead of a command and argument like you intended.
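A quick demonstration of the difference, using echo as a stand-in for the compiled program (the original path is kept out of the example since it only exists on the asker's machine):

```shell
#!/bin/bash
# Stand-in demo: "cmd" plays the role of ../programming/ctest/arg.
cmd=echo
i=42
# Quoted as one string, the shell looks for a command literally
# named "echo 42" and fails:
"$cmd $i" 2>/dev/null || echo "not found"
# Unquoted (or with the argument quoted separately), the shell splits
# it into a command plus an argument:
$cmd "$i"
```

The first line prints `not found`; the second prints `42`.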

Pass arguments from command Line and from function inside shell script

You can forward the original arguments with:

...
coreExtraction () {
    extractZipFiles "$@" some/location some/other/location
}
coreExtraction "$@"
...

To access the original script arguments from inside the function, you have to save them before you call the function, for instance, in an array:

args=("$@")
some_function some_other_args

Inside some_function, the script args will be in ${args[0]}, ${args[1]}, and so on. Their number will be ${#args[@]}.
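A minimal runnable sketch of that save-then-call pattern (the function name and its argument are made up for the example):

```shell
#!/bin/bash
# Save the script's arguments before calling a function that takes
# its own, different arguments.
args=("$@")

report () {
    # $# here counts the function's args; ${#args[@]} counts the script's.
    echo "function got $# args; script got ${#args[@]}"
    echo "first script arg: ${args[0]}"
}

report only-one
```

Running `./script.sh foo bar` prints `function got 1 args; script got 2`, then `first script arg: foo`.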

How to pass command line arguments with spaces through a variable in bash

If you want to execute your command once for each line found in file.txt, so each line is a separate argument set, you can do this:

xargs -L 1 /some/command <file.txt

The xargs utility reads items from its standard input and passes them as arguments to the command it invokes; the -L 1 option forces one invocation per input line, with the words on that line becoming the arguments (quoted groups in the input are kept together). If the file contains only one line, the command is executed only once.
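A small self-contained demo, using xargs's -L 1 option to force exactly one invocation per input line (the file path and contents are hypothetical):

```shell
#!/bin/sh
# Hypothetical input file: two lines, the first containing two words.
printf '%s\n' 'one two' 'three' > /tmp/demo.txt
# -L 1 runs echo once per line; the words on a line become separate args.
xargs -L 1 echo 'line:' < /tmp/demo.txt
```

This prints `line: one two` and then `line: three`.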

The following solution does the same, but works with functions too:

while IFS= read -r line
do
    eval args=\("$line"\)
    command_or_function "${args[@]}"
done < file.txt

Please note that this uses eval, which means that if file.txt contains malicious content, arbitrary code execution could result. You must be 100% certain that the data contained in the file is safe.

The idea with this technique is that you explode each line into an array (one array element is one argument), and then use an array expansion ("${args[@]}") that expands to a list of all its elements, properly quoted (the quotes around the expansion are important here).

As an aside, the eval line could be replaced with :

declare -a args=\($line\)

But $line still gets expanded, so this is no safer than eval.
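To see the technique end to end, here is a self-contained demo with a trusted, hypothetical input file and a made-up count_args function standing in for the real command:

```shell
#!/bin/bash
# Demo with trusted, hypothetical data only: quoted groups in the file
# survive as single arguments. Never do this with untrusted input (eval!).
printf '%s\n' '"first arg" second "third arg"' > /tmp/args.txt

count_args () { echo "$# arguments"; }

while IFS= read -r line
do
    eval args=\("$line"\)
    count_args "${args[@]}"
done < /tmp/args.txt
```

This prints `3 arguments`: the quoted groups "first arg" and "third arg" each arrive as one argument despite their embedded spaces.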

How do I pass all command line arguments given to a bash script including string parameters as-is to a child process?

Use this:

python "$@"

$@ and $* both expand to all the arguments that the script received, but when you put $@ in double quotes, each argument expands to a separate, properly quoted word, so arguments containing spaces are passed through intact.

From the bash manual:

Expands to the positional parameters, starting from one. When the expansion occurs within double quotes, each parameter expands to a separate word. That is, "$@" is equivalent to "$1" "$2" ….
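A two-line sketch that makes the word-splitting behavior visible (the bracket formatting is just for display):

```shell
#!/bin/bash
# Print each received argument on its own line, bracketed, to show
# that "$@" preserves the original word boundaries.
printf '<%s>\n' "$@"
```

Running `./script.sh 'a b' c` prints `<a b>` and `<c>`: the space inside the first argument survives.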

See also Why does $@ work different from most other variables in bash?


