How to Make a Bash Shell Script Interact with Another Command Line Program

Is it possible to make a bash shell script interact with another command line program?

If your command doesn't care how fast you give it input, and you don't really need to interact with it, then you can use a heredoc.


prog <<EOD
save filex
save filey
EOD

If you need branching based on the output of the program, or if your program is at all sensitive to the timing of your commands, then Expect is what you want.

Pass commands as input to another command (su, ssh, sh, etc)

A shell script is a sequence of commands. The shell will read the script file, and execute those commands one after the other.

In the usual case, there are no surprises here; but a frequent beginner error is to assume that one of these commands will "take over" from the shell, and start executing the following commands in the script file instead of the shell which is currently running the script. That's not how it works.

Scripts work exactly like interactive commands, but it is worth understanding exactly how. Interactively, the shell reads a command (from standard input), runs that command (with input from standard input), and when it's done, reads another command (from standard input).

Now, when executing a script, standard input is still the terminal (unless you used a redirection) but the commands are read from the script file, not from standard input. (The opposite would be very cumbersome indeed - any read would consume the next line of the script, cat would slurp all the rest of the script, and there would be no way to interact with it!) The script file only contains commands for the shell instance which executes it (though you can of course still use a here document etc to embed inputs as command arguments).
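A minimal sketch of this behaviour (the script name is hypothetical): a read inside a script still consumes the script's standard input, not the next line of the script file itself.

```shell
#!/bin/sh
# greet.sh (hypothetical name): `read` consumes standard input,
# not the following line of this script file.
printf 'What is your name? '
read name
echo "Hello, $name"
```

Run it as `echo Alice | sh greet.sh` and the name comes from the pipe, while the commands keep coming from the script file.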

In other words, these "misunderstood" commands (su, ssh, sh, sudo, bash etc) when run alone (without arguments) will start an interactive shell, and in an interactive session, that's obviously fine; but when run from a script, that's very often not what you want.

All of these commands have ways to accept commands other than from an interactive terminal session. Typically, each command supports passing commands as options or arguments:

su root -c 'who am i'
ssh user@remote uname -a
sh -c 'who am i; echo success'

Many of these commands will also accept commands on standard input:

printf 'uname -a; who am i; uptime' | su
printf 'uname -a; who am i; uptime' | ssh user@remote
printf 'uname -a; who am i; uptime' | sh

which also conveniently allows you to use here documents:

ssh user@remote <<'____HERE'
uname -a
who am i
____HERE

sh <<'____HERE'
uname -a
who am i
____HERE
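Note the quotes around the delimiter ('____HERE'): they stop the local shell from expanding variables, so the here-doc body reaches the child (or remote) shell verbatim. A quick way to see the difference:

```shell
name=local

# Quoted delimiter: $name is passed through untouched,
# so the child shell's own assignment wins.
sh <<'EOF'
name=child
echo "$name"
EOF

# Unquoted delimiter: the local shell expands $name
# before the child shell ever sees it.
sh <<EOF
echo "$name"
EOF
```

The first invocation prints `child`, the second prints `local`.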

For commands which accept a single command argument, that command can be sh or bash with multiple commands:

sudo sh -c 'uname -a; who am i; uptime'

As an aside, you generally don't need an explicit exit because the command will terminate anyway when it has executed the script (sequence of commands) you passed in for execution.
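For example, the child shell's exit status (that of the last command it ran) is still visible to the caller without any explicit exit:

```shell
# No explicit `exit` needed: the child shell terminates after its
# last command, and $? reflects that command's status.
sh -c 'echo done; exit 7'
echo "child exited with $?"
```

This prints `done` followed by `child exited with 7`.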

Sending commands to application's shell using bash script

I think it might be because here-docs do not wait for output. Unfortunately for you, I have since switched companies, so I can't test my code below.

#!/bin/bash
expect <<-EOF
set timeout -1
spawn JLinkExe
expect "J-Link> " { send "connect\r" }
expect "J-Link> " { send "\r" }
expect "J-Link> " { send "\r" }
expect "J-Link> " { send "\r" }
expect "J-Link> " { send "\r" }
expect "J-Link> " { send "erase\r" }
expect "J-Link> " { send "loadbin program.bin , 0x0\r" }
expect "J-Link> " { send "r\r" }
expect "J-Link> " { send "q\r" }
expect eof
catch wait result
exit [lindex \$result 3]
EOF
exit $?

Expect waits until the J-Link> prompt turns up and then sends the next command through the connection.

If it doesn't work please notify me. I'll try to help you after the weekend :-)


A: Why did you wrap everything in expect 2>&1 <<-EOF and EOF?

You can put expect in the shebang line instead, but I often embed it in my Bash scripts; my knowledge of Bash is better.

B: Why a -EOF instead of EOF?

That's because <<-EOF strips leading tabs from the here-doc body and from the terminating delimiter, so you can indent the here-doc, for instance inside a function.

C: Why did you redirect stderr to stdout (2>&1)?

In your case I should have removed it. I took the code from one of my other answers about expect and tailored it to your needs.

D: What does catch wait result and exit [lindex \$result 3] do after we catch the eof?

Nice question, I had to look this one up a little myself:

  • lindex takes the 4th element of \$result (indexing starts at 0, so index 3) and exits the here-doc with it.
  • \$result is set by catch wait result.
    • catch takes the return value of wait and puts it into result.
    • wait returns four integers:
      • First: PID of the process being waited on.
      • Second: spawn ID.
      • Third: -1 on error, 0 otherwise.
      • Fourth: exit status of the program as set by the OS.

Note that you have to escape the $ in the here-doc, otherwise Bash tries to process it. Hence \$result.

E: Why do you exit with exit $?

Bash exits a script with the exit code of the last command. You can leave this implicit, but I like to add it anyway; it keeps the script more readable for beginners.

Calling one Bash script from another Script passing it arguments with quotes and spaces

Quote your args in Testscript 1:

echo "TestScript1 Arguments:"
echo "$1"
echo "$2"
echo "$#"
./testscript2 "$1" "$2"
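A self-contained sketch (file and variable names hypothetical) of why the quotes matter: without them, an argument containing spaces is word-split into several arguments before the inner script sees it.

```shell
# inner.sh (hypothetical): just reports how many arguments it received
cat > /tmp/inner.sh <<'EOF'
#!/bin/sh
echo "got $# arguments"
EOF
chmod +x /tmp/inner.sh

arg1='one arg with spaces'
/tmp/inner.sh "$arg1" second    # quoted: passed as 2 arguments
/tmp/inner.sh $arg1 second      # unquoted: word-split into 5
```

The quoted call prints `got 2 arguments`; the unquoted one prints `got 5 arguments`.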

Run bash script from another script without waiting for script to finish executing?

Put & at the end of the line.

./otherscript.sh &  # this doesn't block!
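A slightly fuller sketch: $! holds the background job's PID, and wait lets you block on it later if you eventually need its result.

```shell
# Launch in the background; the shell moves on immediately.
sleep 1 &
bgpid=$!
echo "launched $bgpid, continuing without waiting"

# ...do other work here...

wait "$bgpid"        # optional: block until the job finishes
echo "background job finished with status $?"
```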

Running multiple commands in one line in shell

You are using | (pipe), which directs the output of one command into the input of another. What you are looking for is the && operator, which executes the next command only if the previous one succeeded:

cp /templates/apple /templates/used && cp /templates/apple /templates/inuse && rm /templates/apple


Or, replacing the second copy-and-remove with a single mv:

cp /templates/apple /templates/used && mv /templates/apple /templates/inuse

To summarize (non-exhaustively) bash's command operators/separators:

  • | pipes (pipelines) the standard output (stdout) of one command into the standard input of another. Note that stderr still goes to its default destination, whatever that happens to be.
  • |& pipes both stdout and stderr of one command into the standard input of another. Very useful; available in bash version 4 and above.
  • && executes the right-hand command only if the previous one succeeded.
  • || executes the right-hand command only if the previous one failed.
  • ; executes the right-hand command always, regardless of whether the previous one succeeded or failed (unless set -e was previously invoked, which causes bash to abort on an error).
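The behaviour of the three separators in one runnable snippet:

```shell
false && echo "A"   # not printed: && needs success on the left
false || echo "B"   # printed: || runs only after a failure
false ;  echo "C"   # printed: ; runs unconditionally
```

Only `B` and `C` appear in the output.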
