Shell Programming: Executing Two Applications at the Same Time

How do you run multiple programs in parallel from a bash script?

To run multiple programs in parallel:

prog1 &
prog2 &

If you need your script to wait for the programs to finish, you can add:

wait

at the point where you want the script to wait for them.
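
Putting that together, here is a minimal sketch of a complete script (prog1 and prog2 stand for whatever commands you actually want to run):

#!/bin/bash
# start both programs in the background, then wait for both to finish
prog1 &
prog2 &
wait            # blocks until all background jobs of this shell have exited
echo "both programs have finished"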

How can I run two programs at the same time when they are in different directories? (tcsh shell)

You can start the commands like this:

(./run.sh &) && (cd Parallella/parallella-examples/aobench; ./run.sh &)
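
The same pattern, sketched with two hypothetical placeholder directories: each cd happens inside its own subshell, so the calling shell's working directory is unchanged, and the & after each subshell lets both scripts run at once.

(cd /path/to/dir1 && ./run.sh) &
(cd /path/to/dir2 && ./run.sh) &
wait            # optional: wait for both to finish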

Run a shell script multiple times at the same time

The simplest way to do this is to use the & notation to run them as background processes.

for arg in xaa xab xac xad; do
    ./test.sh "$arg" &
done

You could also read up on the parallelization features provided by xargs or GNU parallel for more computation-intensive tasks.
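
For instance, a sketch using xargs with the same arguments as above: the -P flag (supported by GNU and BSD xargs) caps the number of jobs running in parallel.

printf '%s\n' xaa xab xac xad | xargs -n 1 -P 4 ./test.sh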

Starting multiple programs at the same time

Just to explain: the reason your Java program stops is that all background processes receive a HUP signal when the shell executing your script terminates. To prevent this, you can remove a job from the shell's job list using disown right after the &.
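
As a minimal sketch of that idea (some_long_running_command is a placeholder; disown is a bash built-in that detaches the most recent background job from the shell's job table):

some_long_running_command &   # start the command in the background
disown                        # remove it from the job list so it no longer receives the shell's HUP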

For the two Java programs, I would suggest:

java -jar server.jar &      # start java with server.jar as a background job
echo "$!" > "$pidfile"      # write the PID of the last background job into the pid file
java -jar heartbeat.jar &   # start java with heartbeat.jar as a background job
fg %1                       # bring server.jar back to the foreground for interactive use

Edit: wait replaced with fg %1, see comments.

Executing a single shell script in parallel at the same time with different arguments

Much simplified, and intended as a starting point only. Google for (unix) address space or (unix) process image.

Most (all?) modern operating systems virtualize memory (RAM) and call the result an address space, process image, or the like.

Each program is started in its own virtual address space, or process image, and it can only access the memory of its own address space. From a memory point of view, this is as if each program were running on its own distinct computer (remember that I'm simplifying things).

Two programs running in parallel cannot access data in each other's address space, unless both programs have agreed to share it using specific operating system services.

Back to your question: when you start multiple instances of a single program (a script is a program), each copy of the program is loaded into its own address space. Since the address space isolates memory, all program data is isolated, too.

So there is no danger that any data, e.g. variables, of one program instance can be seen or modified by another program instance, unless the programs agreed upon some sharing (using operating system services).
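
As a small demonstration (isolate.sh is a hypothetical script name), each running instance gets its own private copy of COUNTER, so two instances started at the same time never see each other's value:

#!/bin/bash
# isolate.sh - each instance has its own copy of COUNTER in its own address space
COUNTER=0
for i in 1 2 3; do
    COUNTER=$((COUNTER + 1))
    echo "PID $$: COUNTER=$COUNTER"
    sleep 1
done

Running ./isolate.sh & ./isolate.sh & prints COUNTER=1, 2, 3 from each instance, regardless of what the other one does.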

HTH

Have bash script execute multiple programs as separate processes

You can run a job in the background like this:

command &

This allows you to start multiple jobs in a row without having to wait for the previous one to finish.

If you start multiple background jobs like this, they will all share the same stdout (and stderr), which means their output is likely to get interleaved. For example, take the following script:

#!/bin/bash
# countup.sh

for i in $(seq 3); do
    echo "$i"
    sleep 1
done

Start it twice in the background:

./countup.sh &
./countup.sh &

And what you see in your terminal will look something like this:

1
1
2
2
3
3

But could also look like this:

1
2
1
3
2
3

You probably don't want this, because it would be very hard to figure out which output belonged to which job. The solution? Redirect stdout (and optionally stderr) for each job to a separate file. For example

command > file &

will redirect only stdout and

command > file 2>&1 &

will redirect both stdout and stderr for command to file while running command in the background. This page has a good introduction to redirection in Bash. You can view the command's output "live" by tailing the file:

tail -f file
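
Applied to the countup.sh example above (the log file names are just an illustration), each job's output ends up in its own file, unmixed:

./countup.sh > countup1.log 2>&1 &
./countup.sh > countup2.log 2>&1 &
wait                              # wait for both background jobs to finish
cat countup1.log countup2.log     # inspect each job's output separately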

I would recommend running background jobs with nohup or screen, as user2676075 mentioned, to let your jobs keep running after you close your terminal session, e.g.

nohup command1 > file1 2>&1 &
nohup command2 > file2 2>&1 &
nohup command3 > file3 2>&1 &

Executing two programs concurrently

server &          # start the server in the background
PID=$!            # remember the server's process ID
client            # run the client in the foreground
kill "$PID"       # stop the server once the client exits
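
If the client might fail or be interrupted, a slightly more defensive sketch (still with server and client as placeholder commands) uses a trap so the background server is cleaned up however the script exits:

#!/bin/bash
server &
SERVER_PID=$!
trap 'kill "$SERVER_PID" 2>/dev/null' EXIT   # kill the server on any exit, including errors and Ctrl-C
client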
