Running Two Python Scripts With Bash File

linux bash script running multiple python

Commands in a bash script run one after the other by default, so listing the two scripts in a file already runs them sequentially.
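For example (using the script names from the question), this sketch runs a.py to completion before b.py starts:

#!/usr/bin/env bash
# each command starts only after the previous one has finished
python a.py
python b.py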


To make successful completion of python a.py a required condition for running python b.py, use &&:

#!/usr/bin/env bash
python a.py && python b.py

Conversely, to attempt python a.py and ONLY run python b.py if python a.py did not terminate successfully, use ||:

#!/usr/bin/env bash
python a.py || python b.py

To run them at the same time as background processes:

#!/usr/bin/env bash
python a.py &
python b.py &

You can chain this for several commands in a row, for example:

python a.py && python b.py && python c.py && python d.py 

Running two python scripts with bash file

If you can install GNU Parallel on Windows under Git Bash (ref), then you can run the two scripts on separate CPUs this way:

▶ (cat <<EOF) | parallel --jobs 2
python script1.py
python script2.py
EOF

Note from the parallel man page:

   --jobs N
       Number of jobslots on each machine. Run up to N jobs in parallel.
       0 means as many as possible. Default is 100% which will run one
       job per CPU on each machine.
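As a quick sketch of those options (script names are placeholders; the ::: syntax gives parallel one job per argument):

# default: one job per CPU core
parallel python ::: script1.py script2.py

# no limit: run as many jobs at once as possible
parallel --jobs 0 python ::: script1.py script2.py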

Note that the question has been updated to state that parallelisation does not improve calculation time; that is not generally correct.

While the benefits are highly machine- and workload-dependent, parallelisation significantly improves the processing time of CPU-bound processes on multi-core computers.

Here is a demonstration based on calculating 50,000 digits of Pi using a spigot algorithm (code) on my quad-core MacBook Pro:

Single task (52s CPU, 54s wall clock):

▶ time python3 spigot.py
...
python3 spigot.py 52.73s user 0.32s system 98% cpu 53.857 total

Running the same computation twice under GNU Parallel (74s total CPU, 38s wall clock):

▶ (cat <<EOF) | time parallel --jobs 2                                                                                                                                   
python3 spigot.py
python3 spigot.py
EOF
...
parallel --jobs 2 74.19s user 0.48s system 196% cpu 37.923 total

Of course this is on a system that is busy running an operating system and all my other apps, so it doesn't halve the processing time, but it is a big improvement all the same.

See also this related Stack Overflow answer.

How to run multiple Python/Shell scripts from one script

I would do this:

#!/usr/bin/env python

import subprocess

# each call to run() blocks until that script finishes, so they run one after another
subprocess.run(['python', 'script1.py'])
subprocess.run(['python', 'script2.py'])
subprocess.run(['python', 'script3.py'])

If you only want each script to run if the previous one was successful:

#!/usr/bin/env python

import subprocess

subprocess.run('python script1.py && python script2.py && python script3.py', shell=True)

I am using shell=True here because I am relying on the shell to interpret the && and only let the next process run if the previous one was successful.
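If you would rather avoid shell=True, here is a minimal pure-Python sketch of the same idea (same hypothetical script names), stopping as soon as one script fails:

#!/usr/bin/env python

import subprocess
import sys

for script in ['script1.py', 'script2.py', 'script3.py']:
    # run() blocks and returns a CompletedProcess carrying the exit code
    result = subprocess.run([sys.executable, script])
    if result.returncode != 0:
        sys.exit(result.returncode)  # stop the chain on the first failure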


If you want them all to run in parallel with each other, and in the background:

#!/usr/bin/env python

import subprocess

subprocess.run('python script1.py &', shell=True)
subprocess.run('python script2.py &', shell=True)
subprocess.run('python script3.py &', shell=True)

I am using shell=True here because I am relying on the shell to interpret the & to mean that I want the processes to run in the background so that I can carry on doing something else while they run.
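Again, if you prefer to stay in Python rather than rely on the shell's &, a sketch using subprocess.Popen (same hypothetical script names) starts all three without waiting:

#!/usr/bin/env python

import subprocess
import sys

# Popen returns immediately, so the three scripts run concurrently
procs = [subprocess.Popen([sys.executable, s])
         for s in ['script1.py', 'script2.py', 'script3.py']]

# optionally block here until all of them have finished
for p in procs:
    p.wait()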


In general, I wouldn't use Python at all for this; I would write a bash script like this:

#!/bin/bash

python script1.py
python script2.py
python script3.py

Also, in general, I would make the first line of a Python script a shebang like this:

#!/usr/bin/env python

print('I am a Python script with shebang')

then I would make the script executable with:

chmod +x script.py

Now, instead of running it with:

python script.py

the kernel knows which interpreter to use, so I don't have to tell it every time, and I can simply run it with:

script.py

if the directory it is located in is on my PATH. Or, if it is not on my PATH, I'd need:

/path/to/script.py

How to run multiple python scripts from shell one after another

To run them sequentially (note that invoking the .py files directly like this requires each to have a shebang line and execute permission):

#!/bin/bash
/home/path_to_script/dimAO.py
/home/path_to_script/dimA1.py
/home/path_to_script/dimA2.py
/home/path_to_script/dimA3.py

To run them all in parallel:

#!/bin/bash
/home/path_to_script/dimAO.py &
/home/path_to_script/dimA1.py &
/home/path_to_script/dimA2.py &
/home/path_to_script/dimA3.py &

Use redirection (> or >>) to redirect stdout and stderr, as desired.
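For example, a sketch that appends each script's stdout and stderr to its own log file (the log file names are made up here):

#!/bin/bash
/home/path_to_script/dimAO.py >> dimAO.log 2>&1 &
/home/path_to_script/dimA1.py >> dimA1.log 2>&1 &
wait # block until both background jobs finish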

Using Bash to run 2 python scripts at the same time

It looks like you have mixed Python and bash; you don't need the import statement in a bash script.

#!/usr/bin/env bash
python camera_centroid.py &
python testsss.py &
wait # wait for jobs to be done

Make sure you add execute permissions to the scripts:

chmod +x testsss.py camera_centroid.py

and finally run the script with ./your_file.sh

Running parallel python scripts from bash file

Just put & after each command:

python train.py --arg1 &
python train.py --arg2 &
python train.py --arg3 &
python train.py --arg4 &

This puts your jobs in the background.
For more information, visit this wiki.

Edit: there are multiple ways to work in batches.

One example is to count the running background jobs:

sleep 1001 &
sleep 1002 &
sleep 1003 &

jobs # for listing all running background jobs
> [1]  Running   sleep 1001 &
> [2]- Running   sleep 1002 &
> [3]+ Running   sleep 1003 &

Now you can count them with wc:

# count lines of output
jobs | wc -l
> 3

Now, all together in a loop:

while true
do
    # start new jobs while fewer than 3 background jobs are running
    while [ "$(jobs -r | wc -l)" -lt 3 ]
    do
        sleep 10 &
    done
    sleep 1 # check again every second whether 3 jobs are running
done
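On bash 4.3 and later, wait -n gives a simpler batching sketch: it returns as soon as any one background job exits. (This reuses the hypothetical train.py example from above.)

#!/bin/bash
max_jobs=3
for arg in --arg1 --arg2 --arg3 --arg4
do
    # while all slots are taken, wait for any one job to finish
    while [ "$(jobs -r | wc -l)" -ge "$max_jobs" ]
    do
        wait -n
    done
    python train.py "$arg" &
done
wait # wait for the remaining jobs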

Running multiple python scripts sequentially with nohup

What you probably want is the following: remove the nohup from the commands inside your shell script, and instead run the overarching script itself (i.e., all the code shown here; let's call it iterations.bash) with nohup and in the background: nohup bash iterations.bash >& iterations.log &. Now you have your command line back, while the processes inside the script run sequentially.
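For concreteness, a minimal sketch of that layout (the script names are placeholders):

#!/bin/bash
# iterations.bash - no nohup on the individual commands
python script1.py
python script2.py
python script3.py

launched once from the command line with:

nohup bash iterations.bash >& iterations.log &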

how to run two python scripts parallelly using shell script and save their return value in a variable

You can run a process in the background using the & operator, and then wait for all the background processes using wait. wait also returns the exit code of the process it waited for.

Quick and dirty example:

#!/bin/bash

# first command to be executed
sleep 3 &
pid1=$! # $! holds the PID of the most recently backgrounded process

# second command to be executed
sleep 5 &
pid2=$!

wait $pid1
x=$? # exit status of the first command

wait $pid2
y=$? # exit status of the second command

echo "x: $x, y: $y"

