Linux Bash Script Running Multiple Python Scripts

Putting the commands on separate lines of a bash script will, by default, already run them one after the other.
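
For example, a minimal sketch (a.py and b.py stand in for your scripts):

#!/usr/bin/env bash
# Each command starts only after the previous one has exited,
# whether it succeeded or failed.
python a.py
python b.py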


To check that python a.py completed successfully as a required condition for running python b.py, you can do:

#!/usr/bin/env bash
python a.py && python b.py

Conversely, to attempt python a.py and run python b.py ONLY if python a.py did not terminate successfully:

#!/usr/bin/env bash
python a.py || python b.py

To run them at the same time as background processes:

#!/usr/bin/env bash
python a.py &
python b.py &
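
If the wrapper script itself should not exit until both background jobs are done, add wait at the end; a sketch building on the snippet above:

#!/usr/bin/env bash
python a.py &
python b.py &
# Block here until both background jobs have exited.
wait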

You can chain this for several commands in a row, for example:

python a.py && python b.py && python c.py && python d.py 

How to run multiple python scripts from shell one after another

To run them sequentially (note that the scripts are invoked directly, so each one needs a shebang line and execute permission, as described further down):

#!/bin/bash
/home/path_to_script/dimA0.py
/home/path_to_script/dimA1.py
/home/path_to_script/dimA2.py
/home/path_to_script/dimA3.py

To run them all in parallel:

#!/bin/bash
/home/path_to_script/dimA0.py &
/home/path_to_script/dimA1.py &
/home/path_to_script/dimA2.py &
/home/path_to_script/dimA3.py &

Use redirection (> or >>) to redirect stdout and stderr, as desired.
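
For example, a sketch in which each script gets its own log file (the log file names are illustrative):

#!/bin/bash
# > truncates the log on every run; use >> to append instead.
# 2>&1 sends stderr to the same file as stdout.
/home/path_to_script/dimA0.py > dimA0.log 2>&1 &
/home/path_to_script/dimA1.py > dimA1.log 2>&1 &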

How to run multiple Python/Shell scripts from one script

I would do this:

#!/usr/bin/env python

import subprocess

# subprocess.run() waits for each command to finish,
# so the scripts run strictly one after another.
subprocess.run(['python', 'script1.py'])
subprocess.run(['python', 'script2.py'])
subprocess.run(['python', 'script3.py'])

If you only want each script to run if the previous one was successful:

#!/usr/bin/env python

import subprocess

subprocess.run('python script1.py && python script2.py && python script3.py', shell=True)

I am using shell=True here because I am relying on the shell to interpret the &&, so that each process only runs if the previous one was successful. (A pure-Python alternative is to call subprocess.run(..., check=True) for each script, which raises CalledProcessError as soon as one of them exits with a non-zero status.)


If you want them all to run in parallel with each other, and in the background:

#!/usr/bin/env python

import subprocess

subprocess.run('python script1.py &', shell=True)
subprocess.run('python script2.py &', shell=True)
subprocess.run('python script3.py &', shell=True)

I am using shell=True here because I am relying on the shell to interpret the & to mean that I want the processes to run in the background, so that I can carry on doing something else while they run. (A pure-Python alternative is subprocess.Popen, which starts a process without waiting for it to finish.)


In general, I wouldn't use Python at all for this; I would write a bash script like this:

#!/bin/bash

python script1.py
python script2.py
python script3.py

Also, in general, I would make the first line of a Python script a shebang like this:

#!/usr/bin/env python

print('I am a Python script with shebang')

then I would make the script executable with:

chmod +x script.py

Now, instead of running it with:

python script.py

the kernel knows which interpreter to use, so I don't have to tell it every time, and I can simply run it with:

script.py

if the directory it is located in is on my PATH. Or, if it is not on my PATH, I'd need:

/path/to/script.py
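
Alternatively, you can add the script's directory to your PATH so it can be run by name from anywhere. A sketch, with /home/me/scripts as an illustrative directory (put the line in ~/.bashrc to make it permanent):

export PATH="$PATH:/home/me/scripts"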

Run multiple python scripts concurrently

With Bash:

python script1.py &
python script2.py &

That's the entire script. It will run the two Python scripts at the same time.

Python could do the same thing itself, but it would take a lot more typing and is a bad choice for the problem at hand.

I think it's possible though that you are taking the wrong approach to solving your problem, and I'd like to hear what you're getting at.

Running multiple python scripts sequentially with nohup

What you probably want is the following: remove the nohup from the commands inside your shell script, and instead run the overarching script that you show here (i.e., all the code; let's call it iterations.bash) with nohup and in the background: nohup bash iterations.bash >& iterations.log &. Now you have your command line back, while the processes inside the script run sequentially.
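
For illustration, a minimal sketch of what iterations.bash itself might look like (the script names are hypothetical, since the original script is not shown; the point is that there is no nohup and no & inside it):

#!/bin/bash
# Each command starts only after the previous one has finished.
python script1.py
python script2.py
python script3.py

Launched with nohup bash iterations.bash >& iterations.log &, the whole sequence keeps running after you log out, and all output is collected in iterations.log.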

How to run multiple python scripts simultaneously from a wrapper script in such a way that CPU utilization is maximized?

This is a technique I developed for calling many external programs using subprocess.Popen. In this example, I'm calling ImageMagick's convert to make JPEG images from DICOM files.

In short, it uses manageprocs to keep checking a list of running subprocesses. If one has finished, it is removed from the list, and a new one is started as long as unprocessed files remain. After that, the remaining processes are watched until they have all finished.

from datetime import datetime
from functools import partial
import argparse
import logging
import os
import subprocess as sp
import sys
import time


def main():
    """
    Entry point for dicom2jpg.
    """
    args = setup()
    if not args.fn:
        logging.error("no files to process")
        sys.exit(1)
    if args.quality != 80:
        logging.info(f"quality set to {args.quality}")
    if args.level:
        logging.info("applying level correction.")
    start_partial = partial(start_conversion, quality=args.quality, level=args.level)

    starttime = str(datetime.now())[:-7]
    logging.info(f"started at {starttime}.")
    # List of subprocesses
    procs = []
    # Do not launch more processes concurrently than your CPU has cores.
    # That will only lead to the processes fighting over CPU resources.
    maxprocs = os.cpu_count()
    # Launch and manage subprocesses for all files.
    for path in args.fn:
        while len(procs) == maxprocs:
            manageprocs(procs)
        procs.append(start_partial(path))
    # Wait for all subprocesses to finish.
    while len(procs) > 0:
        manageprocs(procs)
    endtime = str(datetime.now())[:-7]
    logging.info(f"completed at {endtime}.")


def start_conversion(filename, quality, level):
    """
    Convert a DICOM file to a JPEG file.

    Removes the blank areas from the Philips detector.

    Arguments:
        filename: name of the file to convert.
        quality: JPEG quality to apply.
        level: Boolean to indicate whether level adjustment should be done.

    Returns:
        Tuple of (input filename, output filename, subprocess.Popen)
    """
    outname = filename.strip() + ".jpg"
    size = "1574x2048"
    args = [
        "convert",
        filename,
        "-units",
        "PixelsPerInch",
        "-density",
        "300",
        "-depth",
        "8",
        "-crop",
        size + "+232+0",
        "-page",
        size + "+0+0",
        "-auto-gamma",
        "-quality",
        str(quality),
    ]
    if level:
        args += ["-level", "-35%,70%,0.5"]
    args.append(outname)
    proc = sp.Popen(args, stdout=sp.DEVNULL, stderr=sp.DEVNULL)
    return (filename, outname, proc)


def manageprocs(proclist):
    """Check a list of subprocesses for processes that have ended and
    remove them from the list.

    Arguments:
        proclist: List of tuples. The last item in the tuple must be
            a subprocess.Popen object.
    """
    # Iterate over a copy, since removing items from a list while
    # iterating over it directly would skip elements.
    for item in proclist[:]:
        filename, outname, proc = item
        if proc.poll() is not None:
            logging.info(f"conversion of “{filename}” to “{outname}” finished.")
            proclist.remove(item)
    # Since manageprocs is called from a loop, keep CPU usage down.
    time.sleep(0.05)


if __name__ == "__main__":
    main()

I've left out setup(); it's using argparse to deal with command-line arguments.

Here the thing to be processed is just a list of file names.
But it could also be (in your case) a list of tuples of script names and arguments.
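
If you don't need the per-file bookkeeping shown above and just want to run a fixed set of scripts with as many parallel processes as there are CPU cores, GNU xargs can do the throttling from a shell script instead. A sketch, with hypothetical script names:

#!/bin/bash
# -n 1 passes one script name per python invocation; -P limits the
# number of simultaneous processes to the number of CPU cores, and
# xargs starts the next script as soon as a running one finishes.
printf '%s\n' script1.py script2.py script3.py script4.py |
    xargs -n 1 -P "$(nproc)" python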

Cron activate virtualenv and run multiple python scripts from shell script

Remember that & means to run the entire previous command asynchronously. This includes anything before a &&. Commands that run asynchronously run in separate processes.

To take a simplified example of your problem, let's say we asynchronously change directories and run pwd, and then run pwd again in the foreground.

#!/bin/sh

cd / && \
pwd \
& pwd

On my computer, this outputs:

/home/nick
/

The cd / was meant to affect both pwd calls, but it only affected the first one: the asynchronous command runs in a separate process, so the cd never changed the directory of the shell that ran the second pwd. (They also printed out of order in this case, the second one first.)

So, how can you write this script in a more robust fashion?

First, I would turn on strict error handling with set -e. This exits the script as soon as any (non-asynchronous) command returns a non-zero exit code. Second, I would avoid the use of &&, because strict error handling already deals with this. Third, I would use wait at the end to make sure the script doesn't exit until all of the sub-scripts have exited.

#!/bin/sh
set -e

cd /

pwd &
pwd &

wait

The general idea is that you turn on strict error handling, do all of your setup in a synchronous fashion, then launch your four scripts asynchronously, and wait for all to finish.

To apply this to your program:

#!/bin/sh
set -e
cd /home/ubuntu/virtualenvironment/scripts
. /home/ubuntu/venv3.8/bin/activate  # "." is the POSIX form of "source", which /bin/sh may not support

python script1.py &
python script2.py &
python script3.py &
python script4.py &

wait

Run multiple python scripts with arguments in real time

Resolved this issue with GUI automation: See MultiPy on my GitHub.


