Running Multiple Bash Commands with Subprocess


Use shell=True in subprocess, and don't split the command with shlex.split:

import subprocess

command = "echo a; echo b"

ret = subprocess.run(command, capture_output=True, shell=True)

# before Python 3.7:
# ret = subprocess.run(command, stdout=subprocess.PIPE, shell=True)

print(ret.stdout.decode())

This prints:

a
b
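On Python 3.7+ you can also pass text=True so that stdout comes back as a str and the .decode() call goes away; a minimal sketch:

```python
import subprocess

# text=True (Python 3.7+) decodes stdout/stderr to str for you
ret = subprocess.run("echo a; echo b", capture_output=True,
                     shell=True, text=True)
print(ret.stdout)      # "a\nb\n", already a str
print(ret.returncode)  # 0
```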

How do I run multiple commands with python subprocess( ) without waiting for the end of each command?

Switch out subprocess.run(command_lst) for Popen(command_lst, shell=True) in each of your scripts and loop through the command list, as in the example below, to run the processes in parallel.

Here is how to use Popen to run processes in parallel, with arbitrary commands for simplicity.

from subprocess import Popen

commands = ['ls -l', 'date', 'which python']

processes = [Popen(cmd, shell=True) for cmd in commands]
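If you later need to block until all of them have finished (for example, before using their results), one sketch is to wait() on each Popen object after starting them all:

```python
from subprocess import Popen

commands = ['ls -l', 'date', 'which python']

# the list comprehension starts all processes before we wait on any of them
processes = [Popen(cmd, shell=True) for cmd in commands]

# block until every process has exited; collect the exit codes
exit_codes = [p.wait() for p in processes]
print(exit_codes)
```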

How do I execute multiple shell commands with a single python subprocess call?

Use a semicolon to chain them if they're independent.

For example, (Python 3)

>>> import subprocess
>>> result = subprocess.run('echo Hello ; echo World', shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
>>> result
CompletedProcess(args='echo Hello ; echo World', returncode=0, stdout=b'Hello\nWorld\n')

But technically that's not a pure Python solution, because of shell=True: the argument processing is actually done by the shell. (You can think of it as executing /bin/sh -c "$your_arguments".)

If you want a somewhat purer solution, you'll have to use shell=False and loop over your commands. As far as I know, there is no way to start multiple subprocesses with a single call to the subprocess module.
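A minimal sketch of that shell=False loop, using shlex.split to turn each command string into an argument list (the commands here are placeholders):

```python
import shlex
import subprocess

commands = ["echo Hello", "echo World"]

outputs = []
for cmd in commands:
    # shell=False (the default): pass an argument list, not a string
    result = subprocess.run(shlex.split(cmd), capture_output=True, text=True)
    outputs.append(result.stdout)

print("".join(outputs))  # Hello\nWorld\n
```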

multiple bash commands in a single line in python

You can use Python's subprocess library:

import subprocess

command = "cd /home/ ; ls -lrt abc* ; cp abc* /destination/ ; ...."

ret = subprocess.run(command, capture_output=True, shell=True)

print(ret.stdout.decode())

https://docs.python.org/3/library/subprocess.html
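One caveat: with ; every command runs even if an earlier one failed. If later steps depend on earlier ones (as the cp above depends on the cd), chaining with && stops at the first failure. A small sketch:

```python
import subprocess

# with "&&", "echo after" only runs if the first command succeeds;
# "false" always fails, so the chain stops there
ret = subprocess.run("false && echo after", capture_output=True,
                     shell=True, text=True)
print(ret.returncode)    # non-zero: the chain stopped at "false"
print(repr(ret.stdout))  # '' because "echo after" never ran
```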

Run multiple bash commands simultaneously in Python

You're just running one bash process with three commands in it.

If the commands aren't setting variables or depending on each other (otherwise you would not be able to parallelize them), you could create three subprocess.Popen instances instead:

import subprocess

commands = '''
bashcmd1
bashcmd2
bashcmd3
'''

# list comprehension, not a generator: all processes start immediately
processes = [subprocess.Popen(['/bin/bash', '-c', line], stdout=subprocess.PIPE)
             for line in commands.split("\n") if line]  # filter out blank lines

for process in processes:
    out, err = process.communicate()  # or just rc = process.wait()
    # print(out, err)

This first creates a list of Popen objects (a list, not a generator, so the processes start immediately), then calls communicate() on each to wait for completion (while the other processes keep running in the meantime).

The upside is that you can apply this technique to any script containing commands, and you don't need the shell's & (background) capability. It's also more portable, including on Windows, provided you use a ["cmd", "/c"] prefix instead of bash.
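As a hedged sketch of that portability point, you could pick the prefix at runtime (the 'echo hello' command is just a placeholder):

```python
import subprocess
import sys

# choose the shell prefix per platform; ["cmd", "/c"] is the Windows
# counterpart of ["/bin/bash", "-c"]
prefix = ['cmd', '/c'] if sys.platform == 'win32' else ['/bin/bash', '-c']

p = subprocess.Popen(prefix + ['echo hello'], stdout=subprocess.PIPE)
out, _ = p.communicate()
print(out)
```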

Running multiple bash commands with individual output files in a single subprocess call

A version of your code which tries to be as safe as possible might look like:

import subprocess
import shlex

if hasattr(shlex, 'quote'):
    quote = shlex.quote  # Python 3.3+
else:
    import pipes
    quote = pipes.quote  # Python 2

inner_cmds = [
    [
        'program',
        '--arg1', 'foo',
        '--arg2', 'bar',
        '--output', 'out1.txt'
    ], [
        'program',
        'out1.txt',
        '--arg3', 'baz',
        '--output', 'out2.csv'
    ]
]

inner_cmd_str = ' && '.join(' '.join(quote(word) for word in inner_cmd)
                            for inner_cmd in inner_cmds)
cmd = [
    'psrecord', inner_cmd_str,
    '--log', 'activity.txt',
    '--plot', 'plot.png'
]

p = subprocess.Popen(cmd)
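To see what the quoting buys you, here's a small sketch (the argument values are made up) of how shlex.quote protects a word containing a space when the commands are joined into one shell string:

```python
import shlex

# hypothetical argument list where one value contains a space
words = ['program', '--arg1', 'foo bar', '--output', 'out1.txt']
quoted = ' '.join(shlex.quote(w) for w in words)
print(quoted)  # program --arg1 'foo bar' --output out1.txt
```

Without the quoting, the shell would see "foo" and "bar" as two separate arguments.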

