How to Use 'Subprocess' Command With Pipes

How to use `subprocess` command with pipes

To use a pipe character (`|`) inside a subprocess command line, you have to pass shell=True.

However, this isn't really advisable for various reasons, not least of which is security. Instead, create the ps and grep processes separately, and pipe the output from one into the other, like so:

import subprocess

# Connect `ps -A` to `grep process_name` without invoking a shell.
ps = subprocess.Popen(('ps', '-A'), stdout=subprocess.PIPE)
output = subprocess.check_output(('grep', 'process_name'), stdin=ps.stdout)
ps.stdout.close()  # allow ps to receive SIGPIPE if grep exits early
ps.wait()

In your particular case, however, the simple solution is to call subprocess.check_output(('ps', '-A')) and then str.find on the output.
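That simpler approach can be sketched directly; `'process_name'` below is a placeholder for whatever process you are looking for:

```python
import subprocess

# Capture the whole of `ps -A` and search it in Python, no grep needed.
output = subprocess.check_output(('ps', '-A')).decode()
if output.find('process_name') != -1:   # or simply: 'process_name' in output
    print('process is running')
```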

python subprocess.call and pipes

This is a class that will run a command with an arbitrary number of pipes:

pipeline.py

import shlex
import subprocess

class Pipeline(object):
    def __init__(self, command):
        self.command = command
        self.command_list = self.command.split('|')
        self.output = None
        self.errors = None
        self.status = None
        self.result = None

    def run(self):
        process_list = list()
        previous_process = None
        for command in self.command_list:
            args = shlex.split(command)
            if previous_process is None:
                process = subprocess.Popen(args, stdout=subprocess.PIPE)
            else:
                process = subprocess.Popen(args,
                                           stdin=previous_process.stdout,
                                           stdout=subprocess.PIPE)
            process_list.append(process)
            previous_process = process
        last_process = process_list[-1]
        self.output, self.errors = last_process.communicate()
        self.status = last_process.returncode
        self.result = (0 == self.status)
        return self.result

This example shows how to use the class:

harness.py

from pipeline import Pipeline

if __name__ == '__main__':
    command = '|'.join([
        "sort %s",
        "uniq",
        "sed -e 's/bigstring of word/ smaller /'",
        "column -t -s '=>'"
    ])
    command = command % 'sample.txt'
    pipeline = Pipeline(command)
    if not pipeline.run():
        print("ERROR: Pipeline failed")
    else:
        print(pipeline.output.decode())  # communicate() returns bytes

I created this sample file for testing:

sample.txt

word1>word2=word3
list1>list2=list3
a>bigstring of word=b
blah1>blah2=blah3

Output

a      smaller  b
blah1  blah2    blah3
list1  list2    list3
word1  word2    word3

Correct way in Python to use subprocess.run when PIPE is in the command

The correct way to do this is to limit yourself to the minimum amount of non-Python executables. In this case, echo isn't necessary, Python can do the work sed is doing, and can write the resulting file as well. A clean solution would be something like:

import subprocess

with open(f'{host}_{port}.cert', 'wb') as outf, \
     subprocess.Popen(['openssl', 's_client', '-connect', f'{host}:{port}'],
                      stdin=subprocess.DEVNULL, stdout=subprocess.PIPE,
                      stderr=subprocess.STDOUT) as proc:
    for line in proc.stdout:
        if b'-BEGIN CERTIFICATE-' in line:
            outf.write(line)
            break
    else:
        raise ValueError("BEGIN CERTIFICATE not found in output")

    for line in proc.stdout:
        outf.write(line)
        if b'-END CERTIFICATE-' in line:
            break
    else:
        raise ValueError("END CERTIFICATE not found in output")

I switched to subprocess.Popen instead since it allows you to process the output in a streaming fashion (the same way the shell pipes would work), but given the relatively small output, subprocess.run would likely work just fine too. The ValueErrors aren't strictly necessary, but I like having them there so you fail hard when the output isn't what you expect.
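For the in-memory variant, the marker-slicing logic can be factored into a small helper. `extract_cert` is an illustrative name, not part of any library, and the `sample` bytes below stand in for real openssl output:

```python
import subprocess  # needed only for the real openssl invocation shown below

def extract_cert(data: bytes) -> bytes:
    """Slice the PEM certificate block out of openssl's combined output."""
    begin = data.find(b'-----BEGIN CERTIFICATE-----')
    if begin == -1:
        raise ValueError("BEGIN CERTIFICATE not found in output")
    end_marker = b'-----END CERTIFICATE-----'
    end = data.find(end_marker, begin)
    if end == -1:
        raise ValueError("END CERTIFICATE not found in output")
    return data[begin:end + len(end_marker)] + b'\n'

# The real call (network access required) would look roughly like:
#   result = subprocess.run(['openssl', 's_client', '-connect', f'{host}:{port}'],
#                           stdin=subprocess.DEVNULL, stdout=subprocess.PIPE,
#                           stderr=subprocess.STDOUT)
#   pem = extract_cert(result.stdout)

sample = (b'handshake noise\n'
          b'-----BEGIN CERTIFICATE-----\nMIIB...\n-----END CERTIFICATE-----\n'
          b'more noise\n')
pem = extract_cert(sample)
```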

Subprocess call with pipe in it

You can do the following:

import subprocess

ps = subprocess.Popen(('ps', 'aux'), stdout=subprocess.PIPE)
output = subprocess.check_output(('grep', 'python'), stdin=ps.stdout)
ps.stdout.close()  # allow ps to receive SIGPIPE if grep exits early
ps.wait()

print(output)

Sending piped commands via python3 subprocess

The answer can be found in the subprocess documentation.

The functions from the subprocess module normally do not call a shell to interpret the commands, but rather invoke them directly with the given arguments. This behaviour can be overridden by using the argument shell=True (example from the Python documentation):

output = check_output("dmesg | grep hda", shell=True)

However, this is not advisable if the command and arguments are not fixed but depend on user input. In that case, the correct way to do this is to use two Popen calls and construct the pipeline by hand (code example again from the Python documentation):

p1 = Popen(["dmesg"], stdout=PIPE)
p2 = Popen(["grep", "hda"], stdin=p1.stdout, stdout=PIPE)
p1.stdout.close() # Allow p1 to receive a SIGPIPE if p2 exits.
output = p2.communicate()[0]
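The same wiring can be exercised end to end without dmesg (which often requires elevated privileges and rarely mentions hda on modern systems); here printf stands in as the producer, purely for demonstration:

```python
from subprocess import Popen, PIPE

p1 = Popen(['printf', 'sda ok\nhda err\n'], stdout=PIPE)
p2 = Popen(['grep', 'hda'], stdin=p1.stdout, stdout=PIPE)
p1.stdout.close()  # allow p1 to receive a SIGPIPE if p2 exits first
output = p2.communicate()[0]
print(output)
```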

How do I use subprocess.Popen to connect multiple processes by pipes?

You'd be a little happier with the following.

import subprocess

awk_sort = subprocess.Popen("awk -f script.awk | sort > outfile.txt",
                            stdin=subprocess.PIPE, shell=True)
awk_sort.communicate(b"input data\n")

Delegate part of the work to the shell. Let it connect two processes with a pipeline.

You'd be a lot happier rewriting 'script.awk' into Python, eliminating awk and the pipeline.

Edit. Some of the reasons for suggesting that awk isn't helping.

[There are too many reasons to respond via comments.]

  1. Awk is adding a step of no significant value. There's nothing unique about awk's processing that Python doesn't handle.

  2. The pipelining from awk to sort, for large sets of data, may improve elapsed processing time. For short sets of data, it has no significant benefit. A quick measurement of awk >file ; sort file against awk | sort will reveal whether concurrency helps. With sort, it rarely helps because sort is not a once-through filter.

  3. The simplicity of "Python to sort" processing (instead of "Python to awk to sort") prevents the exact kind of questions being asked here.

  4. Python -- while wordier than awk -- is also explicit where awk has certain implicit rules that are opaque to newbies, and confusing to non-specialists.

  5. Awk (like the shell script itself) adds Yet Another Programming language. If all of this can be done in one language (Python), eliminating the shell and the awk programming eliminates two programming languages, allowing someone to focus on the value-producing parts of the task.

Bottom line: awk can't add significant value. In this case, awk is a net cost; it added enough complexity that it was necessary to ask this question. Removing awk will be a net gain.
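As a sketch of what "rewriting into Python" might look like: assuming, hypothetically, that script.awk just extracts a field from each line, the whole awk | sort > outfile.txt stage collapses to a few lines. transform() is a placeholder for whatever script.awk really does:

```python
# Hypothetical replacement for "awk -f script.awk | sort > outfile.txt".
# transform() stands in for the real awk logic -- here, like awk '{print $2}'.

def transform(line):
    fields = line.split()
    return fields[1] if len(fields) > 1 else line

def awk_sort(lines, outfile):
    results = sorted(transform(line.rstrip('\n')) for line in lines)
    with open(outfile, 'w') as f:
        for item in results:
            f.write(item + '\n')

awk_sort(['b two\n', 'a one\n'], 'outfile.txt')  # writes "one\ntwo\n"
```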

Sidebar Why building a pipeline (a | b) is so hard.

When the shell is confronted with a | b it has to do the following.

  1. Fork a child process of the original shell. This will eventually become b.

  2. Build an OS pipe: not a Python subprocess.PIPE, but a call to os.pipe(), which returns two new file descriptors connected through a common kernel buffer. At this point the process has the stdin, stdout, and stderr of its parent, plus a file descriptor that will become "a's stdout" and "b's stdin".

  3. Fork a child. The child replaces its stdout with the new a's stdout. Exec the a process.

  4. The b child replaces its stdin with the new b's stdin. Exec the b process.

  5. The b child waits for a to complete.

  6. The parent is waiting for b to complete.

I think that the above can be used recursively to spawn a | b | c, but you have to implicitly parenthesize long pipelines, treating them as if they're a | (b | c).

Since Python has os.pipe(), os.exec() and os.fork(), and you can replace sys.stdin and sys.stdout, there's a way to do the above in pure Python. Indeed, you may be able to work out some shortcuts using os.pipe() and subprocess.Popen.

However, it's easier to delegate that operation to the shell.
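The sidebar's steps can be sketched in pure Python with those os primitives. This is POSIX-only, and pipeline() is an illustrative helper, not a standard function:

```python
import os

def pipeline(a_cmd, b_cmd):
    """Run `a_cmd | b_cmd` with raw os primitives, roughly as a shell does."""
    r, w = os.pipe()                    # step 2: two fds sharing one buffer
    a_pid = os.fork()                   # step 3: child for a
    if a_pid == 0:
        os.dup2(w, 1)                   # a's stdout becomes the write end
        os.close(r); os.close(w)
        os.execvp(a_cmd[0], a_cmd)
    b_pid = os.fork()                   # step 4: child for b
    if b_pid == 0:
        os.dup2(r, 0)                   # b's stdin becomes the read end
        os.close(r); os.close(w)
        os.execvp(b_cmd[0], b_cmd)
    os.close(r); os.close(w)            # parent holds no pipe ends
    os.waitpid(a_pid, 0)                # steps 5/6: wait for both children
    _, status = os.waitpid(b_pid, 0)
    return status                       # raw wait status; 0 means clean exit

status = pipeline(['echo', 'hello world'], ['tr', 'a-z', 'A-Z'])
```

Closing both pipe ends in every process that does not use them is essential: grep-like consumers only see end-of-file once the last write descriptor is closed.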

Python subprocess: how to use pipes thrice?

Just add a third command following the same example:

p1 = subprocess.Popen(['convert', fileIn, 'bmp:-'], stdout=subprocess.PIPE)
p2 = subprocess.Popen(['mkbitmap', '-f', '2', '-s', '2', '-t', '0.48'],
                      stdin=p1.stdout, stdout=subprocess.PIPE)
p1.stdout.close()
p3 = subprocess.Popen(['potrace', '-t', '5', '-s', '-o', fileOut],
                      stdin=p2.stdout, stdout=subprocess.PIPE)
p2.stdout.close()

output = p3.communicate()[0]

How to call python 'subprocess' with pipe operator and with multiple parameters

So if you just want to feed the process a string and then read its output until the end, Popen.communicate can be used:

import subprocess

cmd = [
    '/opt/editUtility',
    '--append=configuration',
    '--user=userid',
    '1483485',
]

proc = subprocess.Popen(
    cmd,
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
)

stdoutData, stderrData = proc.communicate(b'Some data')  # bytes on Python 3
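On Python 3.7+, passing text=True lets communicate() accept and return str instead of bytes. In this sketch, `cat` is used purely as a stand-in command that echoes its input:

```python
import subprocess

# With text=True, communicate() handles encoding and decoding for you.
proc = subprocess.Popen(['cat'],
                        stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE,
                        text=True)
stdoutData, stderrData = proc.communicate('Some data')
```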

