Pipe Subprocess Standard Output to a Variable

To capture the output of ls, pass stdout=subprocess.PIPE:

>>> import subprocess
>>> proc = subprocess.Popen('ls', stdout=subprocess.PIPE)
>>> output = proc.stdout.read()
>>> print output
bar
baz
foo

The command cdrecord --help writes to stderr, so you need to pipe that instead. You should also break the command into a list of tokens as I've done below; the alternative is to pass the shell=True argument, but that fires up a full-blown shell, which can be dangerous if you don't control the contents of the command string.

>>> proc = subprocess.Popen(['cdrecord', '--help'], stderr=subprocess.PIPE)
>>> output = proc.stderr.read()
>>> print output
Usage: wodim [options] track1...trackn
Options:
-version print version information and exit
dev=target SCSI target to use as CD/DVD-Recorder
gracetime=# set the grace time before starting to write to #.
...

If you have a command that writes to both stdout and stderr and you want to merge them, redirect stderr into stdout and then read stdout:

subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
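
For example, a minimal sketch reusing the cdrecord command from above (and using communicate(), as recommended just below):

import subprocess

proc = subprocess.Popen(['cdrecord', '--help'],
                        stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
merged, _ = proc.communicate()  # stderr lines arrive interleaved with stdout
print merged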

As mentioned by Chris Morgan, you should be using proc.communicate() rather than reading the pipes directly with proc.stdout.read().

>>> proc = subprocess.Popen(['cdrecord', '--help'], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
>>> out, err = proc.communicate()
>>> print 'stdout:', out
stdout:
>>> print 'stderr:', err
stderr:Usage: wodim [options] track1...trackn
Options:
-version print version information and exit
dev=target SCSI target to use as CD/DVD-Recorder
gracetime=# set the grace time before starting to write to #.
...

Python subprocess Popen stdout to variable only

Try this:

import subprocess
p1 = subprocess.Popen(["fping", "-C10", "-B1", "-p500", "-r1", "-s", "172.29.172.106"],
                      stdout=subprocess.PIPE, stderr=subprocess.PIPE)
output, error = p1.communicate()

Store output of subprocess.Popen call in a string

In Python 2.7 or Python 3

Instead of making a Popen object directly, you can use the subprocess.check_output() function to store output of a command in a string:

from subprocess import check_output
out = check_output(["ntpq", "-p"])
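
Note that on Python 3, check_output() returns bytes; decode it if you need a str:

from subprocess import check_output

out = check_output(["ntpq", "-p"])  # bytes on Python 3, str on Python 2
text = out.decode()                 # decode to str for text processing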

In Python 2.4-2.6

Use the communicate method.

import subprocess
p = subprocess.Popen(["ntpq", "-p"], stdout=subprocess.PIPE)
out, err = p.communicate()

out is what you want.

Important note about the other answers

Note how I passed in the command. The "ntpq -p" example brings up another matter: since Popen does not invoke the shell, you pass the command and its options as a list, ["ntpq", "-p"].
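
To illustrate the difference, here is a minimal sketch (the shell=True form is shown only for contrast, and is only safe if you control the command string):

import subprocess

# Preferred: no shell is spawned; the command and its options are a list
p = subprocess.Popen(["ntpq", "-p"], stdout=subprocess.PIPE)

# Same result, but this starts a shell to parse the string
p = subprocess.Popen("ntpq -p", shell=True, stdout=subprocess.PIPE)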

Subprocess does not store output in variable

You forgot to tell Popen to capture stdout and stderr; by default it lets them go to the terminal. To fix this, tell it to capture both via pipes to the parent process:

myParentDevicesProcess = subprocess.Popen(['ls','-l','/sys/class/net/e*/device/virtfn*'], stdout=subprocess.PIPE, stderr=subprocess.PIPE)

Of course, that still won't work, because glob expansion is a feature of the shell, not of the ls command, and you're not running your command through a shell (nor should you). You can have Python do the glob expansion for you, roughly matching the shell's behavior (put import glob at the top of the file):

myParentDevicesProcess = subprocess.Popen(['ls','-l'] + glob.glob('/sys/class/net/e*/device/virtfn*'), stdout=subprocess.PIPE, stderr=subprocess.PIPE)
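
To then get the listing into a variable, finish with communicate(); for example:

import glob
import subprocess

proc = subprocess.Popen(['ls', '-l'] + glob.glob('/sys/class/net/e*/device/virtfn*'),
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = proc.communicate()  # out holds the ls listing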

Redirect subprocess to a variable as a string

If you're using 2.7, you can use subprocess.check_output():

>>> import subprocess
>>> output = subprocess.check_output(['echo', '640x360'])
>>> print output
640x360

If not:

>>> p = subprocess.Popen(['echo', '640x360'], stdout=subprocess.PIPE)
>>> p.communicate()
('640x360\n', None)
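
Since communicate() returns a (stdout, stderr) tuple, you can unpack it directly; stderr is None here because it wasn't piped:

>>> p = subprocess.Popen(['echo', '640x360'], stdout=subprocess.PIPE)
>>> out, _ = p.communicate()
>>> print out
640x360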

Retrieving the output of subprocess.call()

Output from subprocess.call() should only be redirected to files.

You should use subprocess.Popen() instead. Then you can pass subprocess.PIPE for the stderr, stdout, and/or stdin parameters and read from the pipes by using the communicate() method:

from subprocess import Popen, PIPE

p = Popen(['program', 'arg1'], stdin=PIPE, stdout=PIPE, stderr=PIPE)
output, err = p.communicate(b"input data that is passed to subprocess' stdin")
rc = p.returncode

The reasoning is that the file-like object used by subprocess.call() must have a real file descriptor, and thus implement the fileno() method. Just using any file-like object won't do the trick.
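
As a minimal sketch of the files-only route (the output.txt path is just an example):

import subprocess

# subprocess.call() can write to a real file object, which has a fileno(),
# but not to an in-memory file-like object such as StringIO
with open('output.txt', 'w') as f:
    rc = subprocess.call(['ls', '-l'], stdout=f)

with open('output.txt') as f:
    output = f.read()  # the command's output, now in a variable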


Displaying subprocess output to stdout and redirecting it

To save a subprocess' stdout to a variable for further processing, and to display it as it arrives while the child process is running:

#!/usr/bin/env python3
from io import StringIO
from subprocess import Popen, PIPE

with Popen('/path/to/script', stdout=PIPE, bufsize=1,
           universal_newlines=True) as p, StringIO() as buf:
    for line in p.stdout:
        print(line, end='')
        buf.write(line)
    output = buf.getvalue()
rc = p.returncode

Saving both the subprocess's stdout and stderr is more complex, because you have to consume both streams concurrently to avoid a deadlock:

stdout_buf, stderr_buf = StringIO(), StringIO()
rc = teed_call('/path/to/script', stdout=stdout_buf, stderr=stderr_buf,
               universal_newlines=True)
output = stdout_buf.getvalue()
...

where teed_call() is defined here.


Update: here's a simpler asyncio version.
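
A rough sketch of that simpler approach on Python 3.5+, using async/await and asyncio.gather() to tee both streams (an illustration, not necessarily the version that was linked):

import asyncio
import sys
from asyncio.subprocess import PIPE

async def tee_stream(stream, display, buf):
    # Forward each line to the terminal while also buffering it
    while True:
        line = await stream.readline()
        if not line:  # EOF
            break
        display.write(line)
        buf.append(line)

async def read_and_display(*cmd):
    process = await asyncio.create_subprocess_exec(*cmd, stdout=PIPE, stderr=PIPE)
    stdout, stderr = [], []
    await asyncio.gather(
        tee_stream(process.stdout, sys.stdout.buffer, stdout),
        tee_stream(process.stderr, sys.stderr.buffer, stderr))
    rc = await process.wait()
    return rc, b''.join(stdout), b''.join(stderr)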


Old version:

Here's a single-threaded solution based on the child_process.py example from tulip:

import asyncio
import sys
from asyncio.subprocess import PIPE

@asyncio.coroutine
def read_and_display(*cmd):
    """Read cmd's stdout and stderr while displaying them as they arrive."""
    # start the process
    process = yield from asyncio.create_subprocess_exec(*cmd,
            stdout=PIPE, stderr=PIPE)

    # read the child's stdout/stderr concurrently
    stdout, stderr = [], []  # stdout, stderr buffers
    tasks = {
        asyncio.Task(process.stdout.readline()): (
            stdout, process.stdout, sys.stdout.buffer),
        asyncio.Task(process.stderr.readline()): (
            stderr, process.stderr, sys.stderr.buffer)}
    while tasks:
        done, pending = yield from asyncio.wait(tasks,
                return_when=asyncio.FIRST_COMPLETED)
        assert done
        for future in done:
            buf, stream, display = tasks.pop(future)
            line = future.result()
            if line:  # not EOF
                buf.append(line)     # save for later
                display.write(line)  # display in terminal
                # schedule to read the next line
                tasks[asyncio.Task(stream.readline())] = buf, stream, display

    # wait for the process to exit
    rc = yield from process.wait()
    return rc, b''.join(stdout), b''.join(stderr)

The script runs the '/path/to/script' command and reads, line by line, both its stdout and stderr concurrently. The lines are printed to the parent's stdout/stderr correspondingly and saved as bytestrings for future processing. To run the read_and_display() coroutine, we need an event loop:

import os

if os.name == 'nt':
    loop = asyncio.ProactorEventLoop()  # for subprocess' pipes on Windows
    asyncio.set_event_loop(loop)
else:
    loop = asyncio.get_event_loop()
try:
    rc, *output = loop.run_until_complete(read_and_display("/path/to/script"))
    if rc:
        sys.exit("child failed with '{}' exit code".format(rc))
finally:
    loop.close()

