Getting realtime output using subprocess
I tried this, and for some reason while the code

    for line in p.stdout:
        ...

buffers aggressively, the variant

    while True:
        line = p.stdout.readline()
        if not line:
            break
        ...

does not. Apparently this is a known bug: http://bugs.python.org/issue3907 (the issue is now "Closed" as of Aug 29, 2018).
real time subprocess.Popen via stdout and PIPE
Your interpreter is buffering. Add a call to sys.stdout.flush() after your print statement.
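As a minimal illustration, here's a small `report()` helper (a name invented for this sketch) that flushes after every write, so each message appears in real time even when stdout is block-buffered (e.g. when piped):

```python
import io
import sys

def report(msg, stream=None):
    """Write one line and flush immediately so it shows up in real time,
    instead of sitting in the interpreter's output buffer."""
    stream = stream or sys.stdout
    stream.write(msg + "\n")
    stream.flush()

# Works the same on a real terminal or any file-like object:
captured = io.StringIO()
report("progress: 50%", captured)
```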
Displaying subprocess output to stdout and redirecting it
To save the subprocess' stdout to a variable for further processing, and to display it while the child process is running, as it arrives:

    #!/usr/bin/env python3
    from io import StringIO
    from subprocess import Popen, PIPE

    with Popen('/path/to/script', stdout=PIPE, bufsize=1,
               universal_newlines=True) as p, StringIO() as buf:
        for line in p.stdout:
            print(line, end='')
            buf.write(line)
        output = buf.getvalue()
    rc = p.returncode
To save both the subprocess' stdout and stderr is more complex, because you have to consume both streams concurrently to avoid a deadlock:

    stdout_buf, stderr_buf = StringIO(), StringIO()
    rc = teed_call('/path/to/script', stdout=stdout_buf, stderr=stderr_buf,
                   universal_newlines=True)
    output = stdout_buf.getvalue()
    ...

where teed_call() is defined here.
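The linked teed_call() definition isn't reproduced here, but a thread-based sketch of the idea might look like this (the real teed_call() may differ in its exact signature and behavior): each pipe gets its own thread, which echoes lines to the parent's stream and also copies them into an optional buffer, so both pipes are drained concurrently.

```python
import io
import subprocess
import sys
import threading

def teed_call(cmd, stdout=None, stderr=None, **kwargs):
    """Run cmd, echoing its output to the parent's stdout/stderr while also
    copying each stream into an optional file-like buffer (a sketch only)."""
    p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                         universal_newlines=True, **kwargs)

    def tee(src, display, buf):
        # Read until EOF; readline() returns '' once the child closes the pipe.
        for line in iter(src.readline, ''):
            display.write(line)
            display.flush()
            if buf is not None:
                buf.write(line)
        src.close()

    threads = [threading.Thread(target=tee, args=(p.stdout, sys.stdout, stdout)),
               threading.Thread(target=tee, args=(p.stderr, sys.stderr, stderr))]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return p.wait()

out_buf, err_buf = io.StringIO(), io.StringIO()
rc = teed_call([sys.executable, '-c',
                'import sys; print("out"); sys.stderr.write("err\\n")'],
               stdout=out_buf, stderr=err_buf)
```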
Update: here's a simpler asyncio version.

Old version:

Here's a single-threaded solution based on the child_process.py example from tulip:
    import asyncio
    import sys
    from asyncio.subprocess import PIPE

    @asyncio.coroutine
    def read_and_display(*cmd):
        """Read cmd's stdout, stderr while displaying them as they arrive."""
        # start process
        process = yield from asyncio.create_subprocess_exec(*cmd,
                stdout=PIPE, stderr=PIPE)

        # read child's stdout/stderr concurrently
        stdout, stderr = [], []  # stdout, stderr buffers
        tasks = {
            asyncio.Task(process.stdout.readline()): (
                stdout, process.stdout, sys.stdout.buffer),
            asyncio.Task(process.stderr.readline()): (
                stderr, process.stderr, sys.stderr.buffer)}
        while tasks:
            done, pending = yield from asyncio.wait(tasks,
                    return_when=asyncio.FIRST_COMPLETED)
            assert done
            for future in done:
                buf, stream, display = tasks.pop(future)
                line = future.result()
                if line:  # not EOF
                    buf.append(line)     # save for later
                    display.write(line)  # display in terminal
                    # schedule to read the next line
                    tasks[asyncio.Task(stream.readline())] = buf, stream, display

        # wait for the process to exit
        rc = yield from process.wait()
        return rc, b''.join(stdout), b''.join(stderr)
The script runs the '/path/to/script' command and reads both its stdout and stderr concurrently, line by line. The lines are printed to the parent's stdout/stderr correspondingly and saved as bytestrings for future processing. To run the read_and_display() coroutine, we need an event loop:
    import os

    if os.name == 'nt':
        loop = asyncio.ProactorEventLoop()  # for subprocess' pipes on Windows
        asyncio.set_event_loop(loop)
    else:
        loop = asyncio.get_event_loop()
    try:
        rc, *output = loop.run_until_complete(read_and_display("/path/to/script"))
        if rc:
            sys.exit("child failed with '{}' exit code".format(rc))
    finally:
        loop.close()
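The "simpler asyncio version" mentioned in the update isn't reproduced above, but on modern Python (3.8+) the same idea can be sketched with async/await and asyncio.gather; the read_stream() helper name here is invented for this sketch and may not match the linked version:

```python
import asyncio
import sys

async def read_stream(stream, display):
    """Echo each line from stream to display as it arrives, while buffering it."""
    chunks = []
    while True:
        line = await stream.readline()
        if not line:  # EOF
            break
        display.write(line)
        display.flush()
        chunks.append(line)
    return b''.join(chunks)

async def read_and_display(*cmd):
    proc = await asyncio.create_subprocess_exec(
        *cmd, stdout=asyncio.subprocess.PIPE, stderr=asyncio.subprocess.PIPE)
    # Drain both pipes concurrently to avoid a deadlock, then reap the child.
    stdout, stderr = await asyncio.gather(
        read_stream(proc.stdout, sys.stdout.buffer),
        read_stream(proc.stderr, sys.stderr.buffer))
    rc = await proc.wait()
    return rc, stdout, stderr

rc, out, err = asyncio.run(read_and_display(
    sys.executable, '-c', 'import sys; print("hi"); sys.stderr.write("oops\\n")'))
```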
Real time output of subprocess.popen() and not line by line
Could be two things...

It's likely that readline is changing some things in the output of your program. I believe \r is a carriage return: it tells the terminal to return to the beginning of the line so that the program can output over the top of the text it just printed. readline is most likely removing this.
First thing to try:

    import subprocess
    import sys

    p = subprocess.Popen(args, stdout=subprocess.PIPE,
                         stderr=subprocess.PIPE,
                         universal_newlines=True)
    for line in iter(p.stdout.readline, ""):
        sys.stdout.write('\r' + line[:-1])
        sys.stdout.flush()
You have to do the flush because stdout buffers until it gets a \n, and of course you're not writing one.
Constantly print Subprocess output while process is running
You can use iter to process lines as soon as the command outputs them: lines = iter(fd.readline, ""). Here's a full example showing a typical use case (thanks to @jfs for helping out):
    from __future__ import print_function  # Only Python 2.x
    import subprocess

    def execute(cmd):
        popen = subprocess.Popen(cmd, stdout=subprocess.PIPE, universal_newlines=True)
        for stdout_line in iter(popen.stdout.readline, ""):
            yield stdout_line
        popen.stdout.close()
        return_code = popen.wait()
        if return_code:
            raise subprocess.CalledProcessError(return_code, cmd)

    # Example
    for path in execute(["locate", "a"]):
        print(path, end="")
How can I read all availably data from subprocess.Popen.stdout (non blocking)?
Poking around, I found this really nice solution, Persistent python subprocess, which avoids the blocking issue altogether by using fcntl to set file attributes on the subprocess pipes to non-blocking mode; no auxiliary threads or polling required. I could be missing something, but it has solved my interactive process control problem.
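A minimal POSIX-only sketch of that fcntl approach (the set_nonblocking() helper name is invented here; the linked solution may differ in detail): once O_NONBLOCK is set on the pipe's descriptor, reads return immediately instead of blocking until data arrives.

```python
import fcntl
import os
import subprocess
import sys

def set_nonblocking(fd):
    """Add O_NONBLOCK to fd's flags so reads never block (POSIX only)."""
    flags = fcntl.fcntl(fd, fcntl.F_GETFL)
    fcntl.fcntl(fd, fcntl.F_SETFL, flags | os.O_NONBLOCK)

p = subprocess.Popen([sys.executable, '-c', 'print("ready")'],
                     stdout=subprocess.PIPE)
set_nonblocking(p.stdout.fileno())
p.wait()  # child has exited here, so its output is already sitting in the pipe

try:
    data = os.read(p.stdout.fileno(), 65536)  # returns immediately
except BlockingIOError:
    data = b''  # nothing available yet; a real loop would retry later
```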