Displaying subprocess output to stdout and redirecting it
To save a subprocess' stdout to a variable for further processing, while also displaying it as it arrives, while the child process is running:
#!/usr/bin/env python3
from io import StringIO
from subprocess import Popen, PIPE
with Popen('/path/to/script', stdout=PIPE, bufsize=1,
           universal_newlines=True) as p, StringIO() as buf:
    for line in p.stdout:
        print(line, end='')
        buf.write(line)
    output = buf.getvalue()
rc = p.returncode
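As a self-contained way to try the same pattern, substitute the Python interpreter for '/path/to/script' (the command here is purely illustrative):

```python
import sys
from io import StringIO
from subprocess import Popen, PIPE

cmd = [sys.executable, '-c', 'print("line 1"); print("line 2")']
with Popen(cmd, stdout=PIPE, bufsize=1,
           universal_newlines=True) as p, StringIO() as buf:
    for line in p.stdout:
        print(line, end='')  # display as it arrives
        buf.write(line)      # save for later
    output = buf.getvalue()
rc = p.returncode
```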
Saving both the subprocess's stdout and stderr is more complex, because you must consume both streams concurrently to avoid a deadlock:
stdout_buf, stderr_buf = StringIO(), StringIO()
rc = teed_call('/path/to/script', stdout=stdout_buf, stderr=stderr_buf,
               universal_newlines=True)
output = stdout_buf.getvalue()
...
where teed_call() is defined here.
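In case the linked teed_call() implementation is unavailable, here is a minimal thread-based sketch of what such a function could look like; the name and signature are inferred from the usage above, and it assumes text-mode pipes (pass universal_newlines=True):

```python
import sys
import threading
from subprocess import Popen, PIPE

def _tee(infile, buf, display):
    # copy every line from infile to both the buffer and the display
    for line in iter(infile.readline, ''):
        display.write(line)
        buf.write(line)
    infile.close()

def teed_call(cmd_args, *, stdout=None, stderr=None, **kwargs):
    """Run cmd_args, teeing each requested stream to a buffer and the console.

    Assumes text-mode pipes (pass universal_newlines=True in kwargs).
    """
    p = Popen(cmd_args,
              stdout=PIPE if stdout is not None else None,
              stderr=PIPE if stderr is not None else None,
              **kwargs)
    # one reader thread per stream, so both pipes drain concurrently
    threads = []
    if stdout is not None:
        threads.append(threading.Thread(
            target=_tee, args=(p.stdout, stdout, sys.stdout)))
    if stderr is not None:
        threads.append(threading.Thread(
            target=_tee, args=(p.stderr, stderr, sys.stderr)))
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return p.wait()
```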
Update: here's a simpler asyncio version.
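In case the linked version is unavailable, a modern async/await sketch along the same lines (assuming Python 3.7+, where asyncio.run() exists) might look like this:

```python
import asyncio
import sys
from asyncio.subprocess import PIPE

async def _read_stream(stream, buf, display):
    # read lines until EOF, saving each one and echoing it to the display
    while True:
        line = await stream.readline()
        if not line:
            break
        buf.append(line)
        display.write(line)
        display.flush()

async def read_and_display(*cmd):
    process = await asyncio.create_subprocess_exec(*cmd, stdout=PIPE, stderr=PIPE)
    stdout, stderr = [], []
    # drain both pipes concurrently to avoid a deadlock
    await asyncio.gather(
        _read_stream(process.stdout, stdout, sys.stdout.buffer),
        _read_stream(process.stderr, stderr, sys.stderr.buffer))
    rc = await process.wait()
    return rc, b''.join(stdout), b''.join(stderr)

rc, out, err = asyncio.run(
    read_and_display(sys.executable, '-c', 'print("hello")'))
```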
Old version:
Here's a single-threaded solution based on the child_process.py example from tulip:
import asyncio
import sys
from asyncio.subprocess import PIPE
@asyncio.coroutine
def read_and_display(*cmd):
    """Read cmd's stdout, stderr while displaying them as they arrive."""
    # start process
    process = yield from asyncio.create_subprocess_exec(*cmd,
            stdout=PIPE, stderr=PIPE)

    # read child's stdout/stderr concurrently
    stdout, stderr = [], []  # stdout, stderr buffers
    tasks = {
        asyncio.Task(process.stdout.readline()): (
            stdout, process.stdout, sys.stdout.buffer),
        asyncio.Task(process.stderr.readline()): (
            stderr, process.stderr, sys.stderr.buffer)}
    while tasks:
        done, pending = yield from asyncio.wait(tasks,
                return_when=asyncio.FIRST_COMPLETED)
        assert done
        for future in done:
            buf, stream, display = tasks.pop(future)
            line = future.result()
            if line:  # not EOF
                buf.append(line)     # save for later
                display.write(line)  # display in terminal
                # schedule to read the next line
                tasks[asyncio.Task(stream.readline())] = buf, stream, display

    # wait for the process to exit
    rc = yield from process.wait()
    return rc, b''.join(stdout), b''.join(stderr)
The script runs the '/path/to/script' command and reads both its stdout and stderr line by line, concurrently. The lines are printed to the parent's stdout/stderr correspondingly and saved as bytestrings for later processing. To run the read_and_display() coroutine, we need an event loop:
import os
if os.name == 'nt':
    loop = asyncio.ProactorEventLoop()  # for subprocess' pipes on Windows
    asyncio.set_event_loop(loop)
else:
    loop = asyncio.get_event_loop()
try:
    rc, *output = loop.run_until_complete(read_and_display("/path/to/script"))
    if rc:
        sys.exit("child failed with '{}' exit code".format(rc))
finally:
    loop.close()
Python output to Console within Subprocess from the child script
If stdout, stderr are redirected then you could try to print directly to the console:
try:  # Windows
    from msvcrt import putwch

    def print_to_console(message):
        for c in message:
            putwch(c)
        # newline
        putwch('\r')
        putwch('\n')
except ImportError:  # Unix
    import os

    fd = os.open('/dev/tty', os.O_WRONLY | os.O_NOCTTY)
    tty = os.fdopen(fd, 'w', 1)
    del fd

    def print_to_console(message, *, _file=tty):
        print(message, file=_file)
    del tty
Example:
print_to_console("Hello TTY!")
# -> Hello TTY!
Retrieving the output of subprocess.call()
Output from subprocess.call() should only be redirected to files. Use subprocess.Popen() instead; then you can pass subprocess.PIPE for the stderr, stdout, and/or stdin parameters and read from the pipes using the communicate() method:
from subprocess import Popen, PIPE
p = Popen(['program', 'arg1'], stdin=PIPE, stdout=PIPE, stderr=PIPE)
output, err = p.communicate(b"input data that is passed to subprocess' stdin")
rc = p.returncode
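A concrete, runnable variant of the same pattern, using the Python interpreter itself as the child (an arbitrary choice for illustration):

```python
import sys
from subprocess import Popen, PIPE

# a child that upper-cases whatever it reads on stdin
child_code = 'import sys; sys.stdout.write(sys.stdin.read().upper())'

p = Popen([sys.executable, '-c', child_code],
          stdin=PIPE, stdout=PIPE, stderr=PIPE)
output, err = p.communicate(b"input data that is passed to subprocess' stdin")
rc = p.returncode
```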
The reasoning is that the file-like object used by subprocess.call() must have a real file descriptor, and thus implement the fileno() method; just using any file-like object won't do the trick.
Output of subprocess both to PIPE and directly to stdout
This snippet has helped me once in a similar situation:
import sys
import subprocess

process = subprocess.Popen(cmd, bufsize=1, universal_newlines=True,
                           stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
for line in iter(process.stdout.readline, ''):
    print(line, end='')
    sys.stdout.flush()  # please see comments regarding the necessity of this line
process.wait()
errcode = process.returncode
Calling parent instance through child in subprocess python
There are a few ways to facilitate this sort of inter-process communication. One of the more common ways is with a first-in-first-out (FIFO) named pipe.
Here's a very basic demo.
parent.py
:
#! /usr/bin/env python3
import os, tempfile, subprocess
# make a temp directory that's automatically removed when we're done
# make a temp directory that's automatically removed when we're done
with tempfile.TemporaryDirectory() as dir:
    # create a FIFO in that directory
    fifo_path = os.path.join(dir, 'fifo')
    os.mkfifo(fifo_path)
    # start the child process
    proc = subprocess.Popen(['./child.py', '--fifo', fifo_path])
    print('process started')
    # open the FIFO
    with open(fifo_path, 'r') as fifo:
        # read output from the child process
        mid_output = fifo.readline()
        print(f'{mid_output = }')
    # wait for child to finish
    code = proc.wait()
    print('process finished')
child.py
:
#! /usr/bin/env python3
import argparse, time
# read FIFO path from command line
parser = argparse.ArgumentParser()
parser.add_argument('--fifo', required=True)
args = parser.parse_args()
# open FIFO (created by parent)
with open(args.fifo, 'w') as fifo:
    # simulate some work being done
    time.sleep(1)
    # tell the parent that progress has been made
    fifo.write('Halfway there!\n')
    fifo.flush()  # make sure to flush FIFOs
    # simulate some more work being done
    time.sleep(1)
Then, it runs like so:
./parent.py
process started
mid_output = 'Halfway there!\n'
process finished
You can have the child script output whatever it needs to say to the parent, and you can have it do so multiple times. Just make sure that the parent knows to read from the child the same number of times that the child writes.
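For example, a parent that reads until EOF handles any number of messages. Below is a minimal single-file sketch of that read loop, using a thread as a stand-in for the child process (Unix-only because of os.mkfifo):

```python
import os
import tempfile
import threading

def read_all(fifo_path, messages):
    # keep reading until the writer closes its end (EOF)
    with open(fifo_path, 'r') as fifo:
        for line in fifo:
            messages.append(line.rstrip('\n'))

with tempfile.TemporaryDirectory() as d:
    fifo_path = os.path.join(d, 'fifo')
    os.mkfifo(fifo_path)  # Unix-only

    messages = []
    t = threading.Thread(target=read_all, args=(fifo_path, messages))
    t.start()

    # stand-in for the child process: write several progress updates
    with open(fifo_path, 'w') as fifo:
        for msg in ['25%', '50%', '75%', 'done']:
            fifo.write(msg + '\n')
            fifo.flush()  # make sure to flush FIFOs

    t.join()
```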
How to write to stdout AND to log file simultaneously with Popen?
You can use a pipe to read the data from the program's stdout and write it to all the places you want:
import sys
import subprocess
logfile = open('logfile', 'w')
proc = subprocess.Popen(['cat', 'file'], stdout=subprocess.PIPE,
                        stderr=subprocess.STDOUT, universal_newlines=True)
for line in proc.stdout:
    sys.stdout.write(line)
    logfile.write(line)
proc.wait()
UPDATE
In Python 3, the universal_newlines parameter controls how pipes are used. If False, pipe reads return bytes objects and may need to be decoded (e.g., line.decode('utf-8')) to get a string. If True, Python does the decoding for you.
Changed in version 3.3: When universal_newlines is True, the class uses the encoding locale.getpreferredencoding(False) instead of locale.getpreferredencoding(). See the io.TextIOWrapper class for more information on this change.
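A small sketch of the difference (using the Python interpreter as the child, purely for illustration):

```python
import sys
from subprocess import Popen, PIPE

cmd = [sys.executable, '-c', 'print("hello")']

# universal_newlines=False (the default): pipe reads return bytes
with Popen(cmd, stdout=PIPE) as p:
    raw = p.stdout.read()
decoded = raw.decode('utf-8')

# universal_newlines=True: Python decodes (and normalizes newlines) for you
with Popen(cmd, stdout=PIPE, universal_newlines=True) as p:
    text = p.stdout.read()
```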