How can I read all available data from subprocess.Popen.stdout (non-blocking)?
Poking around, I found this really nice solution:
Persistent python subprocess
which avoids the blocking issue altogether by using fcntl
to set the subprocess pipes to non-blocking mode, with no auxiliary threads or polling required. I could be missing something, but it solved my interactive process control problem.
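For reference, a minimal sketch of that fcntl approach (POSIX only; the child command below is just an illustration, not part of the linked answer):

```python
import fcntl
import os
import subprocess
import time

# Illustrative long-running child; replace with your own command.
p = subprocess.Popen(
    ["sh", "-c", "echo hello; sleep 0.2; echo world"],
    stdout=subprocess.PIPE,
)

# Put the child's stdout pipe into non-blocking mode,
# so os.read() returns immediately instead of waiting.
fd = p.stdout.fileno()
flags = fcntl.fcntl(fd, fcntl.F_GETFL)
fcntl.fcntl(fd, fcntl.F_SETFL, flags | os.O_NONBLOCK)

output = b""
while True:
    try:
        chunk = os.read(fd, 4096)   # never blocks
        if not chunk:               # empty read means EOF: child closed the pipe
            break
        output += chunk
    except BlockingIOError:
        time.sleep(0.05)            # nothing available right now; do other work

p.wait()
print(output.decode(), end="")
```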
A non-blocking read on a subprocess.PIPE in Python
fcntl, select, asyncproc won't help in this case.

A reliable way to read a stream without blocking, regardless of operating system, is to use Queue.get_nowait():
import sys
from subprocess import PIPE, Popen
from threading import Thread

try:
    from queue import Queue, Empty
except ImportError:
    from Queue import Queue, Empty  # Python 2.x

ON_POSIX = 'posix' in sys.builtin_module_names

def enqueue_output(out, queue):
    for line in iter(out.readline, b''):
        queue.put(line)
    out.close()

p = Popen(['myprogram.exe'], stdout=PIPE, bufsize=1, close_fds=ON_POSIX)
q = Queue()
t = Thread(target=enqueue_output, args=(p.stdout, q))
t.daemon = True  # thread dies with the program
t.start()

# ... do other things here

# read line without blocking
try:
    line = q.get_nowait()  # or q.get(timeout=.1)
except Empty:
    print('no output yet')
else:  # got line
    ...  # do something with line
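To read all data currently available (what the question asks for), you can drain the queue in a loop until it is empty. A sketch, using a pre-filled queue as a stand-in for the reader thread's output:

```python
import queue

# Stand-in for the Queue fed by the enqueue_output reader thread above.
q = queue.Queue()
for item in [b"line 1\n", b"line 2\n"]:
    q.put(item)

# Drain everything currently queued without ever blocking.
lines = []
while True:
    try:
        lines.append(q.get_nowait())
    except queue.Empty:
        break  # queue exhausted; everything available has been read

print(b"".join(lines).decode(), end="")
```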
Constantly print Subprocess output while process is running
You can use iter to process lines as soon as the command outputs them: lines = iter(fd.readline, ""). Here's a full example showing a typical use case (thanks to @jfs for helping out):
from __future__ import print_function  # Only Python 2.x
import subprocess

def execute(cmd):
    popen = subprocess.Popen(cmd, stdout=subprocess.PIPE, universal_newlines=True)
    for stdout_line in iter(popen.stdout.readline, ""):
        yield stdout_line
    popen.stdout.close()
    return_code = popen.wait()
    if return_code:
        raise subprocess.CalledProcessError(return_code, cmd)

# Example
for path in execute(["locate", "a"]):
    print(path, end="")
Python: How to read stdout of subprocess in a nonblocking way
Check the select module:
import subprocess
import select
import time

x = subprocess.Popen(['/bin/bash', '-c', "while true; do sleep 5; echo yes; done"],
                     stdout=subprocess.PIPE)
y = select.poll()
y.register(x.stdout, select.POLLIN)

while True:
    if y.poll(1):
        print(x.stdout.readline())
    else:
        print("nothing here")
        time.sleep(1)
EDIT:
Threaded solution for non-POSIX systems:
import subprocess
from threading import Thread
import time

linebuffer = []
x = subprocess.Popen(['/bin/bash', '-c', "while true; do sleep 5; echo yes; done"],
                     stdout=subprocess.PIPE)

def reader(f, buffer):
    while True:
        line = f.readline()
        if line:
            buffer.append(line)
        else:
            break

t = Thread(target=reader, args=(x.stdout, linebuffer))
t.daemon = True
t.start()

while True:
    if linebuffer:
        print(linebuffer.pop(0))
    else:
        print("nothing here")
        time.sleep(1)
Reliable non blocking reads from subprocess stdout
readline reads a single line from the file-like PIPE object; to read all of it, simply wrap it in a while loop. You can also call sleep after each read to save CPU cycles.
Here is a simple example:
import subprocess

p = subprocess.Popen(
    ['ls', '-lat'],
    shell=False,
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
    stdin=subprocess.PIPE,
    universal_newlines=True,  # text mode, so readline() returns '' at EOF
)

while True:
    line = p.stdout.readline()
    if line == '':
        break
    print(line.strip())  # remove extra whitespace between lines
EDIT:
Whoa, sorry, I completely missed the part where you were trying to read input in that other process...

So, your other process looks something like:

print('Hello')
reply = raw_input()

(Note: "in" is a reserved keyword in Python, so it can't be used as a variable name.) The print actually sends the content to the file-like PIPE object you passed earlier, which has its own buffering mechanism. This behavior is explained in the print() function docs.
To solve this, simply add a sys.stdout.flush() between your print and raw_input:
import sys

print('Hello')
sys.stdout.flush()  # "flush" the output to our PIPE
reply = raw_input()
Get all output from subprocess in python
As I couldn't find a direct way to solve this problem, with the help of this reference, the output can be redirected to a text file and then read back.
import subprocess
import os
import tempfile

def execute_to_file(command):
    """
    Execute the command and redirect its output to a temp file,
    then read the file back.
    Useful for processes that spawn child processes.
    """
    temp_file = tempfile.NamedTemporaryFile(delete=False)
    temp_file.close()
    path = temp_file.name
    command = command + " > " + path
    proc = subprocess.run(command, shell=True, stdout=subprocess.PIPE,
                          stderr=subprocess.PIPE, universal_newlines=True)
    if proc.stderr:
        # if the command failed, return
        os.unlink(path)
        return
    with open(path, 'r') as f:
        data = f.read()
    os.unlink(path)
    return data

if __name__ == "__main__":
    path = "Somepath"
    command = 'ecls.exe /files ' + path
    print(execute_to_file(command))
Getting realtime output using subprocess
I tried this, and for some reason, while the code

for line in p.stdout:
    ...

buffers aggressively, the variant

while True:
    line = p.stdout.readline()
    if not line:
        break
    ...

does not. Apparently this is a known bug: http://bugs.python.org/issue3907 (The issue is now "Closed" as of Aug 29, 2018)
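A minimal, self-contained sketch of the non-buffering readline() variant above (the inline child script is just a stand-in for a real command):

```python
import subprocess
import sys

# Illustrative child run unbuffered (-u) so lines arrive as they are printed.
p = subprocess.Popen(
    [sys.executable, "-u", "-c", "for i in range(3): print('tick', i)"],
    stdout=subprocess.PIPE,
    universal_newlines=True,
)

lines = []
while True:
    line = p.stdout.readline()
    if not line:            # '' at EOF in text mode
        break
    lines.append(line.rstrip())
    print(line, end="")     # echo each line as soon as it arrives

p.wait()
```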
read subprocess stdout line by line
I think the problem is with the statement for line in proc.stdout, which reads the entire input before iterating over it. The solution is to use readline() instead:
# filters output
import subprocess

proc = subprocess.Popen(['python', 'fake_utility.py'], stdout=subprocess.PIPE)
while True:
    line = proc.stdout.readline()
    if not line:
        break
    # the real code does filtering here
    print("test:", line.rstrip())
Of course you still have to deal with the subprocess' buffering.
Note: according to the documentation, the solution with an iterator should be equivalent to using readline(), except for the read-ahead buffer, but (or exactly because of this) the proposed change did produce different results for me (Python 2.5 on Windows XP).
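On dealing with the subprocess's own buffering: when the child is itself a Python script, one option is to run it with the -u flag so its stdout is unbuffered. A sketch (the inline child script stands in for fake_utility.py):

```python
import subprocess
import sys

# Run the child with -u so its stdout is unbuffered and lines
# reach the pipe immediately instead of sitting in the child's buffer.
proc = subprocess.Popen(
    [sys.executable, "-u", "-c", "print('Line1'); print('Line2')"],
    stdout=subprocess.PIPE,
    universal_newlines=True,
)

seen = []
while True:
    line = proc.stdout.readline()
    if not line:
        break
    seen.append(line.rstrip())
proc.wait()
print(seen)
```

For non-Python children, tools like stdbuf on Linux serve a similar purpose.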
Python read from subprocess stdout and stderr separately while preserving order
Here's a solution based on selectors, but one that preserves order and streams variable-length characters (even single chars).

The trick is to use read1() instead of read().
import selectors
import subprocess
import sys

p = subprocess.Popen(
    ["python", "random_out.py"], stdout=subprocess.PIPE, stderr=subprocess.PIPE
)

sel = selectors.DefaultSelector()
sel.register(p.stdout, selectors.EVENT_READ)
sel.register(p.stderr, selectors.EVENT_READ)

while True:
    for key, _ in sel.select():
        data = key.fileobj.read1().decode()
        if not data:
            exit()
        if key.fileobj is p.stdout:
            print(data, end="")
        else:
            print(data, end="", file=sys.stderr)
If you want a test program, use this.
import sys
from time import sleep

for i in range(10):
    print(f" x{i} ", file=sys.stderr, end="")
    sleep(0.1)
    print(f" y{i} ", end="")
    sleep(0.1)