How to Redirect Output with Subprocess in Python

How to redirect output with subprocess in Python?

UPDATE: os.system is discouraged, albeit still available in Python 3.


Use os.system:

os.system(my_cmd)
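
For example, the redirection can live inside the command string itself, since os.system hands it to the shell (the file names here are just placeholders):

import os

# Hypothetical command string; the shell performs the > redirection.
my_cmd = "cat file1 file2 > myfile"
os.system(my_cmd)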

If you really want to use subprocess, here's the solution (mostly lifted from the documentation for subprocess):

p = subprocess.Popen(my_cmd, shell=True)
os.waitpid(p.pid, 0)
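
On Python 3.5+ the usual spelling of the same thing is subprocess.run, which waits for the command itself; a minimal sketch, assuming my_cmd is still a shell command string:

import subprocess

# subprocess.run blocks until the command finishes and returns a
# CompletedProcess; shell=True lets the shell handle any redirection
# embedded in my_cmd.
result = subprocess.run(my_cmd, shell=True)
print(result.returncode)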

OTOH, you can avoid system calls entirely:

import shutil

with open('myfile', 'w') as outfile:
    for infile in ('file1', 'file2', 'file3'):
        shutil.copyfileobj(open(infile), outfile)
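
If you also want each input file closed as soon as it has been copied, a nested with block is a small variation on the same idea:

import shutil

with open('myfile', 'w') as outfile:
    for name in ('file1', 'file2', 'file3'):
        # Opening each input in its own with block closes it promptly.
        with open(name) as infile:
            shutil.copyfileobj(infile, outfile)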

How do I pipe a subprocess call to a text file?

If you want to write the output to a file, you can use the stdout argument of subprocess.call.

It takes either

  • None (the default, stdout is inherited from the parent (your script))
  • subprocess.PIPE (allows you to pipe from one command/process to another)
  • a file object or a file descriptor (what you want, to have the output written to a file)

You need to open a file with something like open and pass the object or file descriptor integer to call:

f = open("blah.txt", "w")
subprocess.call(["/home/myuser/run.sh", "/tmp/ad_xml", "/tmp/video_xml"], stdout=f)

I'm guessing any valid file-like object would work, like a socket (gasp :)), but I've never tried.

As marcog mentions in the comments, you might want to redirect stderr as well; you can send it to the same place as stdout with stderr=subprocess.STDOUT. Any of the values mentioned above works for stderr too, so you can also redirect the two streams to different places.
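
For example, here's a minimal sketch (run.sh and the file names are just placeholders) that sends stdout and stderr to separate files, or merges them into one:

import subprocess

# Separate files for stdout and stderr.
with open("out.txt", "w") as out, open("err.txt", "w") as err:
    subprocess.call(["/home/myuser/run.sh"], stdout=out, stderr=err)

# Or merge stderr into the same file as stdout.
with open("combined.txt", "w") as log:
    subprocess.call(["/home/myuser/run.sh"], stdout=log, stderr=subprocess.STDOUT)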

Output redirection to file using subprocess.Popen

I/O redirection with < and > is done by the shell. When you call subprocess.Popen() with a list as the first argument and without shell=True, the program is executed directly, without the shell parsing the command line. So you're executing the program and passing it the literal arguments < and >. It's as if you executed the shell command with the < and > characters quoted:

scriptname '<' infile.txt '>' outfile.txt

If you want to use the shell, you have to send it a single string (just like when using os.system()).

import shlex

data = subprocess.Popen(
    " ".join([shlex.quote(script.out), "<", shlex.quote(input_file[i]),
              ">", shlex.quote(output_file[i])]),
    shell=True)

Use shlex.quote() to escape arguments that shouldn't be treated as shell metacharacters.
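
Alternatively, you can avoid the shell entirely and let subprocess wire up the redirection by passing file objects for stdin and stdout; a sketch reusing the same hypothetical variables as above:

import subprocess

# No shell involved: subprocess connects the file objects to the child's
# stdin and stdout directly.
with open(input_file[i], "rb") as fin, open(output_file[i], "wb") as fout:
    proc = subprocess.Popen([script.out], stdin=fin, stdout=fout)
    proc.wait()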

python script using subprocess, redirect ALL output to file

There are several problems:

  • you should redirect both stdout and stderr
  • you should use unbuffered files if you want to mix normal print and the output of launched commands.

Something like this:

import sys, subprocess

# Note the 0 here (unbuffered file; Python 2 only -- Python 3 does not
# allow unbuffered text-mode files, see the Python 3 sketch below)
sys.stdout = open("mylog", "w", 0)

print "Hello"
print "-----"

subprocess.call(["./prog"],stdout=sys.stdout, stderr=sys.stdout)
print "-----"
subprocess.call(["./prog"],stdout=sys.stdout, stderr=sys.stdout)
print "-----"

print "End"

Pipe subprocess standard output to a variable

To get the output of ls, use stdout=subprocess.PIPE.

>>> proc = subprocess.Popen('ls', stdout=subprocess.PIPE)
>>> output = proc.stdout.read()
>>> print output
bar
baz
foo

The command cdrecord --help outputs to stderr, so you need to pipe that instead. You should also break the command up into a list of tokens as I've done below; the alternative is to pass shell=True, but that fires up a full-blown shell, which can be dangerous if you don't control the contents of the command string.

>>> proc = subprocess.Popen(['cdrecord', '--help'], stderr=subprocess.PIPE)
>>> output = proc.stderr.read()
>>> print output
Usage: wodim [options] track1...trackn
Options:
-version print version information and exit
dev=target SCSI target to use as CD/DVD-Recorder
gracetime=# set the grace time before starting to write to #.
...

If you have a command that outputs to both stdout and stderr and you want to merge them, you can do that by piping stderr to stdout and then catching stdout.

subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

As mentioned by Chris Morgan, you should be using proc.communicate() instead of reading from proc.stdout and proc.stderr directly.

>>> proc = subprocess.Popen(['cdrecord', '--help'], stdout=subprocess.PIPE, stderr=subprocess.PIPE)
>>> out, err = proc.communicate()
>>> print 'stdout:', out
stdout:
>>> print 'stderr:', err
stderr:Usage: wodim [options] track1...trackn
Options:
-version print version information and exit
dev=target SCSI target to use as CD/DVD-Recorder
gracetime=# set the grace time before starting to write to #.
...
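
On Python 3.7+ the same capture is shorter with subprocess.run; a sketch (cdrecord/wodim has to be installed for this particular command to produce anything):

import subprocess

# capture_output=True is shorthand for stdout=PIPE, stderr=PIPE;
# text=True decodes both streams to str.
proc = subprocess.run(['cdrecord', '--help'], capture_output=True, text=True)
print('stdout:', proc.stdout)
print('stderr:', proc.stderr)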

Redirect subprocess.run output to file

Open the file in write ('w') or append ('a') mode, and pass the resulting file object as the stdout argument of the subprocess.run call.

with open('outputfile.txt', 'w') as fd:
    subprocess.run(['grep', '-o', searchFor, filename], stdout=fd)
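
The same pattern works for appending and for folding stderr into the file; a sketch using the hypothetical searchFor and filename variables from above:

import subprocess

# 'a' appends instead of truncating; stderr=subprocess.STDOUT merges any
# error output into the same file.
with open('outputfile.txt', 'a') as fd:
    subprocess.run(['grep', '-o', searchFor, filename],
                   stdout=fd, stderr=subprocess.STDOUT)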

Python: How do I redirect this output?

sys.stdout is Python's idea of the parent process's output stream.

Here, however, you want to change the child's output stream.

subprocess.call and subprocess.Popen take named parameters for the output streams.

So open the file you want to output to and then pass that as the appropriate argument to subprocess.

f = open("outputFile","wb")
subprocess.call(argsArray,stdout=f)

Your talk of using >> suggests you are using shell=True, or think you are passing your arguments to the shell. Either way, it is better to use the list form of subprocess, which avoids an unnecessary shell process and any weirdness from the shell.
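
If the >> was meant to append to the output file, the non-shell equivalent is simply to open the file in append mode; a sketch reusing the argsArray placeholder:

import subprocess

# 'ab' appends to the file instead of truncating it, which is what the
# shell's >> operator would have done.
with open("outputFile", "ab") as f:
    subprocess.call(argsArray, stdout=f)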

EDIT:

So I downloaded RTMPDump and tried it out; it would appear that the messages are printed to stderr.

So with the following program, nothing appears on the program's own output, and the rtmpdump log goes into the stderr.txt file:

#!/usr/bin/env python

import os
import subprocess

RTMPDUMP="./rtmpdump"
assert os.path.isfile(RTMPDUMP)
command = [RTMPDUMP, '-r', 'rtmp://oxy.videolectures.net/video/',
           '-y', '2007/pascal/bootcamp07_vilanova/keller_mikaela/bootcamp07_keller_bss_01',
           '-a', 'video', '-s',
           'http://media.videolectures.net/jw-player/player.swf',
           '-w', 'ffa4f0c469cfbe1f449ec42462e8c3ba16600f5a4b311980bb626893ca81f388',
           '-x', '53910', '-o', 'test.flv']

stdout = open("stdout.txt", "wb")
stderr = open("stderr.txt", "wb")
subprocess.call(command, stdout=stdout, stderr=stderr)
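
A small variation, if you want the log files flushed and closed even when the call raises, is to use context managers (same command list):

# with blocks guarantee both log files are flushed and closed.
with open("stdout.txt", "wb") as out, open("stderr.txt", "wb") as err:
    subprocess.call(command, stdout=out, stderr=err)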

Displaying subprocess output to stdout and redirecting it

To save subprocess' stdout to a variable for further processing and to display it while the child process is running as it arrives:

#!/usr/bin/env python3
from io import StringIO
from subprocess import Popen, PIPE

with Popen('/path/to/script', stdout=PIPE, bufsize=1,
           universal_newlines=True) as p, StringIO() as buf:
    for line in p.stdout:
        print(line, end='')
        buf.write(line)
    output = buf.getvalue()
rc = p.returncode

Saving both the subprocess's stdout and stderr is more complex, because you have to consume both streams concurrently to avoid a deadlock:

stdout_buf, stderr_buf = StringIO(), StringIO()
rc = teed_call('/path/to/script', stdout=stdout_buf, stderr=stderr_buf,
               universal_newlines=True)
output = stdout_buf.getvalue()
...

where teed_call() is defined here.


Update: here's a simpler asyncio version.


Old version:

Here's a single-threaded solution based on the child_process.py example from tulip:

import asyncio
import sys
from asyncio.subprocess import PIPE

@asyncio.coroutine
def read_and_display(*cmd):
    """Read cmd's stdout, stderr while displaying them as they arrive."""
    # start process
    process = yield from asyncio.create_subprocess_exec(*cmd,
            stdout=PIPE, stderr=PIPE)

    # read child's stdout/stderr concurrently
    stdout, stderr = [], []  # stdout, stderr buffers
    tasks = {
        asyncio.Task(process.stdout.readline()): (
            stdout, process.stdout, sys.stdout.buffer),
        asyncio.Task(process.stderr.readline()): (
            stderr, process.stderr, sys.stderr.buffer)}
    while tasks:
        done, pending = yield from asyncio.wait(tasks,
                return_when=asyncio.FIRST_COMPLETED)
        assert done
        for future in done:
            buf, stream, display = tasks.pop(future)
            line = future.result()
            if line:  # not EOF
                buf.append(line)     # save for later
                display.write(line)  # display in terminal
                # schedule to read the next line
                tasks[asyncio.Task(stream.readline())] = buf, stream, display

    # wait for the process to exit
    rc = yield from process.wait()
    return rc, b''.join(stdout), b''.join(stderr)

The coroutine runs the '/path/to/script' command and reads both its stdout and stderr line by line, concurrently. The lines are printed to the parent's stdout/stderr correspondingly and saved as bytestrings for later processing. To run the read_and_display() coroutine, we need an event loop:

import os

if os.name == 'nt':
    loop = asyncio.ProactorEventLoop()  # for subprocess' pipes on Windows
    asyncio.set_event_loop(loop)
else:
    loop = asyncio.get_event_loop()
try:
    rc, *output = loop.run_until_complete(read_and_display("/path/to/script"))
    if rc:
        sys.exit("child failed with '{}' exit code".format(rc))
finally:
    loop.close()
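
On newer Python versions, where the @asyncio.coroutine / yield from style is deprecated and eventually removed, a rough async/await sketch of the same idea (not the author's code) might look like this:

import asyncio
import sys

async def read_and_display(*cmd):
    """Read cmd's stdout and stderr while displaying them as they arrive."""
    process = await asyncio.create_subprocess_exec(
        *cmd, stdout=asyncio.subprocess.PIPE, stderr=asyncio.subprocess.PIPE)

    async def tee(stream, display):
        # Collect lines while echoing them to the given binary stream.
        chunks = []
        async for line in stream:
            chunks.append(line)
            display.write(line)
            display.flush()
        return b''.join(chunks)

    # Read both pipes concurrently to avoid a deadlock, then reap the child.
    stdout, stderr = await asyncio.gather(
        tee(process.stdout, sys.stdout.buffer),
        tee(process.stderr, sys.stderr.buffer))
    rc = await process.wait()
    return rc, stdout, stderr

rc, out, err = asyncio.run(read_and_display("/path/to/script"))
if rc:
    sys.exit("child failed with '{}' exit code".format(rc))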

How do I redirect stdout to a file when using subprocess.call in python?

Pass a file as the stdout parameter to subprocess.call:

with open('out-file.txt', 'w') as f:
    subprocess.call(['program'], stdout=f)

