Calling a Python Script with Input Within a Python Script Using Subprocess

To call a Python script from another one using the subprocess module, pass it input, and capture its output:

#!/usr/bin/env python3
import os
import sys
from subprocess import check_output

script_path = os.path.join(get_script_dir(), 'a.py')
output = check_output([sys.executable, script_path],
                      input='\n'.join(['query 1', 'query 2']),
                      universal_newlines=True)

where the get_script_dir() function is defined here.

A more flexible alternative is to import module a and call a function to get the result (make sure a.py uses an if __name__ == "__main__" guard, to avoid running undesirable code on import):

#!/usr/bin/env python
import a # the dir with a.py should be in sys.path

result = [a.search(query) for query in ['query 1', 'query 2']]

You could use multiprocessing to run each query in a separate process (if performing a query is CPU-intensive, this may improve time performance):

#!/usr/bin/env python
from multiprocessing import freeze_support, Pool
import a

if __name__ == "__main__":
    freeze_support()
    pool = Pool()  # use all available CPUs
    result = pool.map(a.search, ['query 1', 'query 2'])

How to execute a Python script from another script with subprocess with full I/O?

The problem is that you are redirecting the stdout of to_run.py to a pipe, yet the input function in to_run.py needs to write its prompt to stdout, and that seems to be causing the problem. The following code demonstrates this and works around it by providing the input for the input function through a pipe for stdin. The communicate method is used to send the input. I have also specified universal_newlines=True so that the output data are strings that do not need to be decoded.

import subprocess

p = subprocess.Popen(['./to_run.py'], stdin=subprocess.PIPE,
                     stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                     universal_newlines=True)
stdout, stderr = p.communicate('Line 1\nLine 2\n')
print(stdout, end='')

Update

You can do it without using communicate by writing and reading directly to the pipes, but the docs have the following warning:

Warning Use communicate() rather than .stdin.write, .stdout.read or .stderr.read to avoid deadlocks due to any of the
other OS pipe buffers filling up and blocking the child process.

As far as being "interactive", my experience doing this is that you have to write all your stdin data up front, so it is not what I would consider particularly interactive. But at least you can process the output as it is produced:

import subprocess

p = subprocess.Popen(['./to_run.py'], stdin=subprocess.PIPE, stdout=subprocess.PIPE, universal_newlines=True)
p.stdin.write('Line 1\n')
p.stdin.write('Line 2\n')
p.stdin.close()
for out in iter(p.stdout.readline, ''):  # read rest of output
    print(out, end='')
p.stdout.close()
return_code = p.wait()

Update 2

In principle you can be completely interactive. But the problem with your particular to_run.py is that the input function writes its prompt without a terminating newline, so calling p.stdout.readline() in run.py hangs waiting for a newline character. If we modify to_run.py as follows, then everything works as expected:

to_run.py

#!/usr/bin/python3

import sys

print("HELLOO", flush=True)

print('Enter first line:', flush=True)
l1 = sys.stdin.readline()
print('Enter second line:', flush=True)
l2 = sys.stdin.readline()

print('First line:', l1, end='')
print('Second line:', l2, end='')

run.py

import subprocess

cmd = ['./to_run.py']
p = subprocess.Popen(cmd, stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                     universal_newlines=True)
print(p.stdout.readline(), end='') # HELLOO
print(p.stdout.readline(), end='') # Enter first line
p.stdin.write('Line 1\n')
p.stdin.flush()
print(p.stdout.readline(), end='') # Enter second line
p.stdin.write('Line 2\n')
p.stdin.flush()
p.stdin.close()
for out in iter(p.stdout.readline, ''):
    print(out, end='')
p.stdout.close()
return_code = p.wait()

The following code, using a separate thread to read from the stdout pipe, handles the situation where you can't use readline, as is the case when the input function is being used. The thread reads stdout one character at a time and writes each character to a Queue instance. The main thread has specialized routines that read from the queue: read_line emulates readline, and read_prompt looks for the expected prompt:

import subprocess
from threading import Thread
from queue import Queue

stdout = Queue()

def rdr_thread(pipe):
    while True:
        buf = pipe.read(1)
        if not buf:
            stdout.put(None)  # signal end of file
            return
        stdout.put(buf[0])

def read_line():
    line = ''
    while True:
        ch = stdout.get()
        line += ch
        if ch == '\n':
            return line

def read_prompt():
    line = ''
    while True:
        ch = stdout.get()
        line += ch
        if line[-2:] == ': ':
            return line

def output_rest():
    while True:
        ch = stdout.get()
        if ch is None:
            return
        print(ch, end='')


cmd = ['./to_run.py']
p = subprocess.Popen(cmd, stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                     universal_newlines=True)
t = Thread(target=rdr_thread, args=(p.stdout,))
t.start()
print(read_line(), end='') # HELLOO
print(read_prompt()) # 'Enter first line: '
p.stdin.write('Line 1\n')
p.stdin.flush()
print(read_prompt()) # 'Enter second line: '
p.stdin.write('Line 2\n')
p.stdin.flush()
p.stdin.close()
output_rest()
p.stdout.close()
return_code = p.wait()
t.join()

Generic Prompt Handling Using Timeout

import subprocess
from threading import Thread
from queue import Queue, Empty

stdout = Queue()

eof = False

def rdr_thread(pipe):
    global eof          # without this, the assignment below would only
    while True:         # create a local variable and read_prompt would
        buf = pipe.read(1)  # never see the end-of-file flag
        if not buf:
            stdout.put(None)  # signal end of file
            eof = True
            return
        stdout.put(buf[0])


def read_prompt():
    """
    Read until there seems to be temporarily no more output.
    """
    if eof:
        return ''
    line = ''
    try:
        while True:
            ch = stdout.get(timeout=.5)
            if ch is None:
                break
            line += ch
    except Empty:
        pass
    return line



cmd = ['./to_run.py']
p = subprocess.Popen(cmd, stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                     universal_newlines=True)
t = Thread(target=rdr_thread, args=(p.stdout,))
t.start()
print(read_prompt(), end='')
p.stdin.write('Line 1\n')
p.stdin.flush()
print(read_prompt(), end='')
p.stdin.write('Line 2\n')
p.stdin.flush()
p.stdin.close()
for chunk in iter(read_prompt, ''):
    print(chunk, end='')
p.stdout.close()
return_code = p.wait()
t.join()

Launch a python script from another script, with parameters in subprocess argument

The subprocess library is interpreting all of your arguments, including demo_oled_v01.py, as a single argument to python. That's why python is complaining that it cannot locate a file with that name. Try running it as:

p = subprocess.Popen(['python', 'demo_oled_v01.py', '--display', 'ssd1351',
                      '--width', '128', '--height', '128', '--interface', 'spi',
                      '--gpio-data-command', '20'])

See more information on Popen here.
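If your command starts out as a single string (for example, copied from a terminal), shlex.split can be used to turn it into the argument list that Popen expects; it honours shell-style quoting. A minimal sketch (the command here is the one from the example above, shortened):

```python
import shlex

# Convert a shell-style command string into the argv list Popen expects.
cmd = "python demo_oled_v01.py --display ssd1351 --width 128 --height 128"
args = shlex.split(cmd)
print(args)
```

This avoids hand-splitting the string and keeps quoted arguments (such as filenames with spaces) intact.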

Run Python script within Python by using `subprocess.Popen` in real time

Last night I've set out to do this using a pipe:

import os
import subprocess

with open("test2", "w") as f:
    f.write("""import time
print('start')
time.sleep(2)
print('done')""")

(readend, writeend) = os.pipe()

p = subprocess.Popen(['python3', '-u', 'test2'], stdout=writeend, bufsize=0)
still_open = True
output = ""
output_buf = os.read(readend, 1).decode()
while output_buf:
    print(output_buf, end="")
    output += output_buf
    if still_open and p.poll() is not None:
        os.close(writeend)
        still_open = False
    output_buf = os.read(readend, 1).decode()

This forces buffering out of the picture and reads one character at a time (to make sure we do not block writes from the process having filled a buffer), closing the writing end when the process finishes so that read catches the EOF correctly. Having looked at subprocess, though, that turned out to be a bit of overkill. With PIPE you get most of that for free, and I ended up with the following, which seems to work fine (call read as many times as necessary to keep emptying the pipe). Assuming the process has finished, you do not have to worry about polling it and/or making sure the write end of the pipe is closed to correctly detect EOF and get out of the loop:

p = subprocess.Popen(['python3', '-u', 'test2'],
                     stdout=subprocess.PIPE, bufsize=1,
                     universal_newlines=True)
output = ""
output_buf = p.stdout.readline()
while output_buf:
    print(output_buf, end="")
    output += output_buf
    output_buf = p.stdout.readline()

This is a bit less "real-time" as it is basically line buffered.

Note: I've added -u to your Python call, as you also need to make sure the called process's buffering does not get in the way.
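For finer-grained output than line buffering, the character-at-a-time idea from the os.pipe version can also be done directly on subprocess.PIPE. A self-contained sketch (it uses an inline -c child instead of the test2 file above, so the snippet runs on its own):

```python
import subprocess
import sys

# Read the child's stdout one character at a time via subprocess.PIPE.
# -u disables the child's own buffering; read(1) returns '' only at EOF.
p = subprocess.Popen([sys.executable, '-u', '-c',
                      "print('start'); print('done')"],
                     stdout=subprocess.PIPE, universal_newlines=True)
output = ""
ch = p.stdout.read(1)
while ch:
    print(ch, end="")
    output += ch
    ch = p.stdout.read(1)
p.wait()
```

Because EOF on the pipe arrives as soon as the child exits, no polling or explicit closing of a write end is needed here.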

Python; how to properly call another python script as a subprocess

Every time a program is started, it receives a list of arguments it was invoked with. This is often called argv (the v stands for vector, i.e. a one-dimensional array). The program parses this list and extracts options, parameters, filenames, etc., depending on its own invocation syntax.

When working at the command line, the shell takes care of parsing the input line, starting the new program or programs, and passing them their argument lists.

When a program is called from another program, the caller is responsible for providing the correct arguments. It could delegate this work to the shell, but the price is high: substantial overhead and, possibly, a security risk. Avoid this approach whenever possible.
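The difference is easy to demonstrate. In the list form, each element of the list arrives in the child as exactly one argv entry, spaces and all. The sketch below uses a hypothetical inline child (via -c) that just echoes its argument list back:

```python
import subprocess
import sys

# Each list element becomes one argv entry in the child; no shell is
# involved, so the space inside 'a b' survives as part of one argument.
child = "import sys; print(sys.argv[1:])"
out = subprocess.run([sys.executable, '-c', child, 'a b', 'c'],
                     capture_output=True, text=True)
print(out.stdout, end='')
```

With shell=True and a single command string, the shell would instead re-split that string on whitespace before the child ever sees it.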

Finally to the question itself:

shpfiles = 'shapefile_a.shp shapefile_b.shp'
subprocess.call(['python', 'shapemerger.py', '%s' % shpfiles])

This will invoke python to run the script shapemerger.py with a single argument: shapefile_a.shp shapefile_b.shp. The script expects filenames and receives this one name. The file "shapefile_a.shp shapefile_b.shp" does not exist, but the script probably stops before attempting to access it, because it expects 2 or more files to process.

The correct way is to pass every filename as one argument. Assuming shpfiles is a whitespace separated list:

subprocess.call(['python', 'shapemerger.py'] + shpfiles.split())

will generate a list with 4 items. It is important to understand that this approach will fail if there is a space in a filename.
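The space-in-filename problem disappears entirely if the filenames are kept in a real list from the start, rather than joined into one space-separated string. A sketch, using a hypothetical inline child that counts the arguments it receives:

```python
import subprocess
import sys

# Keep filenames in a list; names containing spaces pass through intact.
shpfiles = ['shapefile a.shp', 'shapefile_b.shp']   # note the space
child = "import sys; print(len(sys.argv) - 1)"      # counts args received
out = subprocess.run([sys.executable, '-c', child] + shpfiles,
                     capture_output=True, text=True)
print(out.stdout, end='')   # the child sees exactly 2 arguments
```

The split() approach would have turned 'shapefile a.shp' into two bogus arguments; passing the list directly does not.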

Run a Python script from another Python script, passing in arguments

Try using os.system:

os.system("script2.py 1")

execfile is different because it is designed to run a sequence of Python statements in the current execution context. That's why sys.argv didn't change for you.
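A subprocess-based equivalent gives the child its own sys.argv, which is exactly what execfile could not do. Since script2.py is not shown here, this sketch stands in a hypothetical inline child that prints its first argument:

```python
import subprocess
import sys

# Run a child with its own argv; '1' arrives as the child's sys.argv[1].
child = "import sys; print(sys.argv[1])"
result = subprocess.run([sys.executable, '-c', child, '1'],
                        capture_output=True, text=True)
print(result.stdout, end='')
```

Using sys.executable also avoids depending on the script being executable or on a shebang line being honoured.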

Calling a python script with arguments using subprocess

When you pass parameters to a new process they are passed positionally; the names from the parent process do not survive, only the values. You need to add:

import sys

def main():
    if len(sys.argv) == 6:
        project, profile, reader, file, loop = sys.argv[1:]
    else:
        raise ValueError("incorrect number of arguments")

    p = loading(project, profile, reader, file, loop)
    p.csv_generation()

We test the length of sys.argv before the assignment (the first element is the name of the program).

Using subprocess to run Python script on Windows

Just found sys.executable: the full path to the current Python executable, which can be used to run the script (instead of relying on the shebang line, which obviously doesn't work on Windows).

import sys
import subprocess

theproc = subprocess.Popen([sys.executable, "myscript.py"])
theproc.communicate()
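On Python 3.5+, subprocess.run wraps the Popen + communicate pair shown above; sys.executable works the same way with it. A sketch (an inline -c child stands in for myscript.py, which is not shown here):

```python
import subprocess
import sys

# subprocess.run starts the child, waits for it, and returns a
# CompletedProcess with the return code and captured output.
proc = subprocess.run([sys.executable, '-c', 'print("hello")'],
                      capture_output=True, text=True)
print(proc.returncode, proc.stdout, end='')
```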

