Python Popen Command. Wait Until the Command Is Finished


Depending on how you want your script to work, you have two options. If you want the command to block and do nothing else while it executes, you can just use subprocess.call.

# start and block until done; shell redirection (">") does not work inside
# an argument list, so open the output file and pass it as stdout
with open(diz['d'] + "/points.xml", "w") as outfile:
    subprocess.call([data["om_points"]], stdout=outfile)

If you want to do things while it executes, or feed data into stdin, you can use communicate() after the Popen call.

# start the process, do other work, then wait
with open(diz['d'] + "/points.xml", "w") as outfile:
    p = subprocess.Popen([data["om_points"]], stdout=outfile)
    print("Happens while running")
    p.communicate()  # now wait; you can also send data to the process here

As stated in the documentation, wait() can deadlock when the child fills an OS pipe buffer, so communicate() is advisable.
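The deadlock risk is easiest to see with a child that produces a lot of output on a pipe. A minimal sketch, using sys.executable to spawn a Python child as a stand-in command:

```python
import subprocess
import sys

# a child that writes more output than an OS pipe buffer holds; communicate()
# keeps draining the pipe while it waits, so neither side can deadlock
p = subprocess.Popen(
    [sys.executable, "-c", "print('x' * 200000)"],
    stdout=subprocess.PIPE, text=True,
)
out, _ = p.communicate()  # waits for exit and reads everything
print(len(out.strip()))   # 200000
```

With plain wait() and a PIPE for stdout, the same child could block forever once the pipe buffer fills, because nothing would be reading it.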

Python popen shell command wait till subprocess has finished

Use subprocess.Popen:

This module intends to replace several older modules and functions.

So in your case, import subprocess and then use Popen.communicate() to wait until your command has finished.

For the documentation on this, see the subprocess module reference.

So:

from subprocess import Popen, PIPE

def send_to_imagemagick(self, shell_command):
    try:
        # log.info('Shell command = {0}'.format(shell_command))
        description = Popen(shell_command, stdout=PIPE)
        out, _ = description.communicate()  # blocks until the command finishes
        # log.info('description = {0}'.format(description))
        return out
    except Exception as e:
        log.info('Error with Img cmd tool {0}'.format(e))

Wait for the first subprocess to finish

Here's a solution using psutil - which is aimed exactly at this use-case:

import subprocess
import psutil

a = subprocess.Popen(['/bin/sleep', "2"])
b = subprocess.Popen(['/bin/sleep', "4"])

procs_list = [psutil.Process(a.pid), psutil.Process(b.pid)]

def on_terminate(proc):
    print("process {} terminated".format(proc))

# waits for multiple processes to terminate
gone, alive = psutil.wait_procs(procs_list, timeout=3, callback=on_terminate)

Or, if you'd like to have a loop waiting for one of the processes to be done:

while True:
    gone, alive = psutil.wait_procs(procs_list, timeout=3, callback=on_terminate)
    if len(gone) > 0:
        break
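If pulling in psutil is not an option, a similar first-to-finish wait can be sketched with only the standard library by polling Popen.poll(); the sleep commands below are just placeholder children:

```python
import subprocess
import sys
import time

# two placeholder children with different run times
a = subprocess.Popen([sys.executable, "-c", "import time; time.sleep(0.2)"])
b = subprocess.Popen([sys.executable, "-c", "import time; time.sleep(2)"])
procs = [a, b]

first_done = None
while first_done is None:
    for p in procs:
        if p.poll() is not None:  # poll() returns the exit code once the child exits
            first_done = p
            break
    time.sleep(0.05)              # avoid a busy loop

print(first_done is a)  # the shorter sleep finishes first
b.wait()                # reap the remaining child
```

psutil.wait_procs is still the cleaner choice when available, since it handles the timeout and callback bookkeeping for you.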

Python subprocess.Popen() wait for completion

Use Popen.wait:

process = subprocess.Popen(["your_cmd"]...)
process.wait()

Alternatively, check_output or check_call also wait for the command to complete; which to use depends on what you want to do and your Python version.

If you are using Python >= 2.7 and you don't care about the output, just use check_call.

You can also use call, but that will not raise an error on a non-zero return code, which may or may not be desirable.
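The difference on failure is easy to demonstrate; here sys.executable spawns a Python child as a stand-in for a failing command:

```python
import subprocess
import sys

# call() just returns the exit status, even when it is non-zero
rc = subprocess.call([sys.executable, "-c", "raise SystemExit(3)"])
print(rc)  # 3

# check_call() raises CalledProcessError for a non-zero exit status
try:
    subprocess.check_call([sys.executable, "-c", "raise SystemExit(3)"])
except subprocess.CalledProcessError as e:
    print(e.returncode)  # 3
```

Both functions block until the child exits; the only difference is how a failure is reported back to you.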

Wait until all subprocesses finish?

A Popen object has a .wait() method defined exactly for this: to wait for the completion of a given subprocess (and, besides, for returning its exit status).

If you use this method, you'll prevent zombie processes from lying around for too long.

(Alternatively, you can use subprocess.call() or subprocess.check_call() for calling and waiting. If you don't need IO with the process, that might be enough. But it's probably not an option here, because the two subprocesses seem to be meant to run in parallel, which they won't with call()/check_call().)

If you have several subprocesses to wait for, you can do

exit_codes = [p.wait() for p in (p1, p2)]

(in Python 3 the tuple needs the parentheses inside a list comprehension)

which returns as soon as all subprocesses have finished. You then have a list of return codes which you maybe can evaluate.
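For example, with two short-lived children (spawned via sys.executable so the sketch is self-contained):

```python
import subprocess
import sys

p1 = subprocess.Popen([sys.executable, "-c", "raise SystemExit(0)"])
p2 = subprocess.Popen([sys.executable, "-c", "raise SystemExit(1)"])

# each wait() blocks until that process finishes and returns its exit status
exit_codes = [p.wait() for p in (p1, p2)]
print(exit_codes)  # [0, 1]
if any(exit_codes):
    print("at least one subprocess failed")
```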

Wait subprocess.run until completes its task

According to the Python documentation, subprocess.run waits for the process to end.

The problem is that ffmpeg overwrites the input file if the input and output files are the same and therefore the output video becomes unusable.
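A common workaround is to have ffmpeg write to a temporary file and swap it into place only after subprocess.run returns successfully. The sketch below substitutes a tiny Python child for the real ffmpeg invocation so it is self-contained; the paths are hypothetical:

```python
import os
import subprocess
import sys
import tempfile

src = os.path.join(tempfile.mkdtemp(), "video.mp4")  # stand-in input path
with open(src, "w") as f:
    f.write("original")

tmp = src + ".tmp"

# run() blocks until the child exits; check=True raises CalledProcessError
# on failure, so the original is only replaced after a successful "encode".
# A Python one-liner stands in for ffmpeg here and writes the output to tmp.
subprocess.run(
    [sys.executable, "-c",
     "import sys; open(sys.argv[1], 'w').write('converted')", tmp],
    check=True,
)
os.replace(tmp, src)  # swap the finished output into place

print(open(src).read())  # converted
```

With real ffmpeg, the same pattern means the input file is never its own output, so an interrupted or failed run can't clobber the source video.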

Making python wait until subprocess.call has finished its command

What I've found out (thanks to QuantumChris): robocopy returns control to my script even though I've used subprocess.run, which should have paused my script until it finished running.

I'm stalling the second robocopy by checking whether the files have been copied over to the destination folder before starting it. The issue is that if the last file is big, os.path.isfile() detects the file while it is still being copied. So the second robocopy starts, can't access that file because it is still in use by the first robocopy, and waits 30 secs before trying again. After the 30 secs it is able to access the file and copies it over.

What I would like to do now is make my last file an empty dummy file, which I don't mind being copied twice as it is empty. Robocopy seems to copy files over in ASCII order, so I've named the file ~~~~~.txt :D

Python subprocess() wait until the last command in a series command is done

Of course! Just store your Popen objects; then you can check them all for completion before moving on:

# create an empty list to add all of the popen objects to
processes = []

for chromosome in chrom:
    p1 = subprocess.Popen('grep \'{}\' {} > {}.{}.temp'.format(chromosome, bed1, sub1, chromosome), shell=True)
    p2 = subprocess.Popen('grep \'{}\' {} > {}.{}.temp'.format(chromosome, bed2, sub2, chromosome), shell=True)

    # stash the popen objects for later use
    processes.append(p1)
    processes.append(p2)

# before moving on, call wait() on all of the objects to ensure they're done
# this is a blocking call, so the loop won't complete until all processes have returned
for p in processes:
    p.wait()

# now do your post processing work
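As an aside, the same fan-out-then-wait pattern works without shell=True if each Popen is given a file object as stdout. A self-contained sketch (the file names and child commands are made up):

```python
import os
import subprocess
import sys
import tempfile

# launch several children in parallel, each redirected to its own file,
# then wait for all of them before post-processing
workdir = tempfile.mkdtemp()
processes = []
for i in range(3):
    out = open(os.path.join(workdir, "part.{0}.temp".format(i)), "w")
    # each child just prints its index; a real grep command would go here
    p = subprocess.Popen([sys.executable, "-c", "print({0})".format(i)], stdout=out)
    processes.append((p, out))

for p, out in processes:
    p.wait()     # blocks until this child exits
    out.close()

print(sorted(os.listdir(workdir)))  # ['part.0.temp', 'part.1.temp', 'part.2.temp']
```

Avoiding the shell sidesteps quoting problems with patterns and file names, at the cost of opening the output files yourself.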

