Execute a Function After Flask Returns Response

The long story short is that Flask does not provide any special capabilities to accomplish this. For simple one-off tasks, consider Python's multithreading as shown below. For more complex configurations, use a task queue like RQ or Celery.

Why?

It's important to understand the functions Flask provides and why they do not accomplish the intended goal. All of these are useful in other cases and are good reading, but don't help with background tasks.

Flask's after_request handler

Flask's after_request handler, as detailed in this pattern for deferred request callbacks and this snippet on attaching different functions per request, will pass the response to the callback function. The intended use case is to modify the response, such as to attach a cookie.

Thus the request will wait around for these handlers to finish executing because the expectation is that the response itself will change as a result.
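For example, an after_request handler receives the outgoing response, must return it, and runs before anything is sent, so the client waits for it (a minimal sketch):

```python
from flask import Flask

app = Flask(__name__)

@app.after_request
def add_cookie(response):
    # Runs synchronously before the response is sent; the client waits for this.
    response.set_cookie('seen', '1')
    return response

@app.route('/')
def index():
    return 'Hello'
```

With the test client, the cookie shows up on every response.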

Flask's teardown_request handler

This is similar to after_request, but teardown_request doesn't receive the request object. So that means it won't wait for the request, right?

This seems like the solution, as this answer to a similar Stack Overflow question suggests. And since Flask's documentation explains that teardown callbacks are independent of the actual request and do not receive the request context, you'd have good reason to believe this.

Unfortunately, teardown_request is still synchronous: it just happens at a later stage of Flask's request handling, when the response is no longer modifiable. Flask will still wait for teardown functions to complete before returning the response, as this list of Flask callbacks and errors shows.
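You can observe this synchronous behavior directly: a sleep inside a teardown handler delays the request by the same amount (a minimal sketch):

```python
import time

from flask import Flask

app = Flask(__name__)

@app.teardown_request
def slow_teardown(exc):
    # Flask still waits for this before the request machinery finishes.
    time.sleep(0.5)

@app.route('/')
def index():
    return 'done'
```

Timing a test-client request shows the extra half second.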

Flask's streaming responses

Flask can stream responses by passing a generator to Response(), as this Stack Overflow answer to a similar question suggests.

With streaming, the client does begin receiving the response before the request concludes. However, the request still runs synchronously, so the worker handling the request is busy until the stream is finished.

This Flask pattern for streaming includes some documentation on using stream_with_context(), which is necessary to include the request context.
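A minimal sketch of a streaming response that still reads from the request; note the worker stays busy until the generator is exhausted:

```python
from flask import Flask, Response, request, stream_with_context

app = Flask(__name__)

@app.route('/stream')
def stream():
    @stream_with_context
    def generate():
        # stream_with_context keeps `request` usable inside the generator.
        name = request.args.get('name', 'world')
        for i in range(3):
            yield f'chunk {i} for {name}\n'
    return Response(generate(), mimetype='text/plain')
```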

So what's the solution?

Flask doesn't offer a solution to run functions in the background because this isn't Flask's responsibility.

In most cases, the best way to solve this problem is to use a task queue such as RQ or Celery. These manage tricky things like configuration, scheduling, and distributing workers for you. This is the most common answer to this type of question because it is the most correct, and it forces you to set things up in a way that handles context and other details correctly.

If you need to run a function in the background and don't want to set up a queue to manage this, you can use Python's built-in threading or multiprocessing to spawn a background worker.

You can't access request or Flask's other thread locals from background tasks, since the request will not be active there. Instead, pass the data you need from the view to the background thread when you create it.

from threading import Thread

from flask import request

@app.route('/start_task')
def start_task():
    def do_work(value):
        # do something that takes a long time
        import time
        time.sleep(value)

    thread = Thread(target=do_work, kwargs={'value': request.args.get('value', 20, type=int)})
    thread.start()
    return 'started'
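If the background function needs the application context (for current_app, config, or extensions), pass the app object in and push the context yourself. A minimal sketch, separate from the view above (the GREETING config key and results list are illustrative):

```python
import threading

from flask import Flask, current_app

app = Flask(__name__)
app.config['GREETING'] = 'hello'

results = []

def do_work(flask_app, value):
    # Push an app context so current_app and config lookups work in this thread.
    with flask_app.app_context():
        results.append(f"{current_app.config['GREETING']} {value}")

thread = threading.Thread(target=do_work, args=(app, 'world'))
thread.start()
thread.join()  # joined only so the demo output is deterministic
```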

Is it possible to execute a function after return statement in flask?

flask-executor provides a lightweight way to do this.

In your case, it might look like this:

from flask import Flask, render_template, request, url_for, redirect, session
from flask_executor import Executor

app = Flask(__name__)
executor = Executor(app)

@app.route('/', methods=["POST", "GET"])
def index():
    executor.submit(my_function)
    return render_template('index.html')

my_function will be executed asynchronously, and index will continue to run immediately, letting you respond to the request by rendering the template without waiting for my_function to finish running.
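Under the hood, flask-executor wraps a concurrent.futures pool, so submit() hands back a future you can poll later if you need the result. The same pattern with the stdlib pool directly looks like this (a sketch; my_function is a stand-in):

```python
from concurrent.futures import ThreadPoolExecutor

def my_function():
    # stand-in for the real background work
    return 42

pool = ThreadPoolExecutor(max_workers=2)
future = pool.submit(my_function)
# the view would return here; the future can be checked later if needed
```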

If you need more guarantees, like scalability (tasks could execute on other computers) or reliability (tasks still execute if the webserver dies), you could look into something like Celery or RQ. Those tools are powerful, but generally require much more configuration than the flask-executor example above.

How to execute code after Flask `app.run()` statement (run a Flask app and a function in parallel, execute code while Flask server is running)

You can do what you want by using multithreading:

from flask import Flask
import threading
import time

app = Flask(__name__)

@app.route("/")
def hello_world():
    return "Hello, World!"

def run_app():
    app.run(debug=False, threaded=True)

def while_function():
    i = 0
    while i < 20:
        time.sleep(1)
        print(i)
        i += 1

if __name__ == "__main__":
    first_thread = threading.Thread(target=run_app)
    second_thread = threading.Thread(target=while_function)
    first_thread.start()
    second_thread.start()

Output:

 * Serving Flask app "app"
 * Environment: production
 * Debug mode: off
 * Running on [...] (Press CTRL+C to quit)
0
1
2
3
4
5
6
7
8
[...]

The idea is simple:

  • create 2 functions, one to run the app and another to execute the while loop,
  • and then execute each function in a separate thread, making them run in parallel

You can do this with multiprocessing instead of multithreading too:

The main difference here is that the functions will run on different CPU cores and in separate memory spaces.

from flask import Flask
from multiprocessing import Process
import time

# Helper function to easily parallelize multiple functions
def parallelize_functions(*functions):
    processes = []
    for function in functions:
        p = Process(target=function)
        p.start()
        processes.append(p)
    for p in processes:
        p.join()

# The function that will run in parallel with the Flask app
def while_function():
    i = 0
    while i < 20:
        time.sleep(1)
        print(i)
        i += 1

app = Flask(__name__)

@app.route("/")
def hello_world():
    return "Hello, World!"

def run_app():
    app.run(debug=False)

if __name__ == '__main__':
    parallelize_functions(while_function, run_app)

If you want to use before_first_request, as proposed by @Triet Doan, you will have to pass the while function as an argument to before_first_request like this:

from flask import Flask
import time

app = Flask(__name__)

def while_function(arg):
    i = 0
    while i < 5:
        time.sleep(1)
        print(i)
        i += 1

@app.before_first_request(while_function)
@app.route("/")
def index():
    print("index is running!")
    return "Hello world"

if __name__ == "__main__":
    app.run()

In this setup, the while function will be executed first, and only once it has finished will your app start serving requests, but I don't think that is what you were asking for.
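If the intent is to run the loop alongside the server rather than before it, the simplest variant is to start a daemon thread just before the blocking app.run() call. A sketch (here t.join() stands in for app.run() so the example terminates, and the loop is shortened):

```python
import threading
import time

results = []

def while_function():
    for i in range(3):
        results.append(i)
        time.sleep(0.1)

# Start the loop in a daemon thread; app.run() would block right after this.
t = threading.Thread(target=while_function, daemon=True)
t.start()
t.join()  # in a real app this line would be app.run()
```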

How to trigger a function after return statement in Flask

One solution would be to have a background thread that watches a queue. You put your csv data in the queue and the background thread consumes it. You can start such a thread before the first request:

import threading
import time
from queue import Queue  # queue.Queue supports task_done()/join(); multiprocessing.Queue does not

import pandas as pd

class CSVWriterThread(threading.Thread):
    def __init__(self, *args, **kwargs):
        threading.Thread.__init__(self, *args, **kwargs)
        self.input_queue = Queue()

    def send(self, item):
        self.input_queue.put(item)

    def close(self):
        self.input_queue.put(None)
        self.input_queue.join()

    def run(self):
        while True:
            csv_array = self.input_queue.get()
            if csv_array is None:
                break

            # Do something here ...
            df = pd.DataFrame({'x': csv_array})
            df.to_csv("docs/xyz.csv", index=False)

            self.input_queue.task_done()
            time.sleep(1)
        # Done: acknowledge the None sentinel so close() can return
        self.input_queue.task_done()
        return

@app.before_first_request
def activate_job_monitor():
    thread = CSVWriterThread()
    app.csvwriter = thread
    thread.start()

And in your code put the message in the queue before returning:

@app.route("/test", methods=['GET', 'POST'])
def check():
    arr = []
    arr.append(request.form['a'])
    arr.append(request.form['b'])
    res = {'Status': True}
    app.csvwriter.send(arr)
    return json.dumps(res)

Returning an immediate response in flask, but finishing the processing in a new thread

As Ardaglio said, the best way is multithreading.

I didn't use Celery, because I think it's pretty complicated and my problem is too simple for it.

So, I'm using Thread:

import time
from threading import Thread

from flask import jsonify

@app.route('/ocr/read_image', methods=['POST'])
def get_text():
    Thread(target=continue_processing).start()
    return jsonify('Success')

def continue_processing():
    time.sleep(10)
    print('Hi')

But you have to be careful. I'm using Keras with TensorFlow as the backend, and if you do the same, you will hit ValueError: Tensor Tensor() is not an element of this graph inside the thread.

To avoid it, save the graph right after the model is built:

GRAPH = tf.get_default_graph()

and then use it inside the asynchronous function like this:

with GRAPH.as_default():
    # do something with your model
    ...

Hope this helps someone.

Flask end response and continue processing

Sadly teardown callbacks do not execute after the response has been returned to the client:

import flask
import time

app = flask.Flask("after_response")

@app.teardown_request
def teardown(request):
    time.sleep(2)
    print("teardown_request")

@app.route("/")
def home():
    return "Success!\n"

if __name__ == "__main__":
    app.run()

When curling this you'll note a 2s delay before the response displays, rather than the curl ending immediately and a log line appearing 2s later. This is further confirmed by the logs:

teardown_request
127.0.0.1 - - [25/Jun/2018 15:41:51] "GET / HTTP/1.1" 200 -

The correct way to execute after a response is returned is to use WSGI middleware that adds a hook to the close method of the response iterator. This is not quite as simple as the teardown_request decorator, but it's still pretty straightforward:

import traceback

from werkzeug.wsgi import ClosingIterator

class AfterResponse:
    def __init__(self, app=None):
        self.callbacks = []
        if app:
            self.init_app(app)

    def __call__(self, callback):
        self.callbacks.append(callback)
        return callback

    def init_app(self, app):
        # install extension
        app.after_response = self

        # install middleware
        app.wsgi_app = AfterResponseMiddleware(app.wsgi_app, self)

    def flush(self):
        for fn in self.callbacks:
            try:
                fn()
            except Exception:
                traceback.print_exc()

class AfterResponseMiddleware:
    def __init__(self, application, after_response_ext):
        self.application = application
        self.after_response_ext = after_response_ext

    def __call__(self, environ, start_response):
        iterator = self.application(environ, start_response)
        try:
            return ClosingIterator(iterator, [self.after_response_ext.flush])
        except Exception:
            traceback.print_exc()
            return iterator

Which you can then use like this (after initializing the extension with AfterResponse(app)):

AfterResponse(app)

@app.after_response
def after():
    time.sleep(2)
    print("after_response")

From the shell you will see the response return immediately and then 2 seconds later the after_response will hit the logs:

127.0.0.1 - - [25/Jun/2018 15:41:51] "GET / HTTP/1.1" 200 -
after_response

This is a summary of a previous answer provided here.

Is there a way to run a function in the background of a python flask app?

Thank you for your suggestions, as they definitely helped me research the topic, but I finally figured out how to get threading working.

Turns out my issue was returning the value to a list for the main program to use.

Providing the list notes_pressed as an argument to get_next_note() ensures that the list is updated by the time return_pressed_notes() is called.

Here is the code I used to solve my issue:

import json
from threading import Thread

import mido
from flask import Flask

app = Flask(__name__)

# List to store pressed keys
notes_pressed = []

@app.route('/get_notes', methods=['GET'])
def return_pressed_notes():
    return json.dumps(notes_pressed)

# Function to translate midi key numbers to note letters
def translate_key(key_num):
    ...

# Function that returns recently played note
def get_next_note(notes_pressed):
    # Open port to listen for note presses
    with mido.open_input() as inport:
        # Retrieve key presses from port
        for msg in inport:
            # If key press is valid
            # - note is pressed
            # - velocity != 0 (prevents ghost notes)
            if msg.type == 'note_on' and msg.velocity != 0 and msg.channel == 0:
                # Add new note to list (lowest_key is defined elsewhere)
                notes_pressed.append(translate_key(msg.note - lowest_key))

# Run main program
if __name__ == '__main__':
    # NEW CODE
    p = Thread(target=get_next_note, args=(notes_pressed,))
    p.start()
    app.run(debug=True, use_reloader=False)
    p.join()

How to execute function after returning Flask response (hosted on Heroku)?

So what I've figured out is that it's extremely easy to do, and even easier on Heroku. The problem is that the documentation is quite scattered, and for someone who's just discovering job queues it can be overwhelming.

For this example I'm going to use the Redis To Go addon on Heroku, so the first thing you have to do is install it from your dashboard. After that, set up your Flask app to look something like this:

from flask import Flask
from rq import Queue
from redis import Redis
import os
import urllib.parse as urlparse

app = Flask(__name__)

def function_to_queue():
    return "finished"

# Tell RQ what Redis connection to use; parse the URL from the
# environment variable that was added by the addon
redis_url = os.getenv('REDISTOGO_URL')
urlparse.uses_netloc.append('redis')
url = urlparse.urlparse(redis_url)
conn = Redis(host=url.hostname, port=url.port, db=0, password=url.password)
q = Queue(connection=conn)  # no args implies the default queue

@app.route('/')
def hello():
    ob = q.enqueue(function_to_queue)  # Add previously defined function to the queue
    return "k?"

if __name__ == '__main__':
    app.run()

Next you have to create a Python script called run-worker.py with the code below:

import os
import urllib.parse as urlparse
from redis import Redis
from rq import Worker, Queue, Connection

listen = ['high', 'default', 'low']

redis_url = os.getenv('REDISTOGO_URL')
if not redis_url:
    raise RuntimeError('Set up Redis To Go first.')

urlparse.uses_netloc.append('redis')
url = urlparse.urlparse(redis_url)
conn = Redis(host=url.hostname, port=url.port, db=0, password=url.password)

with Connection(conn):
    worker = Worker(map(Queue, listen))
    worker.work()

Now just modify your Procfile on Heroku to look like this:

web: gunicorn hello:app --log-file -
worker: python -u run-worker.py

Deploy this, make sure you have both the worker and the app started, and you're done. Hope this helps others understand job queueing faster.
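Both process types from the Procfile have to be running; if the worker dyno isn't scaled up, jobs will just sit in the queue. With the Heroku CLI (assuming it is installed and you are logged in):

```shell
# scale one web dyno and one worker dyno
heroku ps:scale web=1 worker=1
# verify both process types are up
heroku ps
```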

How can I run a function after each request to a static resource in Flask?

Solved it by creating a blueprint and letting it do all the lifting for static files. I'll suggest to Flask and Quart that they add an official version of this feature. If you're using Flask rather than Quart, change all the async defs to plain defs.

static_bp.py:

from quart import Blueprint, request
import threading
import time
import os

static = Blueprint('static', __name__, static_url_path="/", static_folder="static")

@static.after_request
async def after_request_func(response):
    if response.status_code == 200:
        file_path = request.base_url.replace("http://ip:port/", "")
        t = threading.Thread(target=delete_after_request_thread, args=[file_path])
        t.daemon = False  # setDaemon() is deprecated
        t.start()
    return response

def delete_after_request_thread(file_path):
    time.sleep(2000)
    os.remove(file_path)

main.py (replace Quart with Flask if you are running Flask):

from quart import Quart

from static_bp import static

app = Quart(__name__, "/static", static_folder=None)
app.register_blueprint(static, url_prefix='/static')

