Handle Flask Requests Concurrently with Threaded=True

As of Flask 1.0, the WSGI server included with Flask is run in threaded mode by default.

Prior to 1.0, or if you disable threading, the server runs in single-threaded mode and can only handle one request at a time. Any parallel requests have to wait until they can be handled, which can lead to deadlocks if you try to contact your own server from within a request handler.
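For example, here is a minimal sketch of that self-request problem (the route names are hypothetical, and the URL assumes the development server's default port of 5000):

# self_request.py -- a hypothetical sketch of the single-threaded deadlock
from urllib.request import urlopen
from flask import Flask

app = Flask(__name__)

@app.route("/data")
def data():
    return "some data"

@app.route("/proxy")
def proxy():
    # In single-threaded mode this call never returns: the one and only
    # worker is busy serving /proxy, so it cannot also serve /data.
    with urlopen("http://127.0.0.1:5000/data") as resp:
        return resp.read()

if __name__ == "__main__":
    app.run(threaded=False)  # hangs on /proxy; threaded=True avoids it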

With threaded=True, each request is handled in a new thread. How many requests your server can handle concurrently depends entirely on your OS and the limits it sets on the number of threads per process. The implementation uses the socketserver.ThreadingMixIn class (SocketServer in Python 2), which sets no limit on the number of threads it can spin up.
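By contrast, with the threaded server a slow request no longer blocks the others. A minimal sketch (route names and sleep time are purely illustrative):

import time
from flask import Flask

app = Flask(__name__)

@app.route("/slow")
def slow():
    time.sleep(5)       # simulate a long-running request
    return "done at last"

@app.route("/fast")
def fast():
    return "instant"

if __name__ == "__main__":
    # With threaded=True (the default since Flask 1.0), /fast responds
    # immediately even while another client is still waiting on /slow.
    app.run(threaded=True)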

Note that the Flask server is designed for development only. It is not a production-ready server. Don't rely on it to run your site on the wider web. Use a proper WSGI server (like gunicorn or uWSGI) instead.

How many concurrent requests does a single Flask process receive?

When running the development server (which is what you get by running app.run()), you get a single synchronous process, which means at most one request is processed at a time.

By sticking Gunicorn in front of it in its default configuration and simply increasing the number of --workers, what you get is essentially a number of processes (managed by Gunicorn) that each behave like the app.run() development server. 4 workers == 4 concurrent requests. This is because Gunicorn uses its included sync worker type by default.
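As a rough sketch of that default setup (myapp:app and the bind address are placeholders), a Gunicorn configuration file is itself plain Python, so four sync workers serving four concurrent requests can be written as:

# gunicorn.conf.py (hypothetical) -- start with: gunicorn -c gunicorn.conf.py myapp:app
workers = 4             # four independent worker processes
worker_class = "sync"   # the default: each worker handles one request at a time
bind = "127.0.0.1:8000"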

It is important to note that Gunicorn also includes asynchronous workers, namely eventlet and gevent (and also tornado, but that's best used with the Tornado framework, it seems). By specifying one of these async workers with the --worker-class flag, what you get is Gunicorn managing a number of async processes, each of which manages its own concurrency. These processes don't use threads, but coroutines. Basically, within each process only one thing can still happen at a time (one thread), but tasks can be 'paused' while they wait on external operations to finish (think database queries or network I/O).

This means that if you're using one of Gunicorn's async workers, each worker can handle many more than a single request at a time. Just how many workers is best depends on the nature of your app, its environment, the hardware it runs on, and so on. More details can be found on Gunicorn's design page and in the notes on how gevent works on its intro page.
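As another hedged sketch (same placeholder names; it assumes the gevent package is installed), switching the worker class in that kind of config file is enough to let each worker multiplex many requests with coroutines:

# gunicorn.conf.py (hypothetical) -- requires the gevent package
workers = 2                # still a handful of processes...
worker_class = "gevent"    # ...but each handles requests cooperatively with coroutines
worker_connections = 1000  # rough cap on simultaneous clients per worker
bind = "127.0.0.1:8000"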

How does Flask handle simultaneous requests?

Flask doesn't. Parallel request handling is the task of the underlying WSGI web server, which sends the requests to Flask for handling.

Flask's built-in development server, which is invoked with Flask.run(), runs with threads by default (as of Flask 1.0).

In production, you'd use one of the WSGI containers or other deployment options, and you control parallelism there. Gunicorn, for example, has the -w command line argument which controls the number of worker processes, and -k which controls how these workers work (processes, threads, or a Tornado event machine, among others).

Python Flask threaded=True not working

From the Flask documentation on Deployment Options:

While lightweight and easy to use, Flask’s built-in server is not suitable for production as it doesn’t scale well and by default serves only one request at a time. Some of the options available for properly running Flask in production are documented here.

That's why your second request isn't handled until the first is complete: the Flask server on its own can only handle one request at a time. To solve this, you will need to run Flask behind some kind of deployment server, for example Gunicorn or uWSGI, which seem to be the most popular.

You might also find the answers to related questions about Flask deployment helpful.
The Deployment Options page also has a lot of links to guides and information about different ways of solving your issue.

Can I serve multiple clients using just Flask app.run() as standalone?

flask.Flask.run accepts additional keyword arguments (**options) that it forwards to werkzeug.serving.run_simple - two of those arguments are threaded (a boolean) and processes (which you can set to a number greater than one to have Werkzeug spawn more than one process to handle requests).

threaded defaults to True as of Flask 1.0, so for the latest versions of Flask, the default development server will be able to serve multiple clients simultaneously by default. For older versions of Flask, you can explicitly pass threaded=True to enable this behaviour.

For example, you can do

if __name__ == '__main__':
    app.run(threaded=True)

to handle multiple clients using threads in a way compatible with old Flask versions, or

if __name__ == '__main__':
    app.run(threaded=False, processes=3)

to tell Werkzeug to spawn three processes to handle incoming requests, or just

if __name__ == '__main__':
    app.run()

to handle multiple clients using threads if you know that you will be using Flask 1.0 or later.

That being said, Werkzeug's serving.run_simple is a simple development server built on the standard library's http.server module, not a production-ready web server. If you are going to use Flask in production (assuming that "production" is not a low-traffic internal application with no more than 10 concurrent users), make sure to stand it up behind a real web server (see the section of Flask's docs entitled Deployment Options for some suggested methods).

Handling multiple requests in Flask

Yes, deploy your application on a different WSGI server, see the Flask deployment options documentation.

The server component that comes with Flask is really only meant for use while you are developing your application, even though it can be configured to handle concurrent requests with app.run(threaded=True) (as of Flask 1.0 this is the default). The document above lists several options for servers that can handle concurrent requests and are far more robust and tunable.


