Deploying Flask, parallel requests

When I test my new Flask application with the built-in development server, everything is single-threaded and blocking: the server cannot start serving one request before it has finished another, so it only processes one request at a time.
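For quick local testing, the development server can at least be told to handle each request in its own thread. A minimal sketch (the route and handler are illustrative, and this is still not a production setup):

```python
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    # Placeholder handler; imagine slow I/O happening here.
    return "hello"

if __name__ == "__main__":
    # threaded=True makes the development server spawn a thread
    # per request instead of serializing them.
    app.run(threaded=True)
```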

When deploying a web service, this is obviously not desirable. How do you deploy Flask applications so that requests can be handled in parallel?

Are there different things to consider regarding thread safety and concurrency inside the code (protecting shared objects with locks, and so on), or are all the deployment options equivalent?
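Whichever server you choose, any state shared between concurrently handled requests (module-level caches, counters, connection objects) does need protection under a threaded deployment. A stdlib-only sketch of guarding a shared counter with a lock:

```python
import threading

counter = 0
lock = threading.Lock()

def handle_request():
    # Simulates a request handler that mutates shared state.
    global counter
    with lock:
        # Without the lock, this read-modify-write could
        # interleave across threads and lose increments.
        counter += 1

threads = [threading.Thread(target=handle_request) for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```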

Best answer

I use uWSGI with the gevent loop. That is the ticket. In fact, this is how I use py-redis, whose calls are blocking, without it actually blocking other requests.
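A minimal uWSGI configuration for the gevent loop might look like the following; the module name, port, and greenlet count are assumptions for illustration, not from the answer:

```ini
; uwsgi.ini -- hypothetical example
[uwsgi]
module = myapp:app         ; assumes the Flask app object is "app" in myapp.py
http-socket = :8080
gevent = 100               ; number of concurrent greenlets per worker
gevent-monkey-patch = true ; patch the stdlib so blocking libraries
                           ; such as py-redis cooperate with gevent
```

The monkey patching is what makes a blocking client like py-redis yield to other greenlets while it waits on the socket.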

Also, I use uWSGI's ability to do work after the response has been sent, while still accepting more requests.
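The answer doesn't show how; one portable way to run code after the response body has been delivered is Werkzeug's `call_on_close` hook (the logging list here is made up for illustration). Under a gevent-based worker, other greenlets keep serving requests while the callback runs:

```python
from flask import Flask

app = Flask(__name__)
audit_log = []

def record_request():
    # Runs once the response has been sent and closed;
    # a stand-in for slow post-response work such as logging.
    audit_log.append("request finished")

@app.route("/")
def index():
    resp = app.make_response("done")
    # call_on_close registers a callback that the WSGI server
    # invokes when it closes the response iterator.
    resp.call_on_close(record_request)
    return resp
```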