"Fire and Forget" Python Async/Await


Update:

Replace asyncio.ensure_future with asyncio.create_task everywhere if you're using Python >= 3.7; it's a newer, nicer way to spawn tasks.
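For example, here is the fire-and-forget pattern in the 3.7+ style; a minimal sketch (note that asyncio.create_task, unlike asyncio.ensure_future, must be called from inside a running event loop):

import asyncio

async def async_foo():
    print("async_foo started")
    await asyncio.sleep(1)
    print("async_foo done")

async def main():
    # create_task schedules async_foo() and returns immediately,
    # just like ensure_future did
    asyncio.create_task(async_foo())
    await asyncio.sleep(2)  # give the background task time to finish

asyncio.run(main())  # asyncio.run is also new in 3.7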



asyncio.Task to "fire and forget"

According to the Python docs for asyncio.Task, it is possible to start a coroutine executing "in the background". A task created by asyncio.ensure_future won't block the execution (so the function returns immediately!). This looks like a way to "fire and forget" as you requested.

import asyncio

async def async_foo():
    print("async_foo started")
    await asyncio.sleep(1)
    print("async_foo done")

async def main():
    asyncio.ensure_future(async_foo())  # fire and forget async_foo()

    # btw, you can also create tasks inside non-async funcs

    print('Do some actions 1')
    await asyncio.sleep(1)
    print('Do some actions 2')
    await asyncio.sleep(1)
    print('Do some actions 3')

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())

Output:

Do some actions 1
async_foo started
Do some actions 2
async_foo done
Do some actions 3

What if tasks are still executing when the event loop completes?

Note that asyncio expects all tasks to be completed by the time the event loop completes. So if you change main() to:

async def main():
    asyncio.ensure_future(async_foo())  # fire and forget

    print('Do some actions 1')
    await asyncio.sleep(0.1)
    print('Do some actions 2')

You'll get this warning after the program finishes:

Task was destroyed but it is pending!
task: <Task pending coro=<async_foo() running at [...]

To prevent that, you can just await all pending tasks after the event loop has completed:

async def main():
    asyncio.ensure_future(async_foo())  # fire and forget

    print('Do some actions 1')
    await asyncio.sleep(0.1)
    print('Do some actions 2')

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())

    # Let's also finish all running tasks:
    pending = asyncio.Task.all_tasks()
    loop.run_until_complete(asyncio.gather(*pending))
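Note that asyncio.Task.all_tasks() was deprecated in Python 3.7 and removed in 3.9. A minimal sketch of the same idea in the 3.7+ style, using the module-level asyncio.all_tasks(), which must be called while the loop is running:

async def main():
    asyncio.create_task(async_foo())  # fire and forget

    print('Do some actions 1')
    await asyncio.sleep(0.1)
    print('Do some actions 2')

    # await every still-pending task except this one before returning
    pending = [t for t in asyncio.all_tasks() if t is not asyncio.current_task()]
    await asyncio.gather(*pending)

asyncio.run(main())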

Kill tasks instead of awaiting them

Sometimes you don't want to wait for tasks to finish (for example, some tasks may be created to run forever). In that case, you can just cancel() them instead of awaiting them:

import asyncio
from contextlib import suppress

async def echo_forever():
    while True:
        print("echo")
        await asyncio.sleep(1)

async def main():
    asyncio.ensure_future(echo_forever())  # fire and forget

    print('Do some actions 1')
    await asyncio.sleep(1)
    print('Do some actions 2')
    await asyncio.sleep(1)
    print('Do some actions 3')

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())

    # Let's also cancel all running tasks:
    pending = asyncio.Task.all_tasks()
    for task in pending:
        task.cancel()
        # Now we should await the task so its cancellation is executed.
        # A cancelled task raises asyncio.CancelledError, which we can suppress:
        with suppress(asyncio.CancelledError):
            loop.run_until_complete(task)

Output:

Do some actions 1
echo
Do some actions 2
echo
Do some actions 3
echo
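On Python 3.7+ the same cleanup can be done inside the coroutine itself, which avoids the removed asyncio.Task.all_tasks(). A minimal sketch, reusing echo_forever() from above:

import asyncio
from contextlib import suppress

async def main():
    task = asyncio.create_task(echo_forever())  # fire and forget

    print('Do some actions 1')
    await asyncio.sleep(1)
    print('Do some actions 2')

    # cancel the task and await it so the cancellation is actually processed
    task.cancel()
    with suppress(asyncio.CancelledError):
        await task

asyncio.run(main())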

Python fire and forget async function in background

I solved my issue by combining multithreading and async tasks, as stated by @norbeq here.

How can I fire and forget a task without blocking main thread?

Your questions are quite abstract, so I'll try to give common answers to all of them.

How can I "fire and forget" a task without blocking main thread?

It depends on what you mean by "forget".

  • If you are not planning to access that task after running, you can run it in a parallel process.
  • If the main application should be able to access a background task, then you should have an event-driven architecture. In that case, the things previously called tasks will be services or microservices.

I don't want to use any task queues (celery, rabbitmq, etc.) here because the tasks I'm thinking of are too small and fast to run. Just want to get them done as out of the way as possible. Would that be an async approach? Throwing them onto another process?

If the task contains loops or other CPU-bound operations, then it is right to use a subprocess. If the task makes requests (async), reads files, logs to stdout, or performs other I/O-bound operations, then it is right to use coroutines or threads.

Does it make sense to have a separate thread that handles background jobs? Like a simple job queue but very lightweight and does not require additional infrastructure?

We can't just use a thread, as it can be blocked by another task running CPU-bound operations. Instead, we can run a background process and use pipes, queues, and events to communicate between processes. Unfortunately, we cannot pass complex objects between processes, but we can pass basic data structures to handle status changes of the tasks running in the background.

Regarding Starlette and BackgroundTask

Starlette is a lightweight ASGI framework/toolkit, which is ideal for building async web services in Python. (README description)

It is based on concurrency, so even this is not a generic solution for all kinds of tasks.
NOTE: Concurrency differs from parallelism.

I'm wondering if we can build something more generic where you can run background tasks in scripts or webservers alike, without sacrificing performance.

The above-mentioned solution suggests using a background process. Still, it will depend on the application design, as you must do the things (emit an event, add an indicator to the queue, etc.) that are needed for communication and synchronization between running processes (tasks). There is no generic tool for that, but there are situation-dependent solutions.

Situation 1 - The tasks are asynchronous functions

Suppose we have a request function that should call an API without blocking the work of other tasks. Also, we have a sleep function that should not block anything.

import asyncio
import aiohttp

async def request(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            try:
                return await response.json()
            except aiohttp.ContentTypeError:
                return await response.read()

async def sleep(t):
    await asyncio.sleep(t)

async def main():
    background_task_1 = asyncio.create_task(request("https://google.com/"))
    background_task_2 = asyncio.create_task(sleep(5))

    ...  # here we can do even CPU-bound operations

    result1 = await background_task_1

    ...  # use the 'result1', etc.

    await background_task_2

if __name__ == "__main__":
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())
    loop.close()

In this situation, we use asyncio.create_task to run a coroutine concurrently (like in the background). Sure, we could run it in a subprocess, but there is no reason for that, as it would use more resources without improving performance.

Situation 2 - The tasks are synchronous functions (I/O bound)

Unlike the first situation, where the functions were already asynchronous, in this situation they are synchronous but not CPU-bound (they are I/O-bound). This makes it possible to run them in threads, or to make them asynchronous (using asyncio.to_thread) and run them concurrently.

import time
import asyncio
import requests

def asynchronous(func):
    """
    This decorator converts a synchronous function to an asynchronous one.

    Usage:
        @asynchronous
        def sleep(t):
            time.sleep(t)

        async def main():
            await sleep(5)
    """

    async def wrapper(*args, **kwargs):
        # run the synchronous function in a thread and return its result
        # (asyncio.to_thread requires Python 3.9+)
        return await asyncio.to_thread(func, *args, **kwargs)

    return wrapper

@asynchronous
def request(url):
    with requests.Session() as session:
        response = session.get(url)
        try:
            return response.json()
        except requests.JSONDecodeError:
            return response.text

@asynchronous
def sleep(t):
    time.sleep(t)


async def main():
    background_task_1 = asyncio.create_task(request("https://google.com/"))
    background_task_2 = asyncio.create_task(sleep(5))
    ...

Here we used a decorator to convert synchronous (I/O-bound) functions to asynchronous ones and used them like in the first situation.

Situation 3 - The tasks are synchronous functions (CPU-bound)

To run CPU-bound tasks in parallel in the background, we have to use multiprocessing. And to ensure the task is done, we use the join method.

import time
import multiprocessing

def task():
    for i in range(10):
        time.sleep(0.3)

def main():
    background_task = multiprocessing.Process(target=task)
    background_task.start()

    ...  # do the rest of the stuff that does not depend on the background task

    background_task.join()  # wait until the background task is done

    ...  # do stuff that depends on the background task

if __name__ == "__main__":
    main()

Suppose the main application depends on parts of the background task. In this case, we need an event-driven design, as join only tells us when the whole process is done and cannot report intermediate progress.

import multiprocessing

def task(first_part_done, second_part_done):
    ...  # synchronous operations

    first_part_done.set()  # notify the main function that the first part of the task is done

    ...  # synchronous operations

    second_part_done.set()  # notify the main function that the second part of the task is also done

    ...  # synchronous operations

def main():
    # one event per milestone: an event stays set once set, so reusing a
    # single event would make the second wait() return immediately
    first_part_done = multiprocessing.Event()
    second_part_done = multiprocessing.Event()
    background_task = multiprocessing.Process(
        target=task, args=(first_part_done, second_part_done)
    )
    background_task.start()

    ...  # do the rest of the stuff that does not depend on the background task

    first_part_done.wait()  # wait until the first part of the background task is done

    ...  # do stuff that depends on the first part of the background task

    second_part_done.wait()  # wait until the second part of the background task is done

    ...  # do stuff that depends on the second part of the background task

    background_task.join()  # wait until the background task is finally done

    ...  # do stuff that depends on the whole background task

if __name__ == "__main__":
    main()

As you may have noticed, events can only convey binary information, and they are not effective when there are more than two processes (it would be impossible to know which process emitted the event). So we use pipes, queues, and managers to pass non-binary information between processes.
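For instance, here is a minimal sketch of passing status messages through a multiprocessing.Queue (the string payloads are made up for illustration):

import time
import multiprocessing

def task(queue):
    queue.put("part 1 done")  # arbitrary payloads, not just a binary flag
    time.sleep(0.3)
    queue.put("part 2 done")

def main():
    queue = multiprocessing.Queue()
    background_task = multiprocessing.Process(target=task, args=(queue,))
    background_task.start()

    print(queue.get())  # blocks until "part 1 done" arrives
    print(queue.get())  # blocks until "part 2 done" arrives

    background_task.join()

if __name__ == "__main__":
    main()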

Asynchronous Python - fire and forget HTTP request

I have answered a rather similar question.

async def main():
    asyncio.ensure_future(fire())

ensure_future schedules the coroutine for execution but does not wait for it to complete, and run_until_complete does not wait for the completion of all futures.

This should fix it:

async def main():
    await fire()
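If you do want fire() to keep running in the background instead of being awaited inline, another option (a sketch, assuming fire() is the coroutine from that question) is to keep a reference to the task and await it before main() returns:

async def main():
    task = asyncio.create_task(fire())  # fire and keep running...

    ...  # do other work while fire() runs in the background

    await task  # ...but don't let the loop close before it completes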

Fire, Forget, and Return Value in Python 3.7

I managed to get it working using threading instead of asyncio:

import threading
import time

def Slowpoke():
    print("I see you shiver with antici...")
    time.sleep(3)
    print("...pation")

def Rocky():
    # daemon=True lets the program exit without waiting for the thread
    t = threading.Thread(name="thread", target=Slowpoke, daemon=True)
    t.start()
    time.sleep(1)
    return "HI!"

if __name__ == "__main__":
    print(Rocky())
    while True:
        time.sleep(1)
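Since the question title also asks for a return value, here is a minimal sketch using concurrent.futures.ThreadPoolExecutor, whose submit() returns a Future that can be queried later (the "anticipation" return value is made up for illustration):

import time
from concurrent.futures import ThreadPoolExecutor

def Slowpoke():
    print("I see you shiver with antici...")
    time.sleep(3)
    print("...pation")
    return "anticipation"

def Rocky():
    executor = ThreadPoolExecutor(max_workers=1)
    future = executor.submit(Slowpoke)  # fire...
    time.sleep(1)
    return "HI!", future  # ...and collect the result later

if __name__ == "__main__":
    greeting, future = Rocky()
    print(greeting)
    print(future.result())  # blocks until Slowpoke() returns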

Asynchronous Python server: fire and forget at startup

It is possible; see these docs. In summary:

@app.before_serving
async def startup():
    asyncio.ensure_future(master.resume())

I'd hold on to the task though, so that you can cancel it at shutdown,

@app.before_serving
async def startup():
    app.background_task = asyncio.ensure_future(master.resume())

@app.after_serving
async def shutdown():
    app.background_task.cancel()  # or something similar
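Since cancel() only requests cancellation, you may also want to await the task at shutdown so the CancelledError is actually delivered; a minimal sketch reusing the suppress pattern from earlier:

from contextlib import suppress

@app.after_serving
async def shutdown():
    app.background_task.cancel()
    with suppress(asyncio.CancelledError):
        await app.background_task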

