Job queues in Python with asyncio and Redis


Keywords: aiorq, asyncio, python3, queue, redis, rq
License: MIT
Install: pip install aiorq==1.1.9

Aiorq

Introduction

Aiorq is a distributed task queue built on asyncio and Redis. It is a rewrite of arq with improvements and an added web interface.

See the documentation for more details.

Requirements

  • redis >= 5.0
  • aioredis >= 2.0.0

Install

pip install aiorq

Quick Start

Task Definition

# tasks.py
# -*- coding: utf-8 -*-

import asyncio
import os

from aiorq.connections import RedisSettings
from aiorq.cron import cron


async def say_hello(ctx, name) -> None:
    # Tasks receive the worker context dict as their first argument.
    await asyncio.sleep(5)
    print(f"Hello {name}")


async def startup(ctx):
    print("starting... done")


async def shutdown(ctx):
    print("ending... done")


async def run_cron(ctx, name_):
    return f"hello {name_}"


class WorkerSettings:
    redis_settings = RedisSettings(
        host=os.getenv("REDIS_HOST", "127.0.0.1"),
        port=int(os.getenv("REDIS_PORT", 6379)),
        database=int(os.getenv("REDIS_DATABASE", 0)),
        password=os.getenv("REDIS_PASSWORD", None)
    )

    functions = [say_hello]

    on_startup = startup

    on_shutdown = shutdown

    cron_jobs = [
        # Runs run_cron at 12:40:50, 17:40:50 and 18:40:50 every day.
        cron(coroutine=run_cron, kwargs={"name_": "pai"}, hour={17, 12, 18}, minute=40, second=50, keep_result_forever=True)
    ]

    # allow_abort_jobs = True
    # worker_name = "ohuo"
    # queue_name = "ohuo"

Run the aiorq worker

> aiorq tasks.WorkerSettings
15:08:50: Starting Queue: ohuo
15:08:50: Starting Worker: ohuo@04dce85c-1798-43eb-89d8-7c6d78919feb
15:08:50: Starting Functions: say_hello, EnHeng
15:08:50: redis_version=5.0.10 mem_usage=731.12K clients_connected=2 db_keys=9
starting...
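
Enqueue a job

With the worker running, jobs can be pushed onto the queue from any other process. The following is a minimal sketch, assuming aiorq keeps arq's create_pool / enqueue_job client API; the script name and argument values are illustrative.

# enqueue_demo.py

import asyncio
import os

from aiorq.connections import RedisSettings, create_pool


async def main():
    # Connect to the same Redis instance the worker is listening on.
    redis = await create_pool(
        RedisSettings(host=os.getenv("REDIS_HOST", "127.0.0.1"))
    )
    # Push a job onto the queue; the worker picks it up and runs say_hello.
    job = await redis.enqueue_job("say_hello", name="aiorq")
    # Optionally block until the worker finishes and fetch the return value.
    print(await job.result(timeout=30))


if __name__ == "__main__":
    asyncio.run(main())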

Integration with FastAPI
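
Below is a minimal sketch of wiring aiorq into a FastAPI application, again assuming the arq-style create_pool / enqueue_job API: the Redis pool is opened on startup, shared via app.state, and closed on shutdown. The module and route names are illustrative.

# app.py

import os

from aiorq.connections import RedisSettings, create_pool
from fastapi import FastAPI

app = FastAPI()


@app.on_event("startup")
async def startup() -> None:
    # One shared Redis pool for the whole application.
    app.state.redis = await create_pool(
        RedisSettings(host=os.getenv("REDIS_HOST", "127.0.0.1"))
    )


@app.on_event("shutdown")
async def shutdown() -> None:
    await app.state.redis.close()


@app.post("/hello/{name}")
async def hello(name: str):
    # Hand the work to the aiorq worker instead of blocking the request.
    job = await app.state.redis.enqueue_job("say_hello", name=name)
    return {"job_id": job.job_id}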

Dashboard

See the aiorq dashboard for more details.

Thanks

License

MIT