ThreadedExecutor

class chancy.executors.thread.ThreadedExecutor(worker, queue)[source]

Bases: ConcurrentExecutor

An Executor which uses a thread pool to run its jobs.

This executor is useful for running I/O-bound jobs concurrently without the overhead of separate processes. It's not suitable for CPU-bound tasks due to Python's Global Interpreter Lock (GIL).
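
For example, a job that spends most of its time waiting on blocking network or disk I/O is a good fit, because the GIL is released while a thread waits. A minimal sketch of such a job (the function name and URL are illustrative; declaring and pushing it goes through Chancy's usual job APIs, which are not shown here):

import urllib.request


def fetch_status(url: str = "https://example.com") -> int:
    # Blocking network I/O releases the GIL while waiting, so many of these
    # jobs can make progress concurrently inside a thread pool.
    with urllib.request.urlopen(url, timeout=10) as response:
        return response.status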

When working with existing asyncio code, it’s often easier and more efficient to use the AsyncExecutor instead, as it can run a very large number of jobs concurrently.

To use this executor, simply pass the import path to this class in the executor field of your queue configuration or use the Executor shortcut:

from chancy import Chancy, Queue

async with Chancy("postgresql://localhost/postgres") as chancy:
    await chancy.declare(
        Queue(
            name="default",
            executor=Chancy.Executor.Threaded,
        )
    )
Parameters:
  • worker – The worker instance associated with this executor.

  • queue – The queue that this executor is associated with.

get_default_concurrency() → int[source]

Get the default concurrency level for this executor.

This method is called when the queue’s concurrency level is set to None. It should return the number of jobs that can be processed concurrently by this executor.

On Python 3.13+, defaults to the number of logical CPUs on the system plus 4. On older versions of Python, defaults to the number of CPUs on the system plus 4. This mimics the behavior of Python’s built-in ThreadPoolExecutor.
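
A rough sketch of the documented default (illustrative only, not Chancy's actual implementation; os.process_cpu_count() is only available on Python 3.13+):

import os
import sys


def default_concurrency_sketch() -> int:
    # Illustrative only: mirrors the documented behaviour above.
    if sys.version_info >= (3, 13):
        return (os.process_cpu_count() or 1) + 4
    return (os.cpu_count() or 1) + 4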

job_wrapper(job: QueuedJob) → tuple[QueuedJob, Any][source]

This is the function that is actually executed by the thread pool. It is responsible for setting up any necessary limits, running the job, and returning the result.
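
The general shape of this pattern, as a hypothetical sketch rather than Chancy's actual implementation (the real wrapper also applies any limits configured on the job):

from concurrent.futures import ThreadPoolExecutor


def wrapper(job, func, *args, **kwargs):
    # Runs in a pool thread; returning the job alongside the result lets the
    # worker match the outcome back to the job that produced it.
    result = func(*args, **kwargs)
    return job, result


with ThreadPoolExecutor(max_workers=4) as pool:
    future = pool.submit(wrapper, {"id": 123}, pow, 2, 10)
    print(future.result())  # ({'id': 123}, 1024)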

async push(job: QueuedJob) → Future[source]

Push a job onto the job pool.
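
Illustrative only: the snippet below shows the general shape of submitting work to a thread pool and awaiting its result from asyncio; Chancy's actual Future type and bookkeeping may differ.

import asyncio
from concurrent.futures import ThreadPoolExecutor


async def main() -> None:
    with ThreadPoolExecutor(max_workers=2) as pool:
        # wrap_future() bridges the pool's concurrent.futures.Future into an
        # awaitable asyncio future.
        future = asyncio.wrap_future(pool.submit(sum, [1, 2, 3]))
        print(await future)  # 6


asyncio.run(main())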

async stop()[source]

Stop the executor, giving it a chance to clean up any resources it may have allocated to running jobs.

It is not safe to use the executor after this method has been called.
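
The underlying cleanup for a thread pool looks roughly like the following (illustrative only; Chancy's stop() performs its own bookkeeping on top of this):

from concurrent.futures import ThreadPoolExecutor

pool = ThreadPoolExecutor(max_workers=2)
pool.submit(print, "in-flight job is allowed to finish")
# Wait for running jobs and discard anything still queued; the pool cannot
# be reused after shutdown.
pool.shutdown(wait=True, cancel_futures=True)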