
Make taskworker worker support concurrent work #80369

@markstory

Description


The current worker implementation is single-threaded and will have limited throughput. Expand the worker to support concurrent processing based on CLI flags: --concurrency=12 should create 12 worker threads/processes, allowing a worker to process up to 12 tasks concurrently.
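A minimal sketch of how the flag could be wired up, assuming argparse (the actual worker CLI may use a different option parser, and the default value here is a placeholder):

```python
import argparse

parser = argparse.ArgumentParser(description="taskworker")
parser.add_argument(
    "--concurrency",
    type=int,
    default=1,  # hypothetical default; single-threaded matches current behaviour
    help="Number of worker threads/processes to run concurrently.",
)

# Simulate invoking the worker with --concurrency=12
args = parser.parse_args(["--concurrency", "12"])
print(args.concurrency)
```

The parsed value would then size the worker pool at startup.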

Rough thinking would be to have a single thread responsible for fetching tasks and sending RPC responses back, plus a pool of worker threads that execute tasks. Processing timeouts could be tricky to enforce, though, as Python threads can't be interrupted from the outside. Perhaps the celery worker has some tricks we could replicate.
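The fetch-thread-plus-pool shape described above can be sketched roughly like this. Everything here is illustrative: the task source is stubbed with integers, `CONCURRENCY` stands in for the parsed CLI flag, and the "RPC response" is just an appended result. Note that this sketch does not solve the timeout problem — a thread stuck in a task can't be killed, which is why real enforcement likely needs worker processes (celery's approach via billiard) rather than threads:

```python
import queue
import threading

CONCURRENCY = 4  # would come from the --concurrency flag

# Bounded queue so the fetcher can't run arbitrarily far ahead of the workers.
task_queue: "queue.Queue" = queue.Queue(maxsize=CONCURRENCY * 2)
results = []
results_lock = threading.Lock()


def fetch_tasks() -> None:
    # Single thread that pulls tasks (stubbed as integers here) and enqueues
    # them; in the real worker this is also where RPC responses would be sent.
    for i in range(10):
        task_queue.put(i)
    # One poison pill per worker to shut the pool down cleanly.
    for _ in range(CONCURRENCY):
        task_queue.put(None)


def worker() -> None:
    while True:
        task = task_queue.get()
        if task is None:
            break
        # Execute the task; no timeout is enforced here — a hung task would
        # block this thread forever, since Python threads can't be killed.
        with results_lock:
            results.append(task * 2)


fetcher = threading.Thread(target=fetch_tasks)
fetcher.start()
workers = [threading.Thread(target=worker) for _ in range(CONCURRENCY)]
for w in workers:
    w.start()
fetcher.join()
for w in workers:
    w.join()
print(sorted(results))
```

Swapping the thread pool for a `multiprocessing`-based pool would allow terminating a worker that exceeds its time limit, at the cost of serializing tasks across process boundaries.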
