
add multiprocessing support for Cross Encoder#3580

Merged
tomaarsen merged 9 commits into huggingface:main from omkar-334:multi_cross_encoder
Dec 5, 2025

Conversation

@omkar-334
Contributor

@omkar-334 omkar-334 commented Nov 22, 2025

Resolves #3350

Implemented four functions to support multiprocessing, mirroring the implementation in SentenceTransformer.py:

  1. start_multi_process_pool - same as in SentenceTransformer
  2. stop_multi_process_pool - same as in SentenceTransformer
  3. _predict_multi_process - similar to _encode_multi_process
  4. _predict_multi_process_worker - similar to _encode_multi_process_worker

So far, multiprocessing support has been added to both the predict and rank methods of CrossEncoder.py.
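For context, the four methods above follow the standard queue-based multi-process pattern: the main process chunks the input and feeds an input queue, each worker pulls chunks, scores them, and pushes results to an output queue, and the main process reorders the results by chunk id. Here is a minimal, self-contained sketch of that pattern with a dummy scoring function standing in for CrossEncoder.predict; the function and variable names are illustrative only, not the PR's actual API:

```python
import multiprocessing as mp


def _worker(input_queue, output_queue):
    # Each worker pulls (chunk_id, pairs) jobs until it receives None.
    # A real worker would call model.predict on its assigned device here;
    # we use a dummy score (total characters in the pair) instead.
    while True:
        job = input_queue.get()
        if job is None:
            break
        chunk_id, pairs = job
        scores = [len(query) + len(doc) for query, doc in pairs]
        output_queue.put((chunk_id, scores))


def predict_multi_process(pairs, num_workers=2, chunk_size=2):
    # "fork" keeps this sketch simple on Unix; the real implementation
    # uses torch.multiprocessing and per-device workers.
    ctx = mp.get_context("fork")
    input_queue, output_queue = ctx.Queue(), ctx.Queue()
    workers = [
        ctx.Process(target=_worker, args=(input_queue, output_queue))
        for _ in range(num_workers)
    ]
    for w in workers:
        w.start()

    # Split the input into chunks and distribute them over the pool.
    chunks = [pairs[i : i + chunk_size] for i in range(0, len(pairs), chunk_size)]
    for chunk_id, chunk in enumerate(chunks):
        input_queue.put((chunk_id, chunk))

    # Collect one result per chunk, then restore the original order.
    results = sorted(output_queue.get() for _ in chunks)

    # Poison pills shut the workers down cleanly.
    for _ in workers:
        input_queue.put(None)
    for w in workers:
        w.join()

    return [score for _, scores in results for score in scores]


pairs = [
    ("query", "doc one"),
    ("query", "doc two"),
    ("q2", "short"),
    ("q2", "a longer document"),
]
scores = predict_multi_process(pairs)
```

The chunk-id bookkeeping is the important part: output-queue ordering is nondeterministic across workers, so results must be sorted back into input order before being returned.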

I've included code that raises an error if any of the worker processes fails. This can be removed after testing.
I've tested this on macOS, but still need to confirm on a Colab GPU.

@tomaarsen some functions like _encode_multi_process_worker and _encode_multi_process do not have docstrings or parameter documentation. Should I add them in this PR?

@omkar-334 omkar-334 marked this pull request as ready for review November 24, 2025 10:25
@tomaarsen
Member

Well done! This is looking great in my early tests, with some speedups, e.g. for multiprocessing on CPUs. I'll have a deeper look at the code itself later, but I'd like to include this in the next release 🤗

  • Tom Aarsen

@omkar-334
Contributor Author

hey, thank you! I shall include my test results here shortly.
Other than that, are there any issues or features off the top of your head that I can work on? I've scoured through the open issues and most look resolved.
thanks!

This will align better with my goals for the big refactor of huggingface#3554, where these methods will be called _multi_process and _multi_process_worker
@tomaarsen
Member

Other than that, are there any issues or features off the top of your head that I can work on? I've scoured through the open issues and most look resolved.

Apologies for the delay. Indeed, a lot of the simple issues have been taken care of by now. The only extension I can think of is to upgrade mine_hard_negatives with multi-GPU reranking on top of this PR.

I've added some tests, and will be having a closer look as I intend to include this in the upcoming v5.2 release.

  • Tom Aarsen

@tomaarsen
Member

I think this is in a good spot now, what do you think?

  • Tom Aarsen

@omkar-334
Contributor Author

Yes, looks good. I think the two failing tests are because of rate limiting?
Also, can you check out #3517?

thanks

@tomaarsen
Member

Yes, looks good. I think the two failing tests are because of rate limiting?
Also, can you check out #3517?

They are, no worries there.
And I will, I've been focusing on my refactor instead of the PRs/issues lately, apologies.

  • Tom Aarsen

@tomaarsen tomaarsen merged commit 3f80d1c into huggingface:main Dec 5, 2025
15 of 17 checks passed
@omkar-334 omkar-334 deleted the multi_cross_encoder branch December 5, 2025 13:31


Development

Successfully merging this pull request may close these issues.

Hard examples mining with multi GPU
