add multiprocessing support for Cross Encoder#3580
tomaarsen merged 9 commits into huggingface:main from
Conversation
Well done! This is looking great in my early tests, with some speedups for e.g. multi-processing on CPUs. I'll have a deeper look at the code itself later, but I'd like to include this in the next release 🤗
Hey, thank you! I shall include my test results here shortly.
This will align better with my goals for the big refactor of huggingface#3554, where these methods will be called _multi_process and _multi_process_worker.
Apologies for the delay. Indeed, a lot of the simple issues have been taken care of by now. I've added some tests, and will be having a closer look as I intend to include this in the upcoming v5.2 release.
I think this is in a good spot now, what do you think?
Yes, looks good. I think the 2 failing tests are because of rate limiting? Thanks.
They are, no worries there.
Resolves #3350
Implemented 4 functions to support multiprocessing, exactly as done in SentenceTransformer.py:

- start_multi_process_pool - same as before
- stop_multi_process_pool - same as before
- _predict_multi_process - similar to _encode_multi_process
- _predict_multi_process_worker - similar to _encode_multi_process_worker

So far, multiprocessing support has been added to both the predict and rank methods of CrossEncoder.py. I've included code for raising an error if any of the worker processes results in an error. Can remove this after testing.
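The pool/worker design described above (and the error propagation from workers) can be sketched with the standard library alone. This is an illustrative sketch, not the actual CrossEncoder implementation: the names predict_worker and predict_multi_process are hypothetical, and the model's predict call is replaced by a trivial stand-in scoring function. The shape matches the pattern in SentenceTransformer: chunks of inputs go into an input queue, workers push (chunk_id, result) tuples to an output queue, and the parent reassembles results in chunk order.

```python
import multiprocessing as mp


def predict_worker(input_queue, output_queue):
    """Worker loop: score each chunk until a sentinel (None) arrives."""
    while True:
        item = input_queue.get()
        if item is None:  # sentinel: shut down
            break
        chunk_id, pairs = item
        try:
            # Stand-in for a real model.predict(pairs) call:
            # score = len(query) + len(document)
            scores = [len(q) + len(d) for q, d in pairs]
            output_queue.put((chunk_id, scores))
        except Exception as exc:
            # Propagate worker errors to the parent instead of hanging
            output_queue.put((chunk_id, exc))


def predict_multi_process(pairs, num_workers=2, chunk_size=2):
    ctx = mp.get_context("fork")  # use "spawn" on platforms without fork
    input_queue, output_queue = ctx.Queue(), ctx.Queue()
    workers = [
        ctx.Process(target=predict_worker, args=(input_queue, output_queue))
        for _ in range(num_workers)
    ]
    for w in workers:
        w.start()

    # Split the input into chunks and tag each with its position
    chunks = [pairs[i : i + chunk_size] for i in range(0, len(pairs), chunk_size)]
    for chunk_id, chunk in enumerate(chunks):
        input_queue.put((chunk_id, chunk))

    # Collect results; chunks may return out of order, so index by chunk_id
    results = [None] * len(chunks)
    for _ in chunks:
        chunk_id, out = output_queue.get()
        if isinstance(out, Exception):
            raise out  # re-raise a worker error in the parent process
        results[chunk_id] = out

    for _ in workers:
        input_queue.put(None)  # one sentinel per worker
    for w in workers:
        w.join()

    # Flatten per-chunk score lists back into one list
    return [score for chunk in results for score in chunk]


pairs = [("query", "a document"), ("query", "bb"), ("q", "c")]
print(predict_multi_process(pairs))  # → [15, 7, 2]
```

The chunk_id tagging is what lets the parent restore input order even though workers finish in arbitrary order, and putting the exception itself on the output queue is one simple way to surface worker failures rather than deadlocking on a result that never arrives.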
I've tested this on macOS, but need to test on Colab GPU and confirm.

@tomaarsen some functions like _encode_multi_process_worker and _encode_multi_process do not have docstrings and params. Should I add them in this PR itself?