What happened:
An exception raised by one of a Nanny's plugins= no longer causes the dask-worker process to terminate after #5910; instead, the process hangs indefinitely.
What you expected to happen:
The dask-worker process should terminate.
Minimal Complete Verifiable Example:
Unfortunately, there is no way to trigger this from the dask-worker CLI today, but the patch below can be applied to reproduce it:
diff --git a/distributed/cli/dask_worker.py b/distributed/cli/dask_worker.py
index 376f2a1c..4f4172a9 100755
--- a/distributed/cli/dask_worker.py
+++ b/distributed/cli/dask_worker.py
@@ -36,6 +36,11 @@ logger = logging.getLogger("distributed.dask_worker")
pem_file_option_type = click.Path(exists=True, resolve_path=True)
+class MyPlugin:
+ def setup(self, workers=None):
+ raise ValueError("Kill dask-worker")
+
+
@click.command(context_settings=dict(ignore_unknown_options=True))
@click.argument("scheduler", type=str, required=False)
@click.option(
@@ -444,6 +449,7 @@ def main(
host=host,
dashboard=dashboard,
dashboard_address=dashboard_address,
+ plugins={MyPlugin()},
name=name
if n_workers == 1 or name is None or name == ""
 else str(name) + "-" + str(i),
To reproduce, run the following commands:
# Scheduler
$ dask-scheduler
# Worker
$ dask-worker --nanny 127.0.0.1:8786
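The expected behavior can be sketched without distributed at all: an exception raised during a plugin's setup should propagate out of the worker's startup and terminate the process with a non-zero exit code, rather than leaving it hanging. This is a minimal, self-contained illustration of that contract; the names (start_worker, FailingPlugin) are hypothetical and not part of the distributed API:

```python
import asyncio


class FailingPlugin:
    """Stand-in for a worker/nanny plugin whose setup raises."""

    def setup(self, worker=None):
        raise ValueError("Kill worker")


async def start_worker(plugins):
    # Run each plugin's setup during startup. A failure here should
    # propagate; the bug report is that after #5910 the real process
    # hangs instead of exiting.
    for plugin in plugins:
        result = plugin.setup(worker=None)
        if asyncio.iscoroutine(result):
            await result


def main():
    try:
        asyncio.run(start_worker([FailingPlugin()]))
    except ValueError as exc:
        # Expected path: the setup exception surfaces and the process
        # exits non-zero instead of hanging.
        print(f"worker terminated: {exc}")
        return 1
    return 0


if __name__ == "__main__":
    raise SystemExit(main())
```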
Anything else we need to know?:
This behavior is relied upon in Dask-CUDA. #5910 caused https://github.com/rapidsai/dask-cuda/blob/7de73c72ca52239c6af87e483a20af3c8896bf0d/dask_cuda/tests/test_dask_cuda_worker.py#L220-L228 to hang indefinitely.
Environment:
- Dask version: 2022.5.0+0.gc11c8ee4
- Python version: 3.8
- Operating System: Linux
- Install method (conda, pip, source): source