Conversation
This permits servers to let "terminate" comms persist while shutting down. Closes dask#6413
Unit Test Results: 15 files ±0, 15 suites ±0, 6h 41m 50s ⏱️ (-3m 45s). Results for commit 2551272, compared against base commit 9bb999d.
```python
assert w.status in {
    Status.closing,
    Status.closed,
    Status.failed,
}, w.status
```
This is because `Scheduler.close` no longer uses an explicit comm to close the remote worker. This behavior was introduced in #6363.
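Since the scheduler no longer drives the worker shutdown over a comm, the exact final worker state is timing-dependent, which is why the test accepts any terminal-ish state. A minimal self-contained sketch of that relaxed check, using an illustrative stand-in for distributed's `Status` enum (only a subset of its members):

```python
from enum import Enum


class Status(Enum):
    # Illustrative subset of distributed's Status enum
    running = "running"
    closing = "closing"
    closed = "closed"
    failed = "failed"


def assert_terminal(status: Status) -> None:
    # Accept any terminal-ish state: with no explicit close comm,
    # the worker may be mid-shutdown, shut down, or failed by the
    # time the test checks it.
    assert status in {Status.closing, Status.closed, Status.failed}, status


assert_terminal(Status.closed)  # passes; Status.running would raise
```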
Seems to be fast on subsequent close calls:

```python
In [1]: from dask.distributed import LocalCluster

In [2]: %%time
   ...: with LocalCluster(processes=False, n_workers=3):
   ...:     pass
   ...:
CPU times: user 853 ms, sys: 120 ms, total: 972 ms
Wall time: 2.29 s

In [3]: %%time
   ...: with LocalCluster(processes=False, n_workers=3):
   ...:     pass
   ...:
CPU times: user 106 ms, sys: 20.3 ms, total: 127 ms
Wall time: 132 ms
```

👍
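The `%%time` cells above only work in IPython. Outside of it, the same first-call-slow / later-calls-fast pattern can be measured with `time.perf_counter`. A minimal self-contained sketch, using a hypothetical `dummy_cluster` in place of `LocalCluster` so it runs without dask installed (the simulated one-off warm-up cost stands in for the first cluster's import/startup overhead):

```python
import time
from contextlib import contextmanager

_warmed_up = False  # module-level flag simulating one-time warm-up state


@contextmanager
def dummy_cluster():
    # Hypothetical stand-in for LocalCluster: the first entry pays a
    # one-off warm-up cost; subsequent entries skip it.
    global _warmed_up
    if not _warmed_up:
        time.sleep(0.05)  # simulate startup overhead (imports, threads, ...)
        _warmed_up = True
    yield


def timed(cm):
    # Time entering and exiting the context manager, like %%time does.
    start = time.perf_counter()
    with cm():
        pass
    return time.perf_counter() - start


first = timed(dummy_cluster)
second = timed(dummy_cluster)
assert second < first  # later runs skip the warm-up cost
```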
I'm surprised to see the first one taking that long. In my tests, the first call took about 150 ms and the second about 30 ms.
Timing for me:

```python
Python 3.10.4 | packaged by conda-forge | (main, Mar 24 2022, 17:42:03) [Clang 12.0.1]
Type 'copyright', 'credits' or 'license' for more information
IPython 8.2.0 -- An enhanced Interactive Python. Type '?' for help.

In [1]: from dask.distributed import LocalCluster

In [2]: %%time
   ...: with LocalCluster(processes=False, n_workers=0):
   ...:     pass
   ...:
CPU times: user 237 ms, sys: 52.1 ms, total: 289 ms
Wall time: 358 ms

In [3]: %%time
   ...: with LocalCluster(processes=False, n_workers=0):
   ...:     pass
   ...:
CPU times: user 20.5 ms, sys: 2.52 ms, total: 23 ms
Wall time: 22.9 ms
```

I just realized that in #6415 (comment) there are three workers. Maybe the first call is slow enough that a worker process actually comes up, such that closing takes longer.
Woot
This supersedes #6414
Closes #6413
xref: closes #6365
Follow-up to #4805, where I dealt with other ongoing handlers but didn't special-case terminate/close.
This offers a few benefits over #6414.
cc @mrocklin @hendrikmakait
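The special-casing of terminate/close described above can be sketched with plain asyncio. This is an illustrative toy model, not distributed's actual `Server` implementation: the `MiniServer` class, handler names, and close logic here are all assumptions. The idea is that on shutdown, in-flight handlers are cancelled except those on an allow-list (here just `"terminate"`), which are awaited so the remote side gets its reply:

```python
import asyncio


class MiniServer:
    """Toy model of a server that tracks in-flight handlers as tasks.
    On close, everything is cancelled except handlers allowed to
    persist during shutdown, which are awaited to completion."""

    ALLOWED_DURING_CLOSE = {"terminate"}

    def __init__(self):
        self._handlers = {}  # task -> operation name

    def run_handler(self, name, coro):
        task = asyncio.ensure_future(coro)
        self._handlers[task] = name
        return task

    async def close(self):
        items = list(self._handlers.items())
        keep = [t for t, name in items if name in self.ALLOWED_DURING_CLOSE]
        drop = [t for t, name in items if name not in self.ALLOWED_DURING_CLOSE]
        for task in drop:
            task.cancel()
        await asyncio.gather(*keep)  # let terminate finish cleanly
        await asyncio.gather(*drop, return_exceptions=True)  # reap the rest


async def main():
    server = MiniServer()
    results = []

    async def terminate():
        await asyncio.sleep(0.01)
        results.append("terminated cleanly")

    async def long_running():
        await asyncio.sleep(10)
        results.append("should never happen")

    server.run_handler("terminate", terminate())
    compute_task = server.run_handler("compute", long_running())
    await asyncio.sleep(0)  # give both handlers a chance to start
    await server.close()
    return results, compute_task.cancelled()


results, cancelled = asyncio.run(main())
assert results == ["terminated cleanly"]
assert cancelled
```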