Added resource stealing benchmark #22

Merged
TomAugspurger merged 6 commits into dask:master from TomAugspurger:3069-task-stealing
Nov 7, 2019

Conversation

@TomAugspurger
Member

This benchmarks the improvement from dask/distributed#3069.

Will verify that we have a benchmark for the common case of no restrictions.

@TomAugspurger TomAugspurger merged commit cf4faa8 into dask:master Nov 7, 2019
@TomAugspurger TomAugspurger deleted the 3069-task-stealing branch November 7, 2019 21:15

def setup(self):
    cluster = LocalCluster(n_workers=1, threads_per_worker=1,
                           resources={"resource": 1}, worker_class=Worker)
Member

For speed reasons we might want to use processes=False in cases like this. It also helps considerably with cleanup.
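A minimal sketch of that suggestion, assuming a recent distributed release; with everything in-process, the `worker_class=Worker` argument from the original snippet is no longer needed:

```python
from distributed import LocalCluster

# processes=False keeps the scheduler and workers in the calling
# process (threads rather than subprocesses), so startup is faster
# and teardown is more reliable for benchmark setup/teardown cycles.
cluster = LocalCluster(n_workers=1, threads_per_worker=1,
                       processes=False,
                       resources={"resource": 1})
n_workers = len(cluster.workers)
cluster.close()
```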

cluster = LocalCluster(n_workers=1, threads_per_worker=1,
                       resources={"resource": 1}, worker_class=Worker)
spec = copy.deepcopy(cluster.new_worker_spec())
del spec[1]['options']['resources']
Member

I encourage folks to take a look at using SpecCluster directly. It's not terrible to use manually, and it makes this sort of thing more explicit.
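For reference, a hedged sketch of what that could look like: the `"cls"`/`"options"` spec keys follow the distributed `SpecCluster` API, while the worker names (`"constrained"`, `"plain"`) are purely illustrative:

```python
from distributed import Scheduler, SpecCluster, Worker

# One worker with the resource and one without: the spec states the
# difference explicitly instead of deep-copying and patching the
# output of new_worker_spec().
workers = {
    "constrained": {"cls": Worker,
                    "options": {"nthreads": 1,
                                "resources": {"resource": 1}}},
    "plain": {"cls": Worker, "options": {"nthreads": 1}},
}
scheduler = {"cls": Scheduler, "options": {}}
cluster = SpecCluster(workers=workers, scheduler=scheduler)
n_specs = len(cluster.worker_spec)
cluster.close()
```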

workers = list(info['workers'])
futures = client.map(slowinc, range(10),
                     delay=0.1, resources={"resource": 1})
client.cluster.scale(len(workers) + 1)
Member

The 100 ms delay here is likely to dwarf the task overhead we're trying to measure, which is probably in the 10 µs range.
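The concern can be illustrated without dask at all. A stdlib-only sketch (with a 1 ms sleep standing in for the 100 ms `slowinc` delay, to keep it quick) showing how a per-task delay swamps microsecond-scale overhead:

```python
import time

def per_call_cost(fn, n=20):
    # Average wall-clock time of n calls to fn.
    start = time.perf_counter()
    for _ in range(n):
        fn()
    return (time.perf_counter() - start) / n

cheap = per_call_cost(lambda: sum(range(100)))   # microsecond-scale "task"
slow = per_call_cost(lambda: time.sleep(0.001))  # deliberate 1 ms delay
ratio = slow / cheap
```

With the delay hundreds of times larger than the work itself, any change in scheduling overhead disappears into the noise, which is why a cheap task is preferable for this benchmark.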
