Automatically remove run-benchmarks label once benchmarks are executed #6929
Merged
jni merged 1 commit into napari:main on May 27, 2024
Conversation
Codecov Report
All modified and coverable lines are covered by tests ✅

Additional details and impacted files

@@            Coverage Diff             @@
##             main    #6929      +/-   ##
==========================================
- Coverage   92.48%   92.43%   -0.06%
==========================================
  Files         612      612
  Lines       55165    55165
==========================================
- Hits        51018    50990      -28
- Misses       4147     4175      +28

☔ View full report in Codecov by Sentry.
brisvag approved these changes on May 24, 2024
jni approved these changes on May 27, 2024
andy-sweet pushed a commit to andy-sweet/napari that referenced this pull request on May 28, 2024
Description
Iterating on a performance PR that needs benchmarks is currently more cumbersome than it needs to be: add the run-benchmarks label, get a performance report, act on the report (i.e., make code changes and push); then, to run the benchmarks again, a core dev must manually remove the label and immediately re-add it. This PR automates removal of the label once the benchmarks have finished running and the report has been generated. The label is also removed even if the benchmarks fail due to a bug in the code or in the benchmark code.
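The automated cleanup described above can be sketched as a GitHub Actions job. This is a minimal illustration rather than the actual workflow merged in this PR: the job name and the `benchmarks` dependency are hypothetical, and the `if: always()` condition is what makes the label come off even when the benchmark job fails.

```yaml
# Hypothetical sketch: remove the run-benchmarks label once the
# benchmark job has finished, whether it succeeded or failed.
# Job and workflow names are illustrative, not the PR's exact code.
jobs:
  remove-run-benchmarks-label:
    if: always()               # run even when the benchmarks job failed
    needs: benchmarks          # hypothetical name of the benchmark job
    runs-on: ubuntu-latest
    steps:
      - uses: actions/github-script@v7
        with:
          script: |
            await github.rest.issues.removeLabel({
              owner: context.repo.owner,
              repo: context.repo.repo,
              issue_number: context.issue.number,
              name: 'run-benchmarks',
            });
```

With a job like this, re-running the benchmarks needs only one manual action (re-adding the label) instead of a remove-then-add cycle.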