feat(pt): add eta message for pt backend #4725
Conversation
Pull Request Overview
This PR adds an Estimated Time of Arrival (ETA) message for the PyTorch backend’s training logs and aligns the related formatting functions across different training modules.
- Added ETA computation and logging in `deepmd/pt/train/training.py`.
- Updated `deepmd/pd/train/training.py` to replace the deprecated `_step_id` with `display_step_id` and remove a `-1` offset in the ETA calculation.
- Modified `deepmd/loggers/training.py` to support an optional ETA parameter in the training message.
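A minimal sketch of an ETA computation of the kind this PR describes. The names `num_steps`, `disp_freq`, and `display_step_id` come from the diff context quoted in this review; the helper function itself is hypothetical and may differ from the actual implementation:

```python
def estimate_eta(train_time: float, display_step_id: int,
                 num_steps: int, disp_freq: int) -> str:
    """Rough ETA: remaining display intervals times the wall time
    of the most recent disp_freq-step interval."""
    remaining = (num_steps - display_step_id) / disp_freq * train_time
    hours, rest = divmod(int(remaining), 3600)
    minutes, seconds = divmod(rest, 60)
    return f"{hours:d}:{minutes:02d}:{seconds:02d}"
```

For example, with 900 steps left, a display frequency of 100, and 2 s per interval, this yields an ETA of 18 seconds, formatted as `0:00:18`.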
Reviewed Changes
Copilot reviewed 3 out of 3 changed files in this pull request and generated no comments.
| File | Description |
|---|---|
| deepmd/pt/train/training.py | Adds ETA calculation and includes it in log messages. |
| deepmd/pd/train/training.py | Updates ETA calculation by replacing _step_id with display_step_id and removing the -1 offset. |
| deepmd/loggers/training.py | Updates format_training_message to optionally include ETA information. |
Comments suppressed due to low confidence (1)
deepmd/pd/train/training.py:850
- The removal of the '-1' offset in the ETA calculation changes the expected result. Please confirm if this off-by-one adjustment was intentional.
(self.num_steps - display_step_id) / self.disp_freq * train_time
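One plausible explanation for why removing the `-1` is intentional (assuming `display_step_id` is the 1-based counterpart of the 0-based `_step_id`, which the rename suggests but the excerpt here does not confirm): the old and new expressions are then numerically identical.

```python
num_steps, disp_freq, train_time = 1000, 100, 2.0
_step_id = 99                   # hypothetical 0-based step counter
display_step_id = _step_id + 1  # its 1-based counterpart

old = (num_steps - _step_id - 1) / disp_freq * train_time
new = (num_steps - display_step_id) / disp_freq * train_time
assert old == new  # identical whenever display_step_id == _step_id + 1
```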
Walkthrough
The changes introduce an optional ETA (estimated time of arrival) display to training progress logs.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Trainer
    participant Logger
    Trainer->>Logger: format_training_message(batch, wall_time, eta)
    Logger-->>Trainer: Formatted message with optional ETA
    Trainer->>Console: Output training progress message
```
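A hedged sketch of how `format_training_message` might accept an optional ETA, per the sequence above. The signature and exact output format are assumptions for illustration, not the PR's actual code:

```python
from typing import Optional

def format_training_message(batch: int, wall_time: float,
                            eta: Optional[int] = None) -> str:
    """Build a training progress line; append ETA only when provided."""
    msg = f"batch {batch:7d}: total wall time = {wall_time:.2f} s"
    if eta is not None:
        hours, rest = divmod(int(eta), 3600)
        minutes, seconds = divmod(rest, 60)
        msg += f", eta = {hours:d}:{minutes:02d}:{seconds:02d}"
    return msg
```

Making `eta` default to `None` keeps existing call sites working unchanged, which matches the review's note that the parameter is optional.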
Codecov Report
✅ All modified and coverable lines are covered by tests.

```
@@ Coverage Diff @@
##            devel   #4725      +/-   ##
==========================================
- Coverage   84.81%  84.81%   -0.01%
==========================================
  Files         696     696
  Lines       67264   67262       -2
  Branches     3541    3540       -1
==========================================
- Hits        57047   57045       -2
  Misses       9085    9085
  Partials     1132    1132
```

View full report in Codecov by Sentry.
Add an Estimated Time of Arrival (ETA) message for the PyTorch backend, which is convenient when training a model. To avoid affecting performance, no synchronization was used for timing, so the estimate will not be very precise.
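The precision trade-off mentioned above can be illustrated with a small timing helper. This is a hypothetical sketch, not the PR's code: because CUDA kernels launch asynchronously, measuring host-side wall time without a device synchronization (e.g. `torch.cuda.synchronize()`) is cheap but only approximate.

```python
import time

def timed_interval(last_time: float) -> tuple[float, float]:
    """Return (elapsed, now) using host-side wall time only.

    No device synchronization is performed, so with asynchronous GPU
    execution the elapsed time may not reflect when kernels actually
    finished -- trading precision for zero synchronization overhead.
    """
    now = time.perf_counter()
    return now - last_time, now
```

Inserting a synchronization before the measurement would make the interval exact but stall the GPU pipeline at every display step, which is what the author avoided.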