doc: fix inconsistency between the docstring and the implementation of argument auto_batch_size of DeepEval with paddle and pytorch backend
#4865
Conversation
📝 Walkthrough
The documented default value of the `auto_batch_size` argument of `DeepEval` is changed from `False` to `True` in the PyTorch, Paddle, and dpmodel backends, so that the docstrings match the implementation.

Estimated code review effort: 🎯 1 (Trivial) | ⏱️ ~2 minutes
Pull Request Overview
This PR fixes a documentation inconsistency by updating the documented default value of the `auto_batch_size` parameter in the `DeepEval` class docstrings across three backend implementations.
- Updates the docstring default value from `False` to `True` for the `auto_batch_size` parameter (a sketch of the corrected pattern follows this list)
- Ensures consistency between the documentation and the actual implementation behavior
- Applies the fix across the PyTorch, Paddle, and dpmodel backends
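For context, here is a minimal, hypothetical sketch of the pattern being corrected. It is not the verbatim deepmd-kit source: the real `DeepEval` constructors take more arguments and a richer `auto_batch_size` type, but the point of the PR is simply that the docstring now states the same default that the signature already uses.

```python
# Illustrative sketch only -- not the actual deepmd-kit implementation.
class DeepEval:
    def __init__(
        self,
        model_file: str,
        *args,
        auto_batch_size: bool = True,  # the implementation default was already True
        **kwargs,
    ) -> None:
        """Evaluate a Deep Potential model.

        Parameters
        ----------
        model_file : str
            Path to the frozen model file.
        auto_batch_size : bool, optional
            Whether to select the evaluation batch size automatically.
            Default: True (the docstring previously claimed False).
        """
        self.auto_batch_size = auto_batch_size
```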
Reviewed Changes
Copilot reviewed 3 out of 3 changed files in this pull request and generated no comments.
| File | Description |
|---|---|
| deepmd/pt/infer/deep_eval.py | Updates docstring default value for auto_batch_size parameter in PyTorch backend |
| deepmd/pd/infer/deep_eval.py | Updates docstring default value for auto_batch_size parameter in Paddle backend |
| deepmd/dpmodel/infer/deep_eval.py | Updates docstring default value for auto_batch_size parameter in dpmodel backend |
pre-commit.ci applied automatic fixes (for more information, see https://pre-commit.ci)
Codecov Report — ✅ All modified and coverable lines are covered by tests.

Additional details and impacted files:

@@            Coverage Diff             @@
##            devel    #4865      +/-   ##
==========================================
- Coverage   84.79%   84.71%   -0.08%
==========================================
  Files         698      699       +1
  Lines       67817    68126     +309
  Branches     3541     3541
==========================================
+ Hits        57505    57716     +211
- Misses       9178     9275      +97
- Partials     1134     1135       +1

☔ View full report in Codecov by Sentry.
Commit 102a2ed:
doc: fix inconsistency between the docstring and the implementation of argument `auto_batch_size` of `DeepEval` with paddle and pytorch backend (deepmodeling#4865)

Summary by CodeRabbit

* New Features
  * Automatic batch size handling is now enabled by default when creating a new DeepEval instance.
* Documentation
  * Improved formatting consistency in model information and multi-task training guides by removing unnecessary blank lines.

Co-authored-by: glwan <wanguolin@dp.tech>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
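As a usage illustration of the documented behavior (a hedged sketch, not taken from the PR): assuming the high-level `DeepPot` evaluator exposes `auto_batch_size` and forwards it to `DeepEval`, as in recent deepmd-kit releases, callers get automatic batching unless they opt out. The model filename, atom types, and coordinates below are placeholders.

```python
# Hypothetical usage sketch; "frozen_model.pth" and the inputs are placeholders.
import numpy as np
from deepmd.infer import DeepPot

dp = DeepPot("frozen_model.pth")                               # auto_batch_size defaults to True
dp_fixed = DeepPot("frozen_model.pth", auto_batch_size=False)  # opt out of automatic batching

coords = np.random.rand(1, 9)  # one frame, three atoms, flattened xyz coordinates
cells = None                   # no periodic boundary conditions
atom_types = [0, 0, 1]
energy, force, virial = dp.eval(coords, cells, atom_types)
```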