fix: get correct intensive property prediction when using virtual atoms #4869
Conversation
📝 Walkthrough

Propagates an optional mask through model forward paths and output transformation so intensive, reducible outputs use a masked mean (sum / sum(mask)) when a mask is present; adds small DPModel helper methods and introduces PyTorch, NumPy, and JAX tests validating padding invariance.
Sequence Diagram(s)

sequenceDiagram
participant Caller
participant Model
participant AtomicNet
participant Transform
Caller->>Model: forward(input)
Model->>AtomicNet: forward_common_atomic(...)
AtomicNet-->>Model: atomic_ret (outputs, maybe mask)
Model->>Transform: fit_output_to_model_output(fit_ret, fit_output_def, coord_ext, ..., mask=atomic_ret["mask"] or None)
Transform->>Transform: if vdef.intensive and mask -> sum(v, axis=atoms)/sum(mask, axis=-1, keepdims=True)
Transform->>Transform: else if vdef.intensive -> mean(v, axis=atoms)
Transform-->>Model: reduced outputs
Model-->>Caller: final outputs
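The masked-mean branch shown in the diagram can be sketched in NumPy (a minimal illustration of the reduction logic only; the function name is made up and this is not the actual deepmd-kit code):

```python
from typing import Optional

import numpy as np


def reduce_intensive(atom_prop: np.ndarray, mask: Optional[np.ndarray]) -> np.ndarray:
    """Reduce per-atom contributions of an intensive property over atoms.

    atom_prop: (nframes, natoms, 1); mask: (nframes, natoms),
    1.0 for real atoms, 0.0 for virtual/padding atoms.
    """
    if mask is not None:
        # masked mean: only real atoms enter the denominator
        return np.sum(atom_prop, axis=1) / np.sum(mask, axis=-1, keepdims=True)
    # no mask: plain mean over all atoms
    return np.mean(atom_prop, axis=1)


# two real atoms contributing 2.0 each, two virtual atoms contributing 0.0
prop = np.array([[[2.0], [2.0], [0.0], [0.0]]])
mask = np.array([[1.0, 1.0, 0.0, 0.0]])
print(reduce_intensive(prop, mask))  # [[2.]] rather than the unmasked [[1.]]
```

With the mask, the denominator is the number of real atoms (2), so the result is 2.0; without it, all four atoms are counted and the result wrongly drops to 1.0.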
njzjz
left a comment
Please make the same change to other backends
Actionable comments posted: 0
🧹 Nitpick comments (1)
deepmd/pt/model/model/make_model.py (1)
302-302: Use the `.get()` method for cleaner code. The functionality is correct, but the conditional expression can be simplified using the `.get()` method:

    - mask=atomic_ret["mask"] if "mask" in atomic_ret else None,
    + mask=atomic_ret.get("mask", None),
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- deepmd/pt/model/model/make_model.py (1 hunks)
- deepmd/pt/model/model/transform_output.py (2 hunks)
- source/tests/pt/test_dp_test.py (2 hunks)
🧰 Additional context used
🧠 Learnings (2)
📚 Learning: when reviewing prs, avoid making refactor suggestions that are not directly related to the pr's chan...
Learnt from: njzjz
PR: deepmodeling/deepmd-kit#4226
File: deepmd/dpmodel/atomic_model/base_atomic_model.py:202-202
Timestamp: 2024-10-16T21:49:57.401Z
Learning: When reviewing PRs, avoid making refactor suggestions that are not directly related to the PR's changes. For example, in `deepmd/dpmodel/atomic_model/base_atomic_model.py`, do not suggest simplifying `for kk in ret_dict.keys()` to `for kk in ret_dict` unless it's relevant to the PR.
Applied to files:
- deepmd/pt/model/model/make_model.py
- deepmd/pt/model/model/transform_output.py
📚 Learning: refactoring between test classes `testinferdeeppotdpapt` and `testinferdeeppotdpaptnopbc` is address...
Learnt from: njzjz
PR: deepmodeling/deepmd-kit#4144
File: source/api_cc/tests/test_deeppot_dpa_pt.cc:166-246
Timestamp: 2024-09-19T04:25:12.408Z
Learning: Refactoring between test classes `TestInferDeepPotDpaPt` and `TestInferDeepPotDpaPtNopbc` is addressed in PR #3905.
Applied to files:
source/tests/pt/test_dp_test.py
🪛 Ruff (0.12.2)
deepmd/pt/model/model/make_model.py
302-302: Use atomic_ret.get("mask", None) instead of an if block
Replace with atomic_ret.get("mask", None)
(SIM401)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (29)
- GitHub Check: Test Python (3, 3.9)
- GitHub Check: Test Python (4, 3.12)
- GitHub Check: Test Python (5, 3.12)
- GitHub Check: Test Python (1, 3.12)
- GitHub Check: Test Python (5, 3.9)
- GitHub Check: Test Python (6, 3.12)
- GitHub Check: Test Python (2, 3.9)
- GitHub Check: Test Python (2, 3.12)
- GitHub Check: Test Python (3, 3.12)
- GitHub Check: Test Python (6, 3.9)
- GitHub Check: Test Python (1, 3.9)
- GitHub Check: Test Python (4, 3.9)
- GitHub Check: Test C++ (true)
- GitHub Check: Test C++ (false)
- GitHub Check: Build wheels for cp310-manylinux_aarch64
- GitHub Check: Build wheels for cp311-win_amd64
- GitHub Check: Build wheels for cp311-macosx_x86_64
- GitHub Check: Build wheels for cp311-manylinux_x86_64
- GitHub Check: Build wheels for cp311-macosx_arm64
- GitHub Check: Build wheels for cp311-manylinux_x86_64
- GitHub Check: Analyze (python)
- GitHub Check: Analyze (c-cpp)
- GitHub Check: Build C++ (cpu, cpu)
- GitHub Check: Build C++ (cuda, cuda)
- GitHub Check: Build C++ (clang, clang)
- GitHub Check: Build C++ (cuda120, cuda)
- GitHub Check: Build C++ (rocm, rocm)
- GitHub Check: Build C library (2.14, >=2.5.0,<2.15, libdeepmd_c_cu11.tar.gz)
- GitHub Check: Build C library (2.18, libdeepmd_c.tar.gz)
🔇 Additional comments (4)
source/tests/pt/test_dp_test.py (2)
16-16: Good addition for tensor padding operations. The import of `torch.nn.functional` as `F` is appropriately added to support the tensor padding operations in the new test.

190-210: Excellent test coverage for padding atoms handling. This test effectively validates the PR objective by:
- Testing multiple padding sizes (1, 5, 10) for robustness
- Using appropriate padding values (-1 for atom types, 0 for coordinates)
- Ensuring model outputs remain consistent when padding atoms are added
- Directly testing the intensive property prediction fix
The test logic correctly verifies that virtual/padding atoms do not affect the model's intensive property calculations.
deepmd/pt/model/model/transform_output.py (2)
161-161: Well-designed function signature enhancement. The addition of the optional `mask` parameter maintains backward compatibility while enabling the new functionality to handle virtual/padding atoms correctly.

176-189: Excellent implementation of masked mean for intensive properties. This implementation correctly addresses the PR objective by:

- Proper detection: checks whether the mask contains padding atoms (`mask == 0.0`)
- Correct averaging: computes the mean only over real atoms (`mask[ii]`) per batch element
- Fallback behavior: uses the original mean calculation when no padding is present
- Type safety: converts the mask to boolean dtype for proper tensor indexing
The logic ensures that intensive properties like energy per atom are calculated correctly by excluding virtual atoms from the averaging process, which was the core issue described in the PR.
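The per-batch averaging described above can be illustrated with a standalone NumPy sketch (hypothetical function name; not the actual `transform_output.py` code, which operates on torch tensors):

```python
import numpy as np


def masked_mean_per_frame(v: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Mean of per-atom values over real atoms only, frame by frame.

    v: (nframes, natoms, odim); mask: (nframes, natoms), 0.0 marks padding.
    """
    bmask = mask.astype(bool)  # boolean dtype for indexing
    out = np.empty((v.shape[0], v.shape[2]))
    for ii in range(v.shape[0]):
        # select the real atoms of frame ii, then average only those
        out[ii] = v[ii][bmask[ii]].mean(axis=0)
    return out


v = np.array([[[3.0], [1.0], [0.0]]])  # one frame, last atom is padding
mask = np.array([[1.0, 1.0, 0.0]])
print(masked_mean_per_frame(v, mask))  # [[2.]]
```

The per-frame loop handles batches where different frames have different numbers of real atoms, which a single vectorized mean over the atom axis cannot.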
Actionable comments posted: 0
🧹 Nitpick comments (1)
deepmd/pt/model/model/make_model.py (1)
302-302: LGTM! Consider using `dict.get()` for cleaner code. The mask parameter correctly enables exclusion of virtual atoms from intensive property calculations. The static analysis suggestion is valid:

    - mask=atomic_ret["mask"] if "mask" in atomic_ret else None,
    + mask=atomic_ret.get("mask", None),
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- deepmd/pt/model/model/make_model.py (1 hunks)
- deepmd/pt/model/model/transform_output.py (2 hunks)
- source/tests/pt/test_dp_test.py (2 hunks)
🧰 Additional context used
🧠 Learnings (3)
📓 Common learnings
Learnt from: njzjz
PR: deepmodeling/deepmd-kit#4226
File: deepmd/dpmodel/atomic_model/base_atomic_model.py:202-202
Timestamp: 2024-10-16T21:49:57.401Z
Learning: When reviewing PRs, avoid making refactor suggestions that are not directly related to the PR's changes. For example, in `deepmd/dpmodel/atomic_model/base_atomic_model.py`, do not suggest simplifying `for kk in ret_dict.keys()` to `for kk in ret_dict` unless it's relevant to the PR.
Learnt from: njzjz
PR: deepmodeling/deepmd-kit#4144
File: source/api_cc/tests/test_deeppot_dpa_pt.cc:166-246
Timestamp: 2024-09-19T04:25:12.408Z
Learning: Refactoring between test classes `TestInferDeepPotDpaPt` and `TestInferDeepPotDpaPtNopbc` is addressed in PR #3905.
Learnt from: njzjz
PR: deepmodeling/deepmd-kit#4144
File: source/api_cc/tests/test_deeppot_dpa_pt.cc:166-246
Timestamp: 2024-10-08T15:32:11.479Z
Learning: Refactoring between test classes `TestInferDeepPotDpaPt` and `TestInferDeepPotDpaPtNopbc` is addressed in PR #3905.
📚 Learning: when reviewing prs, avoid making refactor suggestions that are not directly related to the pr's chan...
Learnt from: njzjz
PR: deepmodeling/deepmd-kit#4226
File: deepmd/dpmodel/atomic_model/base_atomic_model.py:202-202
Timestamp: 2024-10-16T21:49:57.401Z
Learning: When reviewing PRs, avoid making refactor suggestions that are not directly related to the PR's changes. For example, in `deepmd/dpmodel/atomic_model/base_atomic_model.py`, do not suggest simplifying `for kk in ret_dict.keys()` to `for kk in ret_dict` unless it's relevant to the PR.
Applied to files:
- deepmd/pt/model/model/make_model.py
- deepmd/pt/model/model/transform_output.py
📚 Learning: refactoring between test classes `testinferdeeppotdpapt` and `testinferdeeppotdpaptnopbc` is address...
Learnt from: njzjz
PR: deepmodeling/deepmd-kit#4144
File: source/api_cc/tests/test_deeppot_dpa_pt.cc:166-246
Timestamp: 2024-09-19T04:25:12.408Z
Learning: Refactoring between test classes `TestInferDeepPotDpaPt` and `TestInferDeepPotDpaPtNopbc` is addressed in PR #3905.
Applied to files:
source/tests/pt/test_dp_test.py
🪛 Ruff (0.12.2)
deepmd/pt/model/model/make_model.py
302-302: Use atomic_ret.get("mask", None) instead of an if block
Replace with atomic_ret.get("mask", None)
(SIM401)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (15)
- GitHub Check: Test Python (3, 3.9)
- GitHub Check: Test Python (4, 3.12)
- GitHub Check: Test Python (5, 3.12)
- GitHub Check: Test Python (1, 3.12)
- GitHub Check: Test Python (5, 3.9)
- GitHub Check: Test Python (6, 3.12)
- GitHub Check: Test Python (2, 3.9)
- GitHub Check: Test Python (2, 3.12)
- GitHub Check: Test Python (3, 3.12)
- GitHub Check: Test Python (6, 3.9)
- GitHub Check: Test Python (1, 3.9)
- GitHub Check: Test Python (4, 3.9)
- GitHub Check: Test C++ (true)
- GitHub Check: Test C++ (false)
- GitHub Check: Analyze (c-cpp)
🔇 Additional comments (4)
source/tests/pt/test_dp_test.py (2)
16-16: LGTM! The import is necessary for the padding operations in the new test method.

190-210: Excellent test coverage for the virtual atoms fix. This test effectively validates that the model output remains invariant when padding atoms are added, which directly confirms the fix for intensive property calculations. The test:
- Uses proper padding values (-1 for atom types, 0 for coordinates)
- Tests multiple padding sizes for robustness
- Correctly verifies output invariance
deepmd/pt/model/model/transform_output.py (2)
161-161: LGTM! The optional mask parameter is correctly added with proper type annotation and default value.

176-189: Perfect implementation of the intensive property fix. This code correctly addresses the core issue described in the PR:
- Proper padding detection: Checks for mask existence and padding atoms (mask == 0.0)
- Correct mean calculation: Computes mean only over real atoms per batch element
- Maintains compatibility: Falls back to original behavior when no masking is needed
The fix ensures that intensive properties are calculated as (sum of real atom contributions) / (number of real atoms), excluding virtual atoms from the denominator as intended.
Codecov Report — ❌ Patch coverage is

Additional details and impacted files

@@ Coverage Diff @@
## devel #4869 +/- ##
=======================================
Coverage 84.29% 84.29%
=======================================
Files 702 702
Lines 68648 68666 +18
Branches 3573 3573
=======================================
+ Hits 57866 57882 +16
- Misses 9642 9643 +1
- Partials 1140 1141 +1

☔ View full report in Codecov by Sentry.
I have changed the code in
Actionable comments posted: 0
🧹 Nitpick comments (2)
deepmd/dpmodel/model/make_model.py (1)
358-358: LGTM! Correct implementation of mask passing for intensive property handling. The change properly extracts and passes the mask from atomic results to the output transformation function, which is essential for correct intensive property calculation when virtual atoms are present.

The static analysis tool suggests using `atomic_ret.get("mask", None)` instead of the conditional, which would be slightly more concise:

    - mask=atomic_ret["mask"] if "mask" in atomic_ret else None,
    + mask=atomic_ret.get("mask", None),

deepmd/jax/model/base_model.py (1)
49-62: LGTM! Correct implementation of intensive property handling with mask support. This change properly addresses the core issue described in the PR objectives. The logic correctly:
- Uses mean reduction for intensive properties instead of sum
- Applies mask-aware averaging to exclude virtual/padding atoms from the denominator
- Handles both masked and unmasked scenarios appropriately
- Preserves existing sum behavior for extensive properties
This ensures that intensive properties like atomic averages are calculated correctly when virtual atoms are present.
The static analysis tool suggests using `atomic_ret.get("mask", None)` instead of the conditional for slightly cleaner code:

    - mask = atomic_ret["mask"] if "mask" in atomic_ret else None
    + mask = atomic_ret.get("mask", None)
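For a jit-compiled backend like JAX, the masked averaging has to avoid data-dependent boolean indexing, whose output shape varies with the number of real atoms. A shape-static variant of the same reduction can be sketched as follows (NumPy used as a stand-in for `jax.numpy`, since the operations are identical; hypothetical function name, not the deepmd-kit source):

```python
import numpy as np  # the same ops exist in jax.numpy


def reduce_intensive_static(v: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Masked mean without boolean indexing, so output shapes stay static.

    v: (nframes, natoms, odim); mask: (nframes, natoms),
    1.0 for real atoms, 0.0 for virtual/padding atoms.
    """
    # zero out the padded atoms, then divide by the count of real atoms
    num = np.sum(v * mask[..., None], axis=1)
    den = np.sum(mask, axis=1, keepdims=True)
    return num / den


v = np.array([[[2.0], [2.0], [0.0], [0.0]]])
mask = np.array([[1.0, 1.0, 0.0, 0.0]])
print(reduce_intensive_static(v, mask))  # [[2.]]
```

Because every intermediate array keeps a fixed shape, this form traces cleanly under `jax.jit`, whereas an expression like `v[ii][bmask[ii]]` would not.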
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (9)
- deepmd/dpmodel/fitting/property_fitting.py (2 hunks)
- deepmd/dpmodel/model/dp_model.py (1 hunks)
- deepmd/dpmodel/model/make_model.py (1 hunks)
- deepmd/dpmodel/model/property_model.py (1 hunks)
- deepmd/dpmodel/model/transform_output.py (3 hunks)
- deepmd/jax/model/base_model.py (1 hunks)
- source/tests/common/dpmodel/test_padding_atoms.py (1 hunks)
- source/tests/jax/test_padding_atoms.py (1 hunks)
- source/tests/pt/test_padding_atoms.py (1 hunks)
🧰 Additional context used
🧠 Learnings (6)
📓 Common learnings
Learnt from: njzjz
PR: deepmodeling/deepmd-kit#4226
File: deepmd/dpmodel/atomic_model/base_atomic_model.py:202-202
Timestamp: 2024-10-16T21:49:57.401Z
Learning: When reviewing PRs, avoid making refactor suggestions that are not directly related to the PR's changes. For example, in `deepmd/dpmodel/atomic_model/base_atomic_model.py`, do not suggest simplifying `for kk in ret_dict.keys()` to `for kk in ret_dict` unless it's relevant to the PR.
Learnt from: njzjz
PR: deepmodeling/deepmd-kit#4144
File: source/api_cc/tests/test_deeppot_dpa_pt.cc:166-246
Timestamp: 2024-10-08T15:32:11.479Z
Learning: Refactoring between test classes `TestInferDeepPotDpaPt` and `TestInferDeepPotDpaPtNopbc` is addressed in PR #3905.
Learnt from: njzjz
PR: deepmodeling/deepmd-kit#4144
File: source/api_cc/tests/test_deeppot_dpa_pt.cc:166-246
Timestamp: 2024-09-19T04:25:12.408Z
Learning: Refactoring between test classes `TestInferDeepPotDpaPt` and `TestInferDeepPotDpaPtNopbc` is addressed in PR #3905.
📚 Learning: when reviewing prs, avoid making refactor suggestions that are not directly related to the pr's chan...
Learnt from: njzjz
PR: deepmodeling/deepmd-kit#4226
File: deepmd/dpmodel/atomic_model/base_atomic_model.py:202-202
Timestamp: 2024-10-16T21:49:57.401Z
Learning: When reviewing PRs, avoid making refactor suggestions that are not directly related to the PR's changes. For example, in `deepmd/dpmodel/atomic_model/base_atomic_model.py`, do not suggest simplifying `for kk in ret_dict.keys()` to `for kk in ret_dict` unless it's relevant to the PR.
Applied to files:
- deepmd/jax/model/base_model.py
- deepmd/dpmodel/model/make_model.py
- deepmd/dpmodel/model/transform_output.py
📚 Learning: refactoring between test classes `testinferdeeppotdpapt` and `testinferdeeppotdpaptnopbc` is address...
Learnt from: njzjz
PR: deepmodeling/deepmd-kit#4144
File: source/api_cc/tests/test_deeppot_dpa_pt.cc:166-246
Timestamp: 2024-10-08T15:32:11.479Z
Learning: Refactoring between test classes `TestInferDeepPotDpaPt` and `TestInferDeepPotDpaPtNopbc` is addressed in PR #3905.
Applied to files:
source/tests/pt/test_padding_atoms.py
📚 Learning: in `deepmd/dpmodel/fitting/general_fitting.py`, when using the array api and `array_api_compat`, the...
Learnt from: njzjz
PR: deepmodeling/deepmd-kit#4204
File: deepmd/dpmodel/fitting/general_fitting.py:426-426
Timestamp: 2024-10-10T22:46:03.419Z
Learning: In `deepmd/dpmodel/fitting/general_fitting.py`, when using the Array API and `array_api_compat`, the `astype` method is not available as an array method. Instead, use `xp.astype()` from the array namespace for type casting.
Applied to files:
deepmd/dpmodel/model/transform_output.py
📚 Learning: in `deepmd/dpmodel/array_api.py`, the `__array_api_version__` attribute is guaranteed by the array a...
Learnt from: njzjz
PR: deepmodeling/deepmd-kit#4406
File: deepmd/dpmodel/array_api.py:51-53
Timestamp: 2024-11-23T00:01:06.984Z
Learning: In `deepmd/dpmodel/array_api.py`, the `__array_api_version__` attribute is guaranteed by the Array API standard to always be present, so error handling for its absence is not required.
Applied to files:
deepmd/dpmodel/model/transform_output.py
📚 Learning: in the array api, `outer` is only available in `xp.linalg`, not in the main namespace `xp`....
Learnt from: njzjz
PR: deepmodeling/deepmd-kit#4160
File: deepmd/dpmodel/utils/nlist.py:292-321
Timestamp: 2024-10-08T15:32:11.479Z
Learning: In the Array API, `outer` is only available in `xp.linalg`, not in the main namespace `xp`.
Applied to files:
deepmd/dpmodel/model/transform_output.py
🧬 Code Graph Analysis (3)
deepmd/dpmodel/model/dp_model.py (1)
deepmd/pt/model/model/dp_model.py (1)
get_fitting_net(50-52)
deepmd/dpmodel/model/property_model.py (5)
- deepmd/pt/infer/deep_eval.py (1): get_var_name (193-200)
- deepmd/infer/deep_eval.py (1): get_var_name (324-326)
- deepmd/pt/model/model/property_model.py (1): get_var_name (82-84)
- deepmd/infer/deep_property.py (1): get_var_name (162-164)
- deepmd/dpmodel/model/dp_model.py (1): get_fitting_net (49-51)
source/tests/pt/test_padding_atoms.py (4)
- deepmd/dpmodel/model/property_model.py (2): PropertyModel (20-31), get_var_name (29-31)
- deepmd/dpmodel/fitting/property_fitting.py (1): PropertyFittingNet (25-154)
- source/tests/common/dpmodel/test_padding_atoms.py (3): TestCaseSingleFrameWithoutNlist (20-46), setUp (21-46), setUp (50-51)
- source/tests/jax/test_padding_atoms.py (3): TestCaseSingleFrameWithoutNlist (38-64), setUp (39-64), setUp (72-73)
🪛 Ruff (0.12.2)
deepmd/jax/model/base_model.py
50-50: Use atomic_ret.get("mask", None) instead of an if block
Replace with atomic_ret.get("mask", None)
(SIM401)
deepmd/dpmodel/model/make_model.py
358-358: Use atomic_ret.get("mask", None) instead of an if block
Replace with atomic_ret.get("mask", None)
(SIM401)
🔇 Additional comments (15)
deepmd/dpmodel/model/dp_model.py (1)
49-51: LGTM! Clean encapsulation of fitting network access. The new method provides appropriate access to the fitting network, which is needed for the intensive property handling improvements outlined in the PR objectives.
deepmd/dpmodel/model/property_model.py (1)
29-31: LGTM! Consistent implementation of property name access. The method correctly leverages the new `get_fitting_net()` method to access the variable name, maintaining consistency with the pattern established in other model implementations.

deepmd/dpmodel/fitting/property_fitting.py (1)

115-127: LGTM! Well-structured output definition for property fitting. The method correctly defines the output variable with appropriate flags, particularly setting `intensive=self.intensive`, which is crucial for the mask-based reduction logic that fixes intensive property prediction with virtual atoms.

deepmd/dpmodel/model/transform_output.py (3)

3-5: LGTM! The Optional import is correctly added to support the new mask parameter.

31-31: LGTM! The optional mask parameter is correctly typed and positioned in the function signature.
46-65: Core fix for intensive property calculation with padding atoms looks correct. The implementation correctly handles the masking logic:
- When mask contains zeros (padding atoms), it computes the mean only over unmasked atoms per batch
- Falls back to standard mean calculation when no mask is provided or no padding atoms exist
- Maintains the existing sum reduction for extensive properties
The fix addresses the PR objective of excluding virtual atoms from the denominator in intensive property calculations.
source/tests/common/dpmodel/test_padding_atoms.py (3)
1-18: LGTM! The imports and module setup are appropriate for testing the dpmodel backend with padding atoms.

20-47: Well-structured test data setup. The TestCaseSingleFrameWithoutNlist class provides a solid foundation with:
- Two frames with 3 atoms each
- Proper coordinate and atom type arrays
- Reasonable cutoff parameters
The setup is consistent with similar test files in other backends.
53-100: Comprehensive test coverage for padding atom consistency. The test correctly:
- Verifies that intensive properties match the mean over atoms (lines 71-75)
- Tests padding invariance with different padding sizes (1, 5, 10 atoms)
- Uses proper padding with atom type -1 and zero coordinates
- Asserts numerical consistency within tight tolerance
This effectively validates the fix for intensive property calculation with virtual atoms.
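The padding scheme used by these tests can be sketched as a small helper (hypothetical function name; the actual tests build the padded arrays inline):

```python
import numpy as np


def pad_atoms(coord: np.ndarray, atype: np.ndarray, npad: int):
    """Append npad virtual atoms: atom type -1 and zero coordinates.

    coord: (nframes, natoms, 3); atype: (nframes, natoms).
    """
    nf = coord.shape[0]
    coord_pad = np.concatenate([coord, np.zeros((nf, npad, 3))], axis=1)
    atype_pad = np.concatenate([atype, np.full((nf, npad), -1)], axis=1)
    return coord_pad, atype_pad


coord = np.zeros((2, 3, 3))                # two frames, three real atoms each
atype = np.array([[0, 0, 1], [0, 1, 1]])
for npad in (1, 5, 10):                    # the padding sizes the tests exercise
    cp, tp = pad_atoms(coord, atype, npad)
    assert cp.shape == (2, 3 + npad, 3) and tp.shape == (2, 3 + npad)
```

The test then asserts that model outputs on the padded inputs match the outputs on the original inputs within tight tolerance.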
source/tests/pt/test_padding_atoms.py (3)
1-24: LGTM! The imports are appropriate for PyTorch backend testing, including proper tensor conversion utilities.

26-53: Consistent test data setup across backends. The TestCaseSingleFrameWithoutNlist class uses identical test data to the common dpmodel version, ensuring consistent testing across backends.

59-109: PyTorch-specific test implementation looks correct. The test properly:
- Converts NumPy arrays to torch tensors using to_torch_tensor
- Uses PyTorch-specific output naming conventions (atom_abc vs abc)
- Handles tensor conversion back to NumPy with .cpu().detach() for assertions
- Follows the same padding test pattern as other backends
The intensive property verification correctly compares the reduced output with the mean of atomic outputs.
source/tests/jax/test_padding_atoms.py (3)
14-31: Proper JAX version compatibility handling. The conditional imports and version checking ensure tests only run on supported Python versions (3.10+), preventing failures on older systems.

34-65: Consistent test setup with proper JAX version guards. The TestCaseSingleFrameWithoutNlist class maintains identical test data while properly handling JAX version requirements with the @unittest.skipIf decorator.

75-124: JAX-specific test implementation is correct. The test properly:
- Uses to_jax_array for tensor conversion
- Follows the same test logic as other backends
- Uses consistent output naming convention (var_name_redu)
- Handles JAX array conversion back to NumPy for assertions
The test coverage is consistent across all three backends (PyTorch, JAX, dpmodel).
Pull Request Overview
This PR fixes a bug in intensive property prediction when using virtual (padding) atoms. The issue occurs when predicting intensive properties where virtual atoms with property values of 0 should not be included in the mean calculation, but were being included in the denominator, leading to incorrect results.
- Fixed intensive property reduction to exclude padding atoms from mean calculations across PyTorch, JAX, and core modules
- Added methods to retrieve property names and fitting network details from models
- Added comprehensive test coverage to verify model consistency when handling padded atomic data
Reviewed Changes
Copilot reviewed 11 out of 11 changed files in this pull request and generated 3 comments.
Show a summary per file
| File | Description |
|---|---|
| source/tests/pt/test_padding_atoms.py | New test file for PyTorch module testing padding atom consistency |
| source/tests/jax/test_padding_atoms.py | New test file for JAX module testing padding atom consistency |
| source/tests/common/dpmodel/test_padding_atoms.py | New test file for core dpmodel testing padding atom consistency |
| deepmd/pt/model/model/transform_output.py | Updated intensive property calculation to exclude padding atoms using mask |
| deepmd/pt/model/model/make_model.py | Pass mask parameter to output transformation |
| deepmd/jax/model/base_model.py | Added intensive property handling with mask for JAX implementation |
| deepmd/dpmodel/model/transform_output.py | Updated core model output transformation with mask support |
| deepmd/dpmodel/model/property_model.py | Added method to get property variable name |
| deepmd/dpmodel/model/make_model.py | Pass mask parameter in forward function |
| deepmd/dpmodel/model/dp_model.py | Added method to get fitting network |
| deepmd/dpmodel/fitting/property_fitting.py | Added output definition method for property fitting |
Actionable comments posted: 2
🧹 Nitpick comments (1)
deepmd/dpmodel/model/transform_output.py (1)
31-32: Document the new `mask` parameter in the function docstring. The signature now includes `mask: Optional[np.ndarray] = None`, but the docstring doesn't mention it. Add a brief description (shape expectations and semantics: 1 for real atoms, 0 for virtual/padding).
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
- deepmd/dpmodel/model/transform_output.py (3 hunks)
- deepmd/jax/model/base_model.py (1 hunks)
- deepmd/pt/model/model/transform_output.py (2 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- deepmd/pt/model/model/transform_output.py
🧰 Additional context used
🪛 Ruff (0.12.2)
deepmd/jax/model/base_model.py
50-50: Use atomic_ret.get("mask", None) instead of an if block
Replace with atomic_ret.get("mask", None)
(SIM401)
fix: get correct intensive property prediction when using virtual atoms (deepmodeling#4869)

When using virtual atoms, the property output of a virtual atom is `0`.

- When predicting energy or other extensive properties, this works well, because the virtual atom property `0` does not contribute to the total energy or other extensive properties.
- However, when predicting intensive properties, there is an error. For example, if a frame has two real atoms and two virtual atoms, the atomic property contributions are [2, 2, 0, 0] (the atomic property of virtual atoms is always 0), so the final property should be `(2+2)/real_atoms = 2`, not `(2+2)/total_atoms = 1`.

This PR solves the bug described above.

Summary by CodeRabbit

- New Features
  - Models now provide accessors to retrieve property names and their fitting network; property fitting nets expose output definitions.
- Bug Fixes
  - Intensive property reduction respects atom masks so padded/dummy atoms are ignored, keeping results invariant to padding.
- Tests
  - Added PyTorch, JAX, and core tests validating consistent behavior with padded atoms.

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
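The arithmetic of the bug described above, as a two-line check:

```python
contrib = [2.0, 2.0, 0.0, 0.0]  # two real atoms, two virtual atoms

n_real = 2
buggy = sum(contrib) / len(contrib)  # virtual atoms inflate the denominator
fixed = sum(contrib) / n_real        # only real atoms are counted
print(buggy, fixed)  # 1.0 2.0
```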