Continue support sharding pipes in tud.datapipes.iter.grouping as deprecated #94527
wenleix wants to merge 1 commit into pytorch:master

Conversation
🔗 Helpful Links: 🧪 See artifacts and rendered test results at hud.pytorch.org/pr/94527
Note: Links to docs will display an error until the docs builds have been completed.
❗ 2 Active SEVs: There are 2 currently active SEVs. If your PR is affected, please view them below.
✅ No Failures as of commit 3a62fd5.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D43161015
Force-pushed 2888d96 to 975ca2e (compare)
Can we add one more comment saying it's going to be removed in PyTorch 2.1?
nit: I notice that TorchData imports from torch.utils.data.datapipes.iter for ShardingFilter and torch.utils.data.datapipes.iter.sharding for SHARDING_PRIORITIES.
Should we have separate comments for each name?
I notice that TorchData imports from torch.utils.data.datapipes.iter for ShardingFilter and torch.utils.data.datapipes.iter.sharding for SHARDING_PRIORITIES.
Good point... I think tudd.iter.ShardingFilter is preferred over tudd.iter.sharding.ShardingFilterIterDataPipe.
But if a user has already been using tudd.iter.grouping.ShardingFilterIterDataPipe, perhaps the immediate rerouting should be to tudd.iter.sharding.ShardingFilterIterDataPipe (and we should have a separate guide to route users from tudd.iter.sharding.ShardingFilterIterDataPipe to tudd.iter.ShardingFilter).
I think we should do two warnings here because users probably won't be happy to see another warning after changing the code to the proposed fix.
Do we have a guideline on this? tudd.iter.sharding.ShardingFilterIterDataPipe is also a public API. While we encourage/promote using tudd.iter.ShardingFilter, both should be supported.
If we don't want to support tudd.iter.sharding.ShardingFilterIterDataPipe (and users should always use the tud.iter version), shall we make it explicitly private, like tudd.iter.sharding._ShardingFilterIterDataPipe? (A similar question applies to other DataPipes under tudd.iter.XXX.)
test/test_datapipe.py (Outdated)
Should we add an assertWarns check at import?
Also, the other tests can be removed, as they just reference the right classes.
And, other tests can be removed as they just reference the right classes.

Discussed offline: we'll keep them as smoke integration tests~
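The smoke-test idea discussed above could be sketched roughly like this. This is an illustrative, self-contained mock, not the actual test from the PR: the throwaway module built with `types.ModuleType` stands in for `torch.utils.data.datapipes.iter.grouping`, and the class name and warning text are hypothetical.

```python
import types
import unittest
import warnings


class NewShardingFilter:
    """Stand-in for the class at its new location (iter.sharding)."""


def make_deprecated_module():
    # Build a throwaway module that serves the old name via a
    # module-level __getattr__ (PEP 562), emitting a DeprecationWarning.
    mod = types.ModuleType("old_grouping")

    def __getattr__(name):
        if name == "ShardingFilterIterDataPipe":
            warnings.warn(
                "ShardingFilterIterDataPipe has moved; update your import.",
                DeprecationWarning,
                stacklevel=2,
            )
            return NewShardingFilter
        raise AttributeError(name)

    mod.__getattr__ = __getattr__
    return mod


class TestDeprecatedImport(unittest.TestCase):
    def test_warns_and_aliases(self):
        mod = make_deprecated_module()
        # Accessing the old name both warns and resolves to the new class.
        with self.assertWarns(DeprecationWarning):
            old = mod.ShardingFilterIterDataPipe
        self.assertIs(old, NewShardingFilter)
```

One such test per moved name doubles as the smoke integration test mentioned above: it pins both the warning and the aliasing behavior.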
Force-pushed 975ca2e to e1e8115 (compare)

This pull request was exported from Phabricator. Differential Revision: D43161015
NivekT left a comment
Didn't realize __getattr__ worked for modules. LGTM after resolving the comments!
nit: I notice that TorchData imports from torch.utils.data.datapipes.iter for ShardingFilter and torch.utils.data.datapipes.iter.sharding for SHARDING_PRIORITIES.
Should we have separate comments for each name?
PEP-562: https://peps.python.org/pep-0562/ ;)
Force-pushed e1e8115 to f59207a (compare)

This pull request was exported from Phabricator. Differential Revision: D43161015
Force-pushed f59207a to 1320492 (compare)

This pull request was exported from Phabricator. Differential Revision: D43161015
Force-pushed 1320492 to b7083bd (compare)

This pull request was exported from Phabricator. Differential Revision: D43161015
Force-pushed b7083bd to fc9c477 (compare)

This pull request was exported from Phabricator. Differential Revision: D43161015
@pytorchbot merge

Merge started: Your change will be merged once all checks pass (ETA 0-4 hours). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.

Merge failed. Reason: 1 mandatory check(s) failed. Dig deeper by viewing the failures on hud. Details for Dev Infra team: raised by workflow job.
Force-pushed fc9c477 to e8cd848 (compare)

This pull request was exported from Phabricator. Differential Revision: D43161015
Force-pushed e8cd848 to 1d5264c (compare)

This pull request was exported from Phabricator. Differential Revision: D43161015
Continue support sharding pipes in tud.datapipes.iter.grouping as deprecated (pytorch#94527)

Summary: Pull Request resolved: pytorch#94527. pytorch#94095 moves this into `tud.datapipes.iter.sharding`. However, since this was previously a public API, that is a BC-breaking change. As discussed in meta-pytorch/data#987 (comment), we will keep backward-compatible support, but with a deprecation warning.

Test Plan: `buck test mode/opt //caffe2/test:datapipe`

Reviewed By: ejguan
Differential Revision: D43161015
fbshipit-source-id: 82049979981b83aefd247843fbc2c004a7e7a90a
This pull request was exported from Phabricator. Differential Revision: D43161015
Force-pushed 1d5264c to 3a62fd5 (compare)

@pytorchbot merge -f 'Landed internally'
(Initiating merge automatically since Phabricator Diff has merged, using force because this PR might not pass merge_rules.json but landed internally)
Merge started: Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team.
Summary:
#94095 moves this into `tud.datapipes.iter.sharding`. However, since this was previously a public API, this is a BC-breaking change. As discussed in meta-pytorch/data#987 (comment), we will keep backward-compatible support, but with a deprecation warning.
Differential Revision: D43161015
Deprecation: Please import `SHARDING_PRIORITIES` from `torch.utils.data.datapipes.iter.sharding` instead of `torch.utils.data.datapipes.iter.grouping`.
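To make the notice concrete, here is a hedged before/after sketch. The torch import lines are commented out so the snippet runs without PyTorch installed, and the enum below is an illustrative stand-in for the general shape of `SHARDING_PRIORITIES`, not its actual definition.

```python
from enum import IntEnum

# Deprecated after this change (emits a DeprecationWarning):
# from torch.utils.data.datapipes.iter.grouping import SHARDING_PRIORITIES

# Preferred:
# from torch.utils.data.datapipes.iter.sharding import SHARDING_PRIORITIES

# Illustrative stand-in so the snippet is self-contained; the real enum's
# members and values may differ.
class SHARDING_PRIORITIES(IntEnum):
    DEFAULT = 1
    DISTRIBUTED = 2
    MULTIPROCESSING = 3
```

Either import path resolves to the same object after this PR; only the `grouping` path warns.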