Revert D45387167: Multisect successfully blamed D45387167 for test or build failures #100424
s4ayub wants to merge 1 commit into pytorch:main from s4ayub:export-D45448312
Conversation
Summary: This diff is reverting D45387167.

D45387167: Basic dynamo support for traceable collectives (#94440) by wconstab has been identified as causing the following test or build failures:

Tests affected:
- [hpc/torchrec/models/examples/gpu_tests:sparsenn_predictor_test - test_predict (hpc.torchrec.models.examples.gpu_tests.sparsenn_predictor_test.SparseNNPredictorTest)](https://www.internalfb.com/intern/test/844425007919303/)

Here's the Multisect link: https://www.internalfb.com/multisect/1966835
Here are the tasks that are relevant to this breakage:

We're generating a revert to back out the changes in this diff; please note the backout may land if someone accepts it. If you believe this diff has been generated in error, you may Commandeer and Abandon it.

Test Plan: NA

Reviewed By: s4ayub

Differential Revision: D45448312

fbshipit-source-id: d6bbb702db041e6f9a301e2721b65abec4436db2
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/100424
Note: Links to docs will display an error until the docs builds have been completed.
❗ 1 Merge Blocking SEV
There is 1 active merge blocking SEV. Please view it below:
If you must merge, use
❌ 3 New Failures, 1 Unrelated Failure
As of commit 6e0db1d:
NEW FAILURES - The following jobs have failed:
BROKEN TRUNK - The following job failed but was present on the merge base e88e92e:
👉 Rebase onto the `viable/strict` branch to avoid these failures
This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D45448312
Any chance we can do a forward fix? I think there is a try/except with a check for deploy that would work (see the sketches below). Also, we need to get OSS CI coverage for this case. What will that take? Do we need to just run something with functional collectives inside torchdeploy?
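A minimal sketch of the shape such a guard might take, as an alternative to a full revert. The helper `torch._running_with_deploy()` does exist in PyTorch, but its use here, the choice of `torch.distributed._functional_collectives` as the module being guarded, and the overall structure are assumptions for illustration — this is not the actual patch:

```python
# Hypothetical forward-fix sketch (not the actual fix that was landed):
# tolerate the traceable-collectives import failing under torch::deploy
# instead of reverting the feature.
import torch

funcol = None
if not torch._running_with_deploy():
    try:
        # Dynamo-facing functional collectives; this import chain is what
        # reportedly broke inside torch::deploy.
        import torch.distributed._functional_collectives as funcol
    except ImportError:
        pass  # degrade gracefully rather than crash the embedded interpreter
```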
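And a hedged sketch of the kind of OSS coverage the comment asks about: a script that exercises a functional collective end to end. A real regression test would run these calls inside torchdeploy; the single-rank gloo world, the hard-coded address/port, and the tensor values here are illustrative assumptions:

```python
import os
import torch
import torch.distributed as dist
import torch.distributed._functional_collectives as funcol

# Single-process world so the sketch runs without a launcher; a real CI
# test for this breakage would execute the same calls under torch::deploy.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("gloo", rank=0, world_size=1)

t = torch.ones(4)
# Functional collectives return a new tensor instead of mutating the
# input in place, which is what makes them traceable by dynamo.
out = funcol.all_reduce(t, "sum", dist.group.WORLD)
print(out)  # tensor([1., 1., 1., 1.]) with world_size=1

dist.destroy_process_group()
```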
@pytorchbot merge -f 'Landed internally'

(Initiating merge automatically since Phabricator Diff has merged, using force because this PR might not pass merge_rules.json but landed internally)
❌ 🤖 pytorchbot command failed: Try
@pytorchbot merge -f 'Landed internally'

Merge started
Your change will be merged immediately since you used the force (-f) flag, bypassing any CI checks (ETA: 1-5 minutes). Learn more about merging in the wiki. Questions? Feedback? Please reach out to the PyTorch DevX Team
cc @soumith @voznesenskym @penguinwu @anijain2305 @EikanWang @jgong5 @Guobing-Chen @XiaobingSuper @zhuhaozhe @blzheng @Xia-Weiwen @wenzhe-nrv @jiayisunx @desertfire