
[Gradient Compression] Typo fixes in PowerSGD #50974

Closed
wayi1 wants to merge 3 commits into gh/SciPioneer/46/base from gh/SciPioneer/46/head

Conversation

Contributor

@wayi1 wayi1 commented Jan 23, 2021

Stack from ghstack:

Typo fixes.

Original PR issue: Investigate Applying PowerSGD to Communication Hook for Gradient Compression #47202

Differential Revision: [D26031679](https://our.internmc.facebook.com/intern/diff/D26031679/)
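For readers landing here from the linked issue: PowerSGD is exposed as a DDP communication hook. The sketch below is illustrative only, not code from this PR; `register_powersgd` is a hypothetical helper, while `PowerSGDState` and `powerSGD_hook` follow `torch.distributed.algorithms.ddp_comm_hooks.powerSGD_hook` (exact defaults may vary across PyTorch versions):

```python
# Sketch: attaching the PowerSGD gradient-compression hook to a DDP model.
# Assumes torch.distributed is initialized and `ddp_model` is already wrapped
# in DistributedDataParallel; `register_powersgd` is a hypothetical helper.
import torch.distributed.algorithms.ddp_comm_hooks.powerSGD_hook as powerSGD


def register_powersgd(ddp_model, rank=1, warmup_iters=1000):
    """Compress gradients with a rank-`rank` PowerSGD approximation."""
    state = powerSGD.PowerSGDState(
        process_group=None,                # use the default process group
        matrix_approximation_rank=rank,    # low-rank factorization rank
        start_powerSGD_iter=warmup_iters,  # vanilla allreduce during warm-up
    )
    ddp_model.register_comm_hook(state, powerSGD.powerSGD_hook)
    return state
```

A higher `matrix_approximation_rank` trades compression ratio for gradient fidelity, and the warm-up period lets training stabilize on uncompressed allreduce before compression kicks in.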

Contributor

facebook-github-bot commented Jan 23, 2021

💊 CI failures summary and remediations

As of commit f9a7b45 (more details on the Dr. CI page):


  • 2/3 failures possibly* introduced in this PR
    • 2/2 non-CircleCI failure(s)
  • 1/3 broken upstream at merge base 61f32b6 on Jan 22 from 1:57pm to 5:49pm

1 job timed out:

  • pytorch_windows_vs2019_py36_cuda10.1_test2

🚧 1 fixed upstream failure:

These were probably caused by upstream breakages that were already fixed.

Please rebase on the viable/strict branch.

If your commit is older than viable/strict, run these commands:

git fetch https://github.com/pytorch/pytorch viable/strict
git rebase FETCH_HEAD

Check out the recency history of this "viable master" tracking branch.


ci.pytorch.org: 1 failed


This comment was automatically generated by Dr. CI.

Please report bugs/suggestions to the (internal) Dr. CI Users group.

facebook-github-bot added the cla signed and oncall: distributed (Add this issue/PR to distributed oncall triage queue) labels on Jan 23, 2021
wayi1 pushed a commit that referenced this pull request Jan 23, 2021
Typo fixes.

Original PR issue: Investigate Applying PowerSGD to Communication Hook for Gradient Compression #47202

Differential Revision: [D26031679](https://our.internmc.facebook.com/intern/diff/D26031679/)

ghstack-source-id: 120246319
Pull Request resolved: #50974
wayi1 pushed a commit that referenced this pull request Jan 23, 2021
Pull Request resolved: #50974

Typo fixes.

Original PR issue: Investigate Applying PowerSGD to Communication Hook for Gradient Compression #47202
ghstack-source-id: 120257221

Differential Revision: [D26031679](https://our.internmc.facebook.com/intern/diff/D26031679/)
@facebook-github-bot
Contributor

This pull request has been merged in 9f19843.

@facebook-github-bot facebook-github-bot deleted the gh/SciPioneer/46/head branch January 29, 2021 15:21
laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request Apr 24, 2026
Summary:
Pull Request resolved: pytorch#50974

Typo fixes.

Original PR issue: Investigate Applying PowerSGD to Communication Hook for Gradient Compression pytorch#47202
ghstack-source-id: 120257221

Test Plan: N/A

Reviewed By: rohan-varma

Differential Revision: D26031679

fbshipit-source-id: 9d049b50419a3e40e53f7f1275a441e31b87717b

Labels

cla signed, Merged, oncall: distributed (Add this issue/PR to distributed oncall triage queue)


3 participants