[optimizer] refactor AdamW to use functional API#50411

Closed
wanchaol wants to merge 11 commits into gh/wanchaol/151/base from gh/wanchaol/151/head

Conversation

@wanchaol
Collaborator

@wanchaol wanchaol commented Jan 12, 2021

Stack from ghstack:

Differential Revision: D25932776

@facebook-github-bot
Contributor

facebook-github-bot commented Jan 12, 2021

💊 CI failures summary and remediations

As of commit 78fd5fa (more details on the Dr. CI page):


  • 5/5 failures introduced in this PR

🕵️ 5 new failures recognized by patterns

The following CI failures do not appear to be due to upstream breakages:

See CircleCI build pytorch_linux_bionic_py3_6_clang9_test (1/5)

Step: "Run tests" (full log | diagnosis details | 🔁 rerun)

Jan 12 02:04:03 AssertionError: mypy failed: torch/optim/functional.py:172: error: Argument 1 to "add" of "_TensorBase" has incompatible type "Optional[Tensor]"; expected "Union[Tensor, Union[int, float, bool]]" [arg-type]
Jan 12 02:03:38   test_run_mypy (__main__.TestTypeHints) ... FAIL (61.448s)
Jan 12 02:03:41   test_run_mypy_strict (__main__.TestTypeHints) ... ok (3.080s)
Jan 12 02:04:03   test_type_hint_examples (__main__.TestTypeHints) ... ok (21.234s)
Jan 12 02:04:03 
Jan 12 02:04:03 ======================================================================
Jan 12 02:04:03 FAIL [61.448s]: test_run_mypy (__main__.TestTypeHints)
Jan 12 02:04:03 ----------------------------------------------------------------------
Jan 12 02:04:03 Traceback (most recent call last):
Jan 12 02:04:03   File "test_type_hints.py", line 214, in test_run_mypy
Jan 12 02:04:03     self.fail(f"mypy failed: {stdout} {stderr}")
Jan 12 02:04:03 AssertionError: mypy failed: torch/optim/functional.py:172: error: Argument 1 to "add" of "_TensorBase" has incompatible type "Optional[Tensor]"; expected "Union[Tensor, Union[int, float, bool]]"  [arg-type]
Jan 12 02:04:03 torch/optim/functional.py:174: error: Incompatible types in assignment (expression has type "Optional[Tensor]", variable has type "Tensor")  [assignment]
Jan 12 02:04:03 torch/distributed/optim/functional_adam.py:61: error: Need type annotation for 'state_sums' (hint: "state_sums: List[<type>] = ...")  [var-annotated]
Jan 12 02:04:03 Found 3 errors in 2 files (checked 1189 source files)
Jan 12 02:04:03  
Jan 12 02:04:03 
Jan 12 02:04:03 ----------------------------------------------------------------------
Jan 12 02:04:03 Ran 4 tests in 96.871s
Jan 12 02:04:03 
Jan 12 02:04:03 FAILED (failures=1)
Jan 12 02:04:03 
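The mypy failures above are the two standard flavors: an `Optional[Tensor]` passed where a plain `Tensor` is required ([arg-type]/[assignment]), and an empty list that needs an explicit annotation ([var-annotated]). A minimal stdlib-only sketch of how such errors are typically resolved — hypothetical names, not the actual `torch/optim/functional.py` code:

```python
from typing import List, Optional

def apply_update(param: float, grad: Optional[float]) -> float:
    # mypy rejects arithmetic on `grad` directly: Optional[float] is not float.
    # Narrowing with an assert tells mypy that grad is a float past this point.
    assert grad is not None
    return param - 0.1 * grad

# [var-annotated]: an empty list gives mypy nothing to infer from, so it needs
# an explicit annotation, mirroring the "state_sums: List[<type>] = ..." hint.
state_sums: List[float] = []
state_sums.append(apply_update(1.0, 0.5))
```

Narrowing via `assert` is the usual fix when the caller guarantees the value is never `None`; when `None` is a real possibility, an explicit `if grad is None` branch is needed instead.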

See CircleCI build pytorch_linux_bionic_py3_8_gcc9_coverage_test1 (2/5)

Step: "Run tests" (full log | diagnosis details | 🔁 rerun)

Jan 12 02:12:36 AssertionError: mypy failed: torch/optim/functional.py:172: error: Argument 1 to "add" of "_TensorBase" has incompatible type "Optional[Tensor]"; expected "Union[Tensor, Union[int, float, bool]]" [arg-type]
Jan 12 02:12:11   test_type_hint_examples (__main__.TestTypeHints)
Jan 12 02:12:36 Runs mypy over all the test examples present in ... ok (24.544s)
Jan 12 02:12:36 
Jan 12 02:12:36 ======================================================================
Jan 12 02:12:36 FAIL [83.121s]: test_run_mypy (__main__.TestTypeHints)
Jan 12 02:12:36 Runs mypy over all files specified in mypy.ini
Jan 12 02:12:36 ----------------------------------------------------------------------
Jan 12 02:12:36 Traceback (most recent call last):
Jan 12 02:12:36   File "test_type_hints.py", line 214, in test_run_mypy
Jan 12 02:12:36     self.fail(f"mypy failed: {stdout} {stderr}")
Jan 12 02:12:36 AssertionError: mypy failed: torch/optim/functional.py:172: error: Argument 1 to "add" of "_TensorBase" has incompatible type "Optional[Tensor]"; expected "Union[Tensor, Union[int, float, bool]]"  [arg-type]
Jan 12 02:12:36 torch/optim/functional.py:174: error: Incompatible types in assignment (expression has type "Optional[Tensor]", variable has type "Tensor")  [assignment]
Jan 12 02:12:36 torch/distributed/optim/functional_adam.py:61: error: Need type annotation for 'state_sums' (hint: "state_sums: List[<type>] = ...")  [var-annotated]
Jan 12 02:12:36 Found 3 errors in 2 files (checked 1189 source files)
Jan 12 02:12:36  
Jan 12 02:12:36 
Jan 12 02:12:36 ----------------------------------------------------------------------
Jan 12 02:12:36 Ran 4 tests in 126.389s
Jan 12 02:12:36 
Jan 12 02:12:36 FAILED (failures=1)
Jan 12 02:12:36 

See CircleCI build pytorch_linux_xenial_py3_clang5_asan_test1 (3/5)

Step: "Run tests" (full log | diagnosis details | 🔁 rerun)

Jan 12 02:08:52 AssertionError: mypy failed: torch/optim/functional.py:172: error: Argument 1 to "add" of "_TensorBase" has incompatible type "Optional[Tensor]"; expected "Union[Tensor, Union[int, float, bool]]" [arg-type]
Jan 12 02:08:25   test_run_mypy (__main__.TestTypeHints) ... FAIL (68.991s)
Jan 12 02:08:29   test_run_mypy_strict (__main__.TestTypeHints) ... ok (3.527s)
Jan 12 02:08:52   test_type_hint_examples (__main__.TestTypeHints) ... ok (23.860s)
Jan 12 02:08:52 
Jan 12 02:08:52 ======================================================================
Jan 12 02:08:52 FAIL [68.991s]: test_run_mypy (__main__.TestTypeHints)
Jan 12 02:08:52 ----------------------------------------------------------------------
Jan 12 02:08:52 Traceback (most recent call last):
Jan 12 02:08:52   File "test_type_hints.py", line 214, in test_run_mypy
Jan 12 02:08:52     self.fail(f"mypy failed: {stdout} {stderr}")
Jan 12 02:08:52 AssertionError: mypy failed: torch/optim/functional.py:172: error: Argument 1 to "add" of "_TensorBase" has incompatible type "Optional[Tensor]"; expected "Union[Tensor, Union[int, float, bool]]"  [arg-type]
Jan 12 02:08:52 torch/optim/functional.py:174: error: Incompatible types in assignment (expression has type "Optional[Tensor]", variable has type "Tensor")  [assignment]
Jan 12 02:08:52 torch/distributed/optim/functional_adam.py:61: error: Need type annotation for 'state_sums' (hint: "state_sums: List[<type>] = ...")  [var-annotated]
Jan 12 02:08:52 Found 3 errors in 2 files (checked 1189 source files)
Jan 12 02:08:52  
Jan 12 02:08:52 
Jan 12 02:08:52 ----------------------------------------------------------------------
Jan 12 02:08:52 Ran 4 tests in 109.169s
Jan 12 02:08:52 
Jan 12 02:08:52 FAILED (failures=1)
Jan 12 02:08:52 

See CircleCI build pytorch_windows_vs2019_py36_cuda10.1_test2 (4/5)

Step: "Test" (full log | diagnosis details | 🔁 rerun)

AssertionError: False is not true : Tensors failed to compare as equal!With rtol=1e-07 and atol=1e-07, found 6 element(s) (out of 6) whose difference(s) exceeded the margin of error (including 0 nan comparisons). The greatest difference was 0.14733898893893732 (0.020583703180005117 vs. 0.16792269211894245), which occurred at index (2, 1).
======================================================================
FAIL [0.291s]: test_multi_tensor_optimizers (__main__.TestOptim)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "C:\Users\circleci\project\build\win_tmp\build\torch\testing\_internal\common_utils.py", line 423, in wrapper
    fn(*args, **kwargs)
  File "test_optim.py", line 380, in test_multi_tensor_optimizers
    self.assertEqual(p1, p2)
  File "C:\Users\circleci\project\build\win_tmp\build\torch\testing\_internal\common_utils.py", line 1180, in assertEqual
    super().assertTrue(result, msg=self._get_assert_msg(msg, debug_msg=debug_msg))
AssertionError: False is not true : Tensors failed to compare as equal!With rtol=1e-07 and atol=1e-07, found 6 element(s) (out of 6) whose difference(s) exceeded the margin of error (including 0 nan comparisons). The greatest difference was 0.14733898893893732 (0.020583703180005117 vs. 0.16792269211894245), which occurred at index (2, 1).

----------------------------------------------------------------------
Ran 104 tests in 59.757s

FAILED (failures=1)

Generating XML reports...
Generated XML report: test-reports\dist-gloo\TEST-TestLRScheduler-20210112032935.xml
Generated XML report: test-reports\dist-gloo\TEST-TestOptim-20210112032935.xml
Generated XML report: test-reports\dist-gloo\TEST-TestSWAUtils-20210112032935.xml
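The Windows failure above is a numeric mismatch, not a type error: `assertEqual` compares tensors elementwise under an rtol/atol margin. A stdlib-only sketch of that tolerance rule (the same shape of check as `torch.allclose`; the scalar form here is an illustration, not the actual test-harness code):

```python
def within_margin(a: float, b: float, rtol: float = 1e-07, atol: float = 1e-07) -> bool:
    # Elementwise tolerance rule: a matches b when |a - b| <= atol + rtol * |b|.
    return abs(a - b) <= atol + rtol * abs(b)

# The greatest difference reported in the log (~0.147) vastly exceeds the
# margin of ~1.2e-07, so the comparison fails.
print(within_margin(0.020583703180005117, 0.16792269211894245))  # False
```

A gap this large (0.147 against a 1e-07 margin) indicates the multi-tensor and single-tensor AdamW paths genuinely diverged, rather than ordinary floating-point noise.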

See CircleCI build pytorch_linux_xenial_cuda10_2_cudnn7_py3_gcc7_test1 (5/5)

Step: "Run tests" (full log | diagnosis details | 🔁 rerun)

Jan 12 02:24:32 AssertionError: mypy failed: torch/optim/functional.py:172: error: Argument 1 to "add" of "_TensorBase" has incompatible type "Optional[Tensor]"; expected "Union[Tensor, Union[int, float, bool]]" [arg-type]
Jan 12 02:24:11   test_run_mypy (__main__.TestTypeHints) ... FAIL (62.749s)
Jan 12 02:24:14   test_run_mypy_strict (__main__.TestTypeHints) ... ok (2.933s)
Jan 12 02:24:32   test_type_hint_examples (__main__.TestTypeHints) ... ok (18.634s)
Jan 12 02:24:32 
Jan 12 02:24:32 ======================================================================
Jan 12 02:24:32 FAIL [62.749s]: test_run_mypy (__main__.TestTypeHints)
Jan 12 02:24:32 ----------------------------------------------------------------------
Jan 12 02:24:32 Traceback (most recent call last):
Jan 12 02:24:32   File "test_type_hints.py", line 214, in test_run_mypy
Jan 12 02:24:32     self.fail(f"mypy failed: {stdout} {stderr}")
Jan 12 02:24:32 AssertionError: mypy failed: torch/optim/functional.py:172: error: Argument 1 to "add" of "_TensorBase" has incompatible type "Optional[Tensor]"; expected "Union[Tensor, Union[int, float, bool]]"  [arg-type]
Jan 12 02:24:32 torch/optim/functional.py:174: error: Incompatible types in assignment (expression has type "Optional[Tensor]", variable has type "Tensor")  [assignment]
Jan 12 02:24:32 torch/distributed/optim/functional_adam.py:61: error: Need type annotation for 'state_sums' (hint: "state_sums: List[<type>] = ...")  [var-annotated]
Jan 12 02:24:32 Found 3 errors in 2 files (checked 1189 source files)
Jan 12 02:24:32  
Jan 12 02:24:32 
Jan 12 02:24:32 ----------------------------------------------------------------------
Jan 12 02:24:32 Ran 4 tests in 95.497s
Jan 12 02:24:32 
Jan 12 02:24:32 FAILED (failures=1)
Jan 12 02:24:32 

This comment was automatically generated by Dr. CI. Follow this link to opt out of these comments for your Pull Requests.

Please report bugs/suggestions to the (internal) Dr. CI Users group.

This comment has been revised 8 times.

wanchaol added a commit that referenced this pull request Jan 12, 2021
ghstack-source-id: 3e1bd7b
Pull Request resolved: #50411
wanchaol added a commit that referenced this pull request Jan 15, 2021
ghstack-source-id: f2a394d
Pull Request resolved: #50411
wanchaol added a commit that referenced this pull request Jan 15, 2021
ghstack-source-id: edf1417
Pull Request resolved: #50411
@codecov

codecov Bot commented Jan 21, 2021

Codecov Report

Merging #50411 (6cc429d) into gh/wanchaol/151/base (044617a) will increase coverage by 0.00%.
The diff coverage is 100.00%.

@@                  Coverage Diff                  @@
##           gh/wanchaol/151/base   #50411   +/-   ##
=====================================================
  Coverage                 81.02%   81.02%           
=====================================================
  Files                      1916     1916           
  Lines                    209328   209345   +17     
=====================================================
+ Hits                     169600   169618   +18     
+ Misses                    39728    39727    -1     

@facebook-github-bot
Contributor

@wanchaol merged this pull request in df96344.

@facebook-github-bot facebook-github-bot deleted the gh/wanchaol/151/head branch January 25, 2021 15:19
wanchaol pushed a commit to wanchaol/pytorch that referenced this pull request Mar 3, 2021
Summary: This fixes the bug introduced during the optimizer refactoring in pytorch#50411. When all parameters have no grads, we should still allow `beta`-like hyperparameters to be defined.

Reviewed By: ngimel

Differential Revision: D26699827

fbshipit-source-id: fbe58186ae17e131abf6cdaf6eabbe32e351fe94
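The follow-up fix (D26699827) addresses the case where every parameter lacks a gradient: `beta`-like hyperparameters must still be recorded rather than only being set inside the per-gradient loop. A stdlib-only sketch of that guard under assumed structure — not the actual `functional_adam` code:

```python
from typing import Dict, List, Optional

def step(params: List[float],
         grads: List[Optional[float]],
         state: Dict[str, float],
         beta: float = 0.9,
         lr: float = 0.1) -> List[float]:
    # Record beta-like hyperparameters up front, even when every grad is None;
    # the pre-fix code only populated them while iterating over actual grads,
    # so an all-None step left them undefined.
    state.setdefault("beta", beta)
    out = list(params)
    for i, g in enumerate(grads):
        if g is None:
            continue  # parameters without gradients are skipped, state is not
        out[i] = params[i] - lr * g
    return out

state: Dict[str, float] = {}
step([1.0, 2.0], [None, None], state)  # state["beta"] is defined regardless
```

Hoisting hyperparameter capture out of the per-gradient loop keeps optimizer state consistent whether or not any parameter received a gradient on a given step.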
facebook-github-bot pushed a commit that referenced this pull request Mar 3, 2021
Summary:
Pull Request resolved: #52944

This fixes the bug introduced during the optimizer refactoring in #50411. When all parameters have no grads, we should still allow `beta`-like hyperparameters to be defined.

Reviewed By: ngimel

Differential Revision: D26699827

fbshipit-source-id: 8a7074127704c7a4a1fbc17d48a81e23a649f280
aocsa pushed a commit to Quansight/pytorch that referenced this pull request Mar 15, 2021
Summary:
Pull Request resolved: pytorch#52944

This fixes the bug introduced during the optimizer refactoring in pytorch#50411. When all parameters have no grads, we should still allow `beta`-like hyperparameters to be defined.

Reviewed By: ngimel

Differential Revision: D26699827

fbshipit-source-id: 8a7074127704c7a4a1fbc17d48a81e23a649f280
xsacha pushed a commit to xsacha/pytorch that referenced this pull request Mar 31, 2021
Summary:
Pull Request resolved: pytorch#52944

This fixes the bug introduced during the optimizer refactoring in pytorch#50411. When all parameters have no grads, we should still allow `beta`-like hyperparameters to be defined.

Reviewed By: ngimel

Differential Revision: D26699827

fbshipit-source-id: 8a7074127704c7a4a1fbc17d48a81e23a649f280
jasperzhong pushed a commit to jasperzhong/swift that referenced this pull request Nov 25, 2021
laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request Apr 24, 2026
Summary: Pull Request resolved: pytorch#50411

Test Plan: Imported from OSS

Reviewed By: izdeby

Differential Revision: D25932776

Pulled By: wanchaol

fbshipit-source-id: e8e1696b3390ba7909b36fd0107c58b892520432
laurentdupin pushed a commit to laurentdupin/pytorch that referenced this pull request Apr 24, 2026
Summary:
Pull Request resolved: pytorch#52944

This fixes the bug introduced during the optimizer refactoring in pytorch#50411. When all parameters have no grads, we should still allow `beta`-like hyperparameters to be defined.

Reviewed By: ngimel

Differential Revision: D26699827

fbshipit-source-id: 8a7074127704c7a4a1fbc17d48a81e23a649f280
