Add 1-layer gradient accumulation test to check aliasing#7692

Merged
JackCaoG merged 3 commits into master from jeffhataws_add_gradaccum_test on Aug 1, 2024

Conversation

@jeffhataws
Collaborator

Add 1-layer gradient accumulation test to check aliasing for #7174.

Currently the test is skipped. Once #7174 is fixed, we can unskip it.

If it is unskipped now, you will get:

======================================================================
FAIL: test_grad_accum (__main__.InputOutputAliasesTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/ubuntu/pytorch/xla/test/test_input_output_aliases.py", line 125, in test_grad_accum
    assert (alias_count == 1.0), f"Expect 1 input-output alias pair for gradient accumulation, got {alias_count}"
AssertionError: Expect 1 input-output alias pair for gradient accumulation, got 0.0
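The aliasing the test expects comes from the in-place gradient accumulation pattern: each backward pass adds into the same `param.grad` buffer, so the XLA compiler should be able to alias that buffer between graph input and output. As a hedged illustration only, here is a minimal plain-PyTorch sketch of 1-layer gradient accumulation (the model, shapes, and hyperparameters are hypothetical, and this omits the torch_xla metric checks the actual test performs):

```python
import torch

# Hypothetical 1-layer model and optimizer for illustration.
model = torch.nn.Linear(4, 2)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
accum_steps = 4  # number of micro-batches per optimizer step

opt.zero_grad()
for _ in range(accum_steps):
    x = torch.randn(8, 4)
    # Scale the loss so the accumulated gradient is an average.
    loss = model(x).sum() / accum_steps
    # backward() accumulates into param.grad in place; under XLA this
    # in-place update is what should produce an input-output alias.
    loss.backward()
opt.step()
```

Under lazy-tensor execution, the accumulated `param.grad` buffer is both an input to and an output of the compiled graph, which is why the test asserts exactly one alias pair.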

@jeffhataws jeffhataws requested a review from JackCaoG July 18, 2024 04:39
@JackCaoG
Collaborator

Thanks Jeff, I think it is better to fix the issue first and then merge this PR.

Comment on lines +127 to +132
assert (
graph_count == 2
), f"Expect 2 graphs for gradient accumulation test, got {graph_count}"
assert (
alias_count == 1.0
), f"Expect 1 input-output alias pair for gradient accumulation, got {alias_count}"

nit: can we use self.assertEqual? You can fix that in a follow-up PR.

@JackCaoG JackCaoG merged commit a5520e5 into master Aug 1, 2024
@JackCaoG JackCaoG deleted the jeffhataws_add_gradaccum_test branch August 1, 2024 23:03