
Use torch.set_default_dtype in test_data_parallel and rename dtype2prec #32962

Closed
pritamdamania87 wants to merge 2 commits into gh/pritamdamania87/94/base from gh/pritamdamania87/94/head

Conversation


@pritamdamania87 pritamdamania87 commented Feb 4, 2020

Stack from ghstack:


Per @gchanan's comments on #30445, this change uses `torch.set_default_dtype` in `test_data_parallel` instead of specifying `dtype=torch.double` everywhere, and renames `dtype2prec` to `dtype2prec_DONTUSE`.

Differential Revision: [D19714374](https://our.internmc.facebook.com/intern/diff/D19714374/)
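A minimal sketch of the pattern this PR adopts: set the default dtype once per test case instead of threading `dtype=torch.double` through every tensor factory call. The test-case name and `setUp`/`tearDown` structure here are illustrative, not the actual `test_data_parallel` code.

```python
import unittest
import torch

class DataParallelDtypeTest(unittest.TestCase):
    """Illustrative only: configure the default dtype for the whole
    test case rather than passing dtype=torch.double everywhere."""

    def setUp(self):
        # Remember the previous default so tearDown can restore it,
        # keeping the setting from leaking into other tests.
        self._prev_dtype = torch.get_default_dtype()
        torch.set_default_dtype(torch.double)

    def tearDown(self):
        torch.set_default_dtype(self._prev_dtype)

    def test_default_dtype_applies(self):
        # No explicit dtype= needed; factories now produce float64.
        x = torch.randn(2, 3)
        self.assertEqual(x.dtype, torch.float64)
```

Restoring the previous default in `tearDown` matters because `torch.set_default_dtype` is process-global state.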

[ghstack-poisoned]
pritamdamania87 pushed a commit that referenced this pull request Feb 4, 2020
As per @gchanan's comments on
#30445, I've used
`torch.set_default_dtype` in test_data_parallel instead of specifying
dtype=torch.double everywhere. Also, renamed dtype2prec to dtype2prec_DONTUSE

Differential Revision: [D19714374](https://our.internmc.facebook.com/intern/diff/D19714374/)

ghstack-source-id: 97676959
Pull Request resolved: #32962

dtype2prec = {torch.float: 1e-5,
# Using @precisionOverride is the recommended way instead of this.
dtype2prec_DONTUSE = {torch.float: 1e-5,

can you fix the indent?



dtype2prec = {torch.float: 1e-5,
# Using @precisionOverride is the recommended way instead of this.

maybe add a bit more context: using `precisionOverride` specific to your test is the recommended way of doing this. These are just some values that worked for test_nn.
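PyTorch's actual `precisionOverride` lives in its internal test utilities; the following is a hypothetical minimal re-implementation, only to show the idea behind the reviewer's suggestion: attach per-dtype tolerances to a single test, so precision is scoped to that test rather than read from a global `dtype2prec`-style table.

```python
import functools

def precision_override(overrides):
    """Hypothetical decorator: `overrides` maps a dtype name to the
    tolerance to use for this one test only."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            return fn(*args, **kwargs)
        # Stash the per-test tolerances on the function object.
        wrapper.precision_overrides = dict(overrides)
        return wrapper
    return decorator

# Global fallback table, analogous in spirit to dtype2prec_DONTUSE.
DEFAULT_PRECISION = {"float32": 1e-5, "float64": 1e-8}

def precision_for(test_fn, dtype_name):
    # A per-test override wins over the global default table.
    overrides = getattr(test_fn, "precision_overrides", {})
    return overrides.get(dtype_name, DEFAULT_PRECISION[dtype_name])

@precision_override({"float32": 1e-3})
def test_fuzzy_op():
    pass
```

With this sketch, `precision_for(test_fuzzy_op, "float32")` yields the per-test `1e-3`, while `"float64"` falls back to the global default `1e-8`.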


@gchanan gchanan left a comment


address the comments then looks good to go.

…me dtype2prec"

As per @gchanan's comments on
#30445, I've used
`torch.set_default_dtype` in test_data_parallel instead of specifying
dtype=torch.double everywhere. Also, renamed dtype2prec to dtype2prec_DONTUSE

Differential Revision: [D19714374](https://our.internmc.facebook.com/intern/diff/D19714374/)

[ghstack-poisoned]
pritamdamania87 pushed a commit that referenced this pull request Feb 15, 2020
Pull Request resolved: #32962

As per @gchanan's comments on
#30445, I've used
`torch.set_default_dtype` in test_data_parallel instead of specifying
dtype=torch.double everywhere. Also, renamed dtype2prec to dtype2prec_DONTUSE
ghstack-source-id: 98388429

Differential Revision: [D19714374](https://our.internmc.facebook.com/intern/diff/D19714374/)

dr-ci bot commented Feb 15, 2020

💊 CircleCI build failures summary and remediations

As of commit e095808:

None of the build failures appear to be your fault.

  • 3/3 broken upstream at merge base 3359871 since Feb 14

Please rebase on the viable/strict branch:

    If your commit is newer than viable/strict, you can try basing on an older, stable commit:

    git fetch origin viable/strict
    git rebase --onto viable/strict $(git merge-base origin/master HEAD)
    

    If your commit is older than viable/strict:

    git fetch origin viable/strict
    git rebase viable/strict
    

    Check out the recency history of this "viable master" tracking branch.

Detailed failure analysis

One may explore the probable reasons each build failed interactively on the Dr. CI website.

🚧 3 upstream failures recognized by patterns:

These builds matched patterns, but were probably caused by upstream breakages.


This comment was automatically generated by Dr. CI.

Please report bugs/suggestions on the GitHub issue tracker.

@facebook-github-bot

This pull request has been merged in fd684cc.

@facebook-github-bot facebook-github-bot deleted the gh/pritamdamania87/94/head branch February 20, 2020 15:16
ttumiel pushed a commit to ttumiel/pytorch that referenced this pull request Mar 4, 2020
…ec (pytorch#32962)

Summary:
Pull Request resolved: pytorch#32962

As per gchanan's comments on
pytorch#30445, I've used
`torch.set_default_dtype` in test_data_parallel instead of specifying
dtype=torch.double everywhere. Also, renamed dtype2prec to dtype2prec_DONTUSE
ghstack-source-id: 98388429

Test Plan: waitforbuildbot

Differential Revision: D19714374

fbshipit-source-id: eb55bbca33881625636ba9ea6dd4cb692f25668e

5 participants