
Lazily create tensors in optim_baseline #12301

Closed

goldsborough wants to merge 1 commit into pytorch:master from goldsborough:fix-optim-baseline

Conversation

@goldsborough (Contributor)

Tensors cannot be created globally because of static initialization order issues, so the tensors for the optim_baseline test must be created lazily instead. This is fine because each of these functions is only called once (in its respective test).

@ezyang (Contributor) left a comment


It would be nice if these numbers weren't checked in as source code.

FYI: numbers wobbled on regeneration.

@facebook-github-bot (Contributor) left a comment

goldsborough is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.



3 participants