
relax tolerance for two torch.half (float16) tests#10519

Closed
hartb wants to merge 1 commit into pytorch:master from hartb:hartb-7420

Conversation

@hartb
Contributor

@hartb hartb commented Aug 14, 2018

Two tests in the 'nn' test bucket may fail when the torch.half
(float16) data type is used. The assertions in these tests are
meant to allow slight floating-point imprecision in the results,
but the tolerances used for the comparisons are too strict for
the half type.

Relax the tolerances so that slight float16 imprecision won't
cause test failures.

The affected tests are:

  • test_variable_sequence_cuda
  • test_Conv2d_groups_nobias

For more information, see issue:

#7420

@hartb
Contributor Author

hartb commented Aug 14, 2018

A proposed change to address #7420.

@hartb
Contributor Author

hartb commented Aug 15, 2018

The Windows CI build for this failed at:

22:11:16  C:\Jenkins\workspace\caffe2-builds\py2-cuda9.0-cudnn7-windows-build\caffe2/operators/fused_rowwise_random_quantization_ops.h(215):
          error C2398: Element '2': conversion from 'const std::size_t' to '__int64' requires a narrowing conversion
          (compiling source file C:\Jenkins\workspace\caffe2-builds\py2-cuda9.0-cudnn7-windows-build\caffe2\operators\fused_rowwise_random_quantization_ops.cc)
          [C:\Jenkins\workspace\caffe2-builds\py2-cuda9.0-cudnn7-windows-build\build\caffe2\caffe2.vcxproj]

This failure seems unrelated to the change in this PR; a quick search didn't turn up any matching issues.

Contributor

@facebook-github-bot left a comment


soumith is landing this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

