
Rename positional and kwarg_only to have flat prefix #49042

Closed
ezyang wants to merge 3 commits into gh/ezyang/887/base from gh/ezyang/887/head

Conversation

Contributor

@ezyang ezyang commented Dec 8, 2020

Stack from ghstack:

I want the names `positional` and `kwarg_only` to give the unflat
representation (e.g., preserving `TensorOptionsArguments` in the
returned `Union`), so I regret my original naming choice when
I moved grouping to the model. This renames them to have a `flat_`
prefix and also adds a `flat_non_out` argument for cases where you
just want to look at the non-out arguments.

Signed-off-by: Edward Z. Yang ezyang@fb.com

Differential Revision: D25455884
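To make the flat/unflat distinction concrete, here is a minimal sketch of the idea in plain Python. All names below (`Argument`, `TensorOptionsArguments`, `Arguments`, and the property bodies) are illustrative stand-ins, not the actual definitions in the PyTorch codegen model; the point is only how the unflat view preserves the `TensorOptionsArguments` grouping while the `flat_`-prefixed view expands it.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Union

# Hypothetical stand-ins for the codegen model types.
@dataclass(frozen=True)
class Argument:
    name: str
    is_out: bool = False

@dataclass(frozen=True)
class TensorOptionsArguments:
    dtype: Argument
    layout: Argument
    device: Argument
    pin_memory: Argument

    def all(self) -> List[Argument]:
        return [self.dtype, self.layout, self.device, self.pin_memory]

@dataclass(frozen=True)
class Arguments:
    pre_tensor_options_kwarg_only: List[Argument] = field(default_factory=list)
    tensor_options: Optional[TensorOptionsArguments] = None
    post_tensor_options_kwarg_only: List[Argument] = field(default_factory=list)
    out: List[Argument] = field(default_factory=list)

    # Unflat view: TensorOptionsArguments survives as a single element
    # in the returned list, so the grouping is visible to callers.
    @property
    def kwarg_only(self) -> List[Union[Argument, TensorOptionsArguments]]:
        ret: List[Union[Argument, TensorOptionsArguments]] = []
        ret.extend(self.pre_tensor_options_kwarg_only)
        if self.tensor_options is not None:
            ret.append(self.tensor_options)
        ret.extend(self.post_tensor_options_kwarg_only)
        return ret

    # Flat view: the grouping is expanded into individual Arguments.
    @property
    def flat_kwarg_only(self) -> List[Argument]:
        ret: List[Argument] = []
        for a in self.kwarg_only:
            if isinstance(a, TensorOptionsArguments):
                ret.extend(a.all())
            else:
                ret.append(a)
        return ret

    # Flat view restricted to non-out arguments.
    @property
    def flat_non_out(self) -> List[Argument]:
        return [a for a in self.flat_kwarg_only if not a.is_out]
```

With this naming scheme, `kwarg_only` is free to return the grouped (unflat) representation while `flat_kwarg_only` and `flat_non_out` serve callers that only care about individual arguments.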

@dr-ci
Copy link
Copy Markdown

dr-ci bot commented Dec 8, 2020

💊 CI failures summary and remediations

As of commit 296fc8f (more details on the Dr. CI page):


  • 4/4 failures introduced in this PR

🕵️ 4 new failures recognized by patterns

The following CI failures do not appear to be due to upstream breakages:

See CircleCI build pytorch_linux_xenial_cuda9_2_cudnn7_py3_gcc5_4_build (1/4)

Step: "Build" (full log | diagnosis details | 🔁 rerun)

Dec 10 02:48:38 sccache: error: couldn't connect to server
Dec 10 02:48:38 +++ eval 'extract_trap_cmd ' 
Dec 10 02:48:38 ++++ extract_trap_cmd 
Dec 10 02:48:38 ++++ printf '%s\n' '' 
Dec 10 02:48:38 +++ printf '%s\n' cleanup 
Dec 10 02:48:38 ++ trap -- ' 
Dec 10 02:48:38 cleanup' EXIT 
Dec 10 02:48:38 ++ [[ pytorch-linux-xenial-cuda9.2-cudnn7-py3-gcc5.4-build != *pytorch-win-* ]] 
Dec 10 02:48:38 ++ which sccache 
Dec 10 02:48:38 ++ sccache --stop-server 
Dec 10 02:48:38 Stopping sccache server... 
Dec 10 02:48:38 sccache: error: couldn't connect to server 
Dec 10 02:48:38 sccache: caused by: Connection refused (os error 111) 
Dec 10 02:48:38 ++ true 
Dec 10 02:48:38 ++ rm /var/lib/jenkins/sccache_error.log 
Dec 10 02:48:38 rm: cannot remove '/var/lib/jenkins/sccache_error.log': No such file or directory 
Dec 10 02:48:38 ++ true 
Dec 10 02:48:38 ++ [[ pytorch-linux-xenial-cuda9.2-cudnn7-py3-gcc5.4-build == *rocm* ]] 
Dec 10 02:48:38 ++ SCCACHE_ERROR_LOG=/var/lib/jenkins/sccache_error.log 
Dec 10 02:48:38 ++ SCCACHE_IDLE_TIMEOUT=1200 
Dec 10 02:48:38 ++ RUST_LOG=sccache::server=error 
Dec 10 02:48:38 ++ sccache --start-server 

See CircleCI build pytorch_linux_xenial_py3_6_gcc5_4_build (2/4)

Step: "Build" (full log | diagnosis details | 🔁 rerun)

Dec 10 02:46:16 sccache: error: couldn't connect to server
Dec 10 02:46:16 +++ eval 'extract_trap_cmd ' 
Dec 10 02:46:16 ++++ extract_trap_cmd 
Dec 10 02:46:16 ++++ printf '%s\n' '' 
Dec 10 02:46:16 +++ printf '%s\n' cleanup 
Dec 10 02:46:16 ++ trap -- ' 
Dec 10 02:46:16 cleanup' EXIT 
Dec 10 02:46:16 ++ [[ pytorch-linux-xenial-py3.6-gcc5.4-build != *pytorch-win-* ]] 
Dec 10 02:46:16 ++ which sccache 
Dec 10 02:46:16 ++ sccache --stop-server 
Dec 10 02:46:16 Stopping sccache server... 
Dec 10 02:46:16 sccache: error: couldn't connect to server 
Dec 10 02:46:16 sccache: caused by: Connection refused (os error 111) 
Dec 10 02:46:16 ++ true 
Dec 10 02:46:16 ++ rm /var/lib/jenkins/sccache_error.log 
Dec 10 02:46:16 rm: cannot remove '/var/lib/jenkins/sccache_error.log': No such file or directory 
Dec 10 02:46:16 ++ true 
Dec 10 02:46:16 ++ [[ pytorch-linux-xenial-py3.6-gcc5.4-build == *rocm* ]] 
Dec 10 02:46:16 ++ SCCACHE_ERROR_LOG=/var/lib/jenkins/sccache_error.log 
Dec 10 02:46:16 ++ SCCACHE_IDLE_TIMEOUT=1200 
Dec 10 02:46:16 ++ RUST_LOG=sccache::server=error 
Dec 10 02:46:16 ++ sccache --start-server 

See CircleCI build pytorch_linux_xenial_py3_clang7_onnx_ort_test2 (3/4)

Step: "Run tests" (full log | diagnosis details | 🔁 rerun)

Dec 10 03:52:53 test/onnx/test_pytorch_onnx_onnxruntime.py::TestONNXRuntime_opset12_onnx_shape_inference::test_mask_rcnn FAILED [ 70%]
Dec 10 03:52:35 test/onnx/test_pytorch_onnx_onnxruntime.py::TestONNXRuntime_opset12_onnx_shape_inference::test_lstm_unilayer_forward_no_initial_state_without_sequence_lengths_with_dropout PASSED [ 68%] 
Dec 10 03:52:35 test/onnx/test_pytorch_onnx_onnxruntime.py::TestONNXRuntime_opset12_onnx_shape_inference::test_lstm_unilayer_forward_no_initial_state_without_sequence_lengths_without_dropout PASSED [ 69%] 
Dec 10 03:52:36 test/onnx/test_pytorch_onnx_onnxruntime.py::TestONNXRuntime_opset12_onnx_shape_inference::test_lstm_unilayer_forward_with_initial_state_with_batch_first_sequence_lengths_with_dropout PASSED [ 69%] 
Dec 10 03:52:36 test/onnx/test_pytorch_onnx_onnxruntime.py::TestONNXRuntime_opset12_onnx_shape_inference::test_lstm_unilayer_forward_with_initial_state_with_batch_first_sequence_lengths_without_dropout PASSED [ 69%] 
Dec 10 03:52:36 test/onnx/test_pytorch_onnx_onnxruntime.py::TestONNXRuntime_opset12_onnx_shape_inference::test_lstm_unilayer_forward_with_initial_state_with_variable_length_sequences_with_dropout PASSED [ 69%] 
Dec 10 03:52:36 test/onnx/test_pytorch_onnx_onnxruntime.py::TestONNXRuntime_opset12_onnx_shape_inference::test_lstm_unilayer_forward_with_initial_state_with_variable_length_sequences_without_dropout PASSED [ 69%] 
Dec 10 03:52:36 test/onnx/test_pytorch_onnx_onnxruntime.py::TestONNXRuntime_opset12_onnx_shape_inference::test_lstm_unilayer_forward_with_initial_state_without_sequence_lengths_with_dropout PASSED [ 70%] 
Dec 10 03:52:36 test/onnx/test_pytorch_onnx_onnxruntime.py::TestONNXRuntime_opset12_onnx_shape_inference::test_lstm_unilayer_forward_with_initial_state_without_sequence_lengths_without_dropout PASSED [ 70%] 
Dec 10 03:52:36 test/onnx/test_pytorch_onnx_onnxruntime.py::TestONNXRuntime_opset12_onnx_shape_inference::test_lt PASSED [ 70%] 
Dec 10 03:52:36 test/onnx/test_pytorch_onnx_onnxruntime.py::TestONNXRuntime_opset12_onnx_shape_inference::test_lt_scalar PASSED [ 70%] 
Dec 10 03:52:53 test/onnx/test_pytorch_onnx_onnxruntime.py::TestONNXRuntime_opset12_onnx_shape_inference::test_mask_rcnn FAILED [ 70%] 
Dec 10 03:52:53 test/onnx/test_pytorch_onnx_onnxruntime.py::TestONNXRuntime_opset12_onnx_shape_inference::test_masked_fill PASSED [ 70%] 
Dec 10 03:52:53 test/onnx/test_pytorch_onnx_onnxruntime.py::TestONNXRuntime_opset12_onnx_shape_inference::test_masked_fill_inplace PASSED [ 71%] 
Dec 10 03:52:53 test/onnx/test_pytorch_onnx_onnxruntime.py::TestONNXRuntime_opset12_onnx_shape_inference::test_masked_scatter PASSED [ 71%] 
Dec 10 03:52:53 test/onnx/test_pytorch_onnx_onnxruntime.py::TestONNXRuntime_opset12_onnx_shape_inference::test_masked_select PASSED [ 71%] 
Dec 10 03:52:53 test/onnx/test_pytorch_onnx_onnxruntime.py::TestONNXRuntime_opset12_onnx_shape_inference::test_matmul PASSED [ 71%] 
Dec 10 03:52:53 test/onnx/test_pytorch_onnx_onnxruntime.py::TestONNXRuntime_opset12_onnx_shape_inference::test_matmul_batch PASSED [ 71%] 
Dec 10 03:52:53 test/onnx/test_pytorch_onnx_onnxruntime.py::TestONNXRuntime_opset12_onnx_shape_inference::test_max_tensors PASSED [ 72%] 
Dec 10 03:52:53 test/onnx/test_pytorch_onnx_onnxruntime.py::TestONNXRuntime_opset12_onnx_shape_inference::test_maxpool PASSED [ 72%] 
Dec 10 03:52:53 test/onnx/test_pytorch_onnx_onnxruntime.py::TestONNXRuntime_opset12_onnx_shape_inference::test_maxpool_1d_ceil PASSED [ 72%] 
Dec 10 03:52:53 test/onnx/test_pytorch_onnx_onnxruntime.py::TestONNXRuntime_opset12_onnx_shape_inference::test_maxpool_2d PASSED [ 72%] 

See CircleCI build pytorch_xla_linux_bionic_py3_6_clang9_build (4/4)

Step: "Build" (full log | diagnosis details | 🔁 rerun)

Dec 10 02:49:18 sccache: error: couldn't connect to server
Dec 10 02:49:18 +++ eval 'extract_trap_cmd ' 
Dec 10 02:49:18 ++++ extract_trap_cmd 
Dec 10 02:49:18 ++++ printf '%s\n' '' 
Dec 10 02:49:18 +++ printf '%s\n' cleanup 
Dec 10 02:49:18 ++ trap -- ' 
Dec 10 02:49:18 cleanup' EXIT 
Dec 10 02:49:18 ++ [[ pytorch-xla-linux-bionic-py3.6-clang9-build != *pytorch-win-* ]] 
Dec 10 02:49:18 ++ which sccache 
Dec 10 02:49:18 ++ sccache --stop-server 
Dec 10 02:49:18 Stopping sccache server... 
Dec 10 02:49:18 sccache: error: couldn't connect to server 
Dec 10 02:49:18 sccache: caused by: Connection refused (os error 111) 
Dec 10 02:49:18 ++ true 
Dec 10 02:49:18 ++ rm /var/lib/jenkins/sccache_error.log 
Dec 10 02:49:18 rm: cannot remove '/var/lib/jenkins/sccache_error.log': No such file or directory 
Dec 10 02:49:18 ++ true 
Dec 10 02:49:18 ++ [[ pytorch-xla-linux-bionic-py3.6-clang9-build == *rocm* ]] 
Dec 10 02:49:18 ++ SCCACHE_ERROR_LOG=/var/lib/jenkins/sccache_error.log 
Dec 10 02:49:18 ++ SCCACHE_IDLE_TIMEOUT=1200 
Dec 10 02:49:18 ++ RUST_LOG=sccache::server=error 
Dec 10 02:49:18 ++ sccache --start-server 

This comment was automatically generated by Dr. CI.

@facebook-github-bot
Contributor

@ezyang merged this pull request in 267641a.
