
[1.6] Fix dictConstruct ordering and enable dict mix in tracing/scripting#40797

Merged
malfet merged 1 commit into pytorch:release/1.6 from wanchaol:1.6jitdict
Jul 1, 2020

Conversation

@wanchaol
Collaborator

A combination of #39601 and #40424; both were approved and merged in master.

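The ordering fix this PR backports makes TorchScript's dictConstruct preserve key insertion order, matching eager-mode Python 3.7+ dict semantics. A minimal plain-Python sketch of that guarantee (torch.jit.script itself is not exercised here; this only illustrates the behavior scripted code is expected to match):

```python
# Sketch: dict keys should come out in insertion order, not sorted
# order -- the behavior the dictConstruct fix aligns TorchScript with.
def build_dict():
    # Insertion order deliberately differs from sorted key order.
    d = {}
    d["b"] = 2
    d["a"] = 1
    d["c"] = 3
    return d

print(list(build_dict().keys()))  # -> ['b', 'a', 'c']
```

Under the fix, decorating an equivalent function with `torch.jit.script` and iterating over the result is expected to yield the same insertion order as the eager version above.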
@wanchaol wanchaol requested a review from apaszke as a code owner June 30, 2020 18:24
@facebook-github-bot facebook-github-bot added the oncall: jit Add this issue/PR to JIT oncall triage queue label Jun 30, 2020
@wanchaol wanchaol changed the title [1.6] Fix dictConstruct ordering and enable dict mix [1.6] Fix dictConstruct ordering and enable dict mix in tracing/scripting Jun 30, 2020
@dr-ci

dr-ci bot commented Jun 30, 2020

💊 CI failures summary and remediations

As of commit 4f8d61d (more details on the Dr. CI page):


  • 6/6 failures introduced in this PR

🕵️ 6 new failures recognized by patterns

The following CI failures do not appear to be due to upstream breakages:

See CircleCI build pytorch_python_doc_push (1/6)

Step: "Doc Build and Push" (full log | diagnosis details | 🔁 rerun)

Jun 30 19:02:31 /var/lib/jenkins/workspace/vision/torchvision/csrc/cpu/image/readpng_cpu.cpp:43:17: error: 'struct decodePNG(const at::Tensor&)::Reader' has no member named 'ptr'
Jun 30 19:02:31 /var/lib/jenkins/workspace/vision/torchvision/csrc/cpu/image/readpng_cpu.cpp:37:37: error: 'png_const_bytep' was not declared in this scope 
Jun 30 19:02:31    reader.ptr = png_const_bytep(datap) + 8; 
Jun 30 19:02:31                                      ^ 
Jun 30 19:02:31 /var/lib/jenkins/workspace/vision/torchvision/csrc/cpu/image/readpng_cpu.cpp: In lambda function: 
Jun 30 19:02:31 /var/lib/jenkins/workspace/vision/torchvision/csrc/cpu/image/readpng_cpu.cpp:42:27: error: 'struct decodePNG(const at::Tensor&)::Reader' has no member named 'ptr' 
Jun 30 19:02:31          std::copy(reader->ptr, reader->ptr + bytes, output); 
Jun 30 19:02:31                            ^ 
Jun 30 19:02:31 /var/lib/jenkins/workspace/vision/torchvision/csrc/cpu/image/readpng_cpu.cpp:42:40: error: 'struct decodePNG(const at::Tensor&)::Reader' has no member named 'ptr' 
Jun 30 19:02:31          std::copy(reader->ptr, reader->ptr + bytes, output); 
Jun 30 19:02:31                                         ^ 
Jun 30 19:02:31 /var/lib/jenkins/workspace/vision/torchvision/csrc/cpu/image/readpng_cpu.cpp:43:17: error: 'struct decodePNG(const at::Tensor&)::Reader' has no member named 'ptr' 
Jun 30 19:02:31          reader->ptr += bytes; 
Jun 30 19:02:31                  ^ 
Jun 30 19:02:31 error: command 'gcc' failed with exit status 1 

See CircleCI build pytorch_linux_bionic_py3_6_clang9_test (2/6)

Step: "Run tests" (full log | diagnosis details | 🔁 rerun)

Jun 30 19:37:12 [E request_callback_impl.cpp:168] Received error while processing request type 2: RuntimeError: Can not pickle torch.futures.Future
Jun 30 19:37:12 At: 
Jun 30 19:37:12   /opt/conda/lib/python3.6/site-packages/torch/distributed/rpc/internal.py(93): serialize 
Jun 30 19:37:12   /opt/conda/lib/python3.6/site-packages/torch/distributed/rpc/internal.py(145): serialize 
Jun 30 19:37:12  
Jun 30 19:37:12 [E request_callback_impl.cpp:168] Received error while processing request type 2: RuntimeError: Can not pickle torch.futures.Future 
Jun 30 19:37:12  
Jun 30 19:37:12 At: 
Jun 30 19:37:12   /opt/conda/lib/python3.6/site-packages/torch/distributed/rpc/internal.py(93): serialize 
Jun 30 19:37:12   /opt/conda/lib/python3.6/site-packages/torch/distributed/rpc/internal.py(145): serialize 
Jun 30 19:37:12  
Jun 30 19:37:12 [E request_callback_impl.cpp:168] Received error while processing request type 2: RuntimeError: Can not pickle torch.futures.Future 
Jun 30 19:37:12  
Jun 30 19:37:12 At: 
Jun 30 19:37:12   /opt/conda/lib/python3.6/site-packages/torch/distributed/rpc/internal.py(93): serialize 
Jun 30 19:37:12   /opt/conda/lib/python3.6/site-packages/torch/distributed/rpc/internal.py(145): serialize 
Jun 30 19:37:12  
Jun 30 19:37:12 [W tensorpipe_agent.cpp:491] RPC agent for worker2 encountered error when reading incoming request from worker0: EOF: end of file (this is expected to happen during shutdown) 
Jun 30 19:37:13 ok (1.219s) 
Jun 30 19:37:13   test_return_future_remote (__main__.TensorPipeAgentRpcTestWithSpawn) ... [W tensorpipe_agent.cpp:491] RPC agent for worker0 encountered error when reading incoming request from worker3: EOF: end of file (this is expected to happen during shutdown) 
Jun 30 19:37:14 ok (1.219s) 
Jun 30 19:37:15   test_return_local_rrefs (__main__.TensorPipeAgentRpcTestWithSpawn) ... [W tensorpipe_agent.cpp:491] RPC agent for worker1 encountered error when reading incoming request from worker0: EOF: end of file (this is expected to happen during shutdown) 

See CircleCI build pytorch_linux_xenial_py3_6_gcc5_4_test (3/6)

Step: "Run tests" (full log | diagnosis details | 🔁 rerun)

Jun 30 19:41:00 [E request_callback_impl.cpp:168] Received error while processing request type 2: RuntimeError: Can not pickle torch.futures.Future
Jun 30 19:41:00 At: 
Jun 30 19:41:00   /opt/conda/lib/python3.6/site-packages/torch/distributed/rpc/internal.py(93): serialize 
Jun 30 19:41:00   /opt/conda/lib/python3.6/site-packages/torch/distributed/rpc/internal.py(145): serialize 
Jun 30 19:41:00  
Jun 30 19:41:00 [E request_callback_impl.cpp:168] Received error while processing request type 2: RuntimeError: Can not pickle torch.futures.Future 
Jun 30 19:41:00  
Jun 30 19:41:00 At: 
Jun 30 19:41:00   /opt/conda/lib/python3.6/site-packages/torch/distributed/rpc/internal.py(93): serialize 
Jun 30 19:41:00   /opt/conda/lib/python3.6/site-packages/torch/distributed/rpc/internal.py(145): serialize 
Jun 30 19:41:00  
Jun 30 19:41:00 [E request_callback_impl.cpp:168] Received error while processing request type 2: RuntimeError: Can not pickle torch.futures.Future 
Jun 30 19:41:00  
Jun 30 19:41:00 At: 
Jun 30 19:41:00   /opt/conda/lib/python3.6/site-packages/torch/distributed/rpc/internal.py(93): serialize 
Jun 30 19:41:00   /opt/conda/lib/python3.6/site-packages/torch/distributed/rpc/internal.py(145): serialize 
Jun 30 19:41:00  
Jun 30 19:41:00 [W tensorpipe_agent.cpp:491] RPC agent for worker2 encountered error when reading incoming request from worker0: EOF: end of file (this is expected to happen during shutdown) 
Jun 30 19:41:01 ok (1.321s) 
Jun 30 19:41:02   test_return_future_remote (__main__.TensorPipeAgentRpcTestWithSpawn) ... ok (1.322s) 
Jun 30 19:41:03   test_return_local_rrefs (__main__.TensorPipeAgentRpcTestWithSpawn) ... [W tensorpipe_agent.cpp:491] RPC agent for worker2 encountered error when reading incoming request from worker3: EOF: end of file (this is expected to happen during shutdown) 
Jun 30 19:41:03 ok (1.323s) 

See CircleCI build pytorch_macos_10_13_py3_test (4/6)

Step: "Test" (full log | diagnosis details | 🔁 rerun)

Jun 30 12:42:44 [E request_callback_impl.cpp:168] Received error while processing request type 2: RuntimeError: Can not pickle torch.futures.Future
Jun 30 12:42:44 At: 
Jun 30 12:42:44   /Users/distiller/workspace/miniconda3/lib/python3.7/site-packages/torch/distributed/rpc/internal.py(93): serialize 
Jun 30 12:42:44   /Users/distiller/workspace/miniconda3/lib/python3.7/site-packages/torch/distributed/rpc/internal.py(145): serialize 
Jun 30 12:42:44  
Jun 30 12:42:44 [E request_callback_impl.cpp:168] Received error while processing request type 2: RuntimeError: Can not pickle torch.futures.Future 
Jun 30 12:42:44  
Jun 30 12:42:44 At: 
Jun 30 12:42:44   /Users/distiller/workspace/miniconda3/lib/python3.7/site-packages/torch/distributed/rpc/internal.py(93): serialize 
Jun 30 12:42:44   /Users/distiller/workspace/miniconda3/lib/python3.7/site-packages/torch/distributed/rpc/internal.py(145): serialize 
Jun 30 12:42:44  
Jun 30 12:42:44 [E request_callback_impl.cpp:168] Received error while processing request type 2: RuntimeError: Can not pickle torch.futures.Future 
Jun 30 12:42:44  
Jun 30 12:42:44 At: 
Jun 30 12:42:44   /Users/distiller/workspace/miniconda3/lib/python3.7/site-packages/torch/distributed/rpc/internal.py(93): serialize 
Jun 30 12:42:44   /Users/distiller/workspace/miniconda3/lib/python3.7/site-packages/torch/distributed/rpc/internal.py(145): serialize 
Jun 30 12:42:44  
Jun 30 12:42:44 [W tensorpipe_agent.cpp:491] RPC agent for worker2 encountered error when reading incoming request from worker0: EOF: end of file (this is expected to happen during shutdown) 
Jun 30 12:42:44 [W tensorpipe_agent.cpp:491] RPC agent for worker0 encountered error when reading incoming request from worker2: EOF: end of file (this is expected to happen during shutdown) 
Jun 30 12:42:44 [W tensorpipe_agent.cpp:491] RPC agent for worker3 encountered error when reading incoming request from worker2: EOF: end of file (this is expected to happen during shutdown) 
Jun 30 12:42:44 ok (1.648s) 
Jun 30 12:42:45   test_return_future_remote (__main__.TensorPipeAgentRpcTestWithSpawn) ... [W tensorpipe_agent.cpp:491] RPC agent for worker3 encountered error when reading incoming request from worker2: EOF: end of file (this is expected to happen during shutdown) 

See CircleCI build pytorch_xla_linux_bionic_py3_6_clang9_build (5/6)

Step: "Build" (full log | diagnosis details | 🔁 rerun)

Jun 30 18:33:09 RuntimeError: Failed to apply patch: /var/lib/jenkins/workspace/xla/torch_patches/X10-serialization.diff
Jun 30 18:33:09 + '[' -f /var/lib/jenkins/workspace/xla/scripts/../torch_patches/.torch_pin ']' 
Jun 30 18:33:09 + python /var/lib/jenkins/workspace/xla/scripts/cond_patch.py /var/lib/jenkins/workspace/xla/scripts/../torch_patches /var/lib/jenkins/workspace/xla/scripts/../.. 
Jun 30 18:33:09 1 out of 1 hunk FAILED 
Jun 30 18:33:09 Applying patch file: /var/lib/jenkins/workspace/xla/torch_patches/X10-clip_grad.diff 
Jun 30 18:33:09 Applying patch file: /var/lib/jenkins/workspace/xla/torch_patches/X10-serialization.diff 
Jun 30 18:33:09 Traceback (most recent call last): 
Jun 30 18:33:09   File "/var/lib/jenkins/workspace/xla/scripts/cond_patch.py", line 67, in <module> 
Jun 30 18:33:09     patch_repo(args) 
Jun 30 18:33:09   File "/var/lib/jenkins/workspace/xla/scripts/cond_patch.py", line 49, in patch_repo 
Jun 30 18:33:09     raise RuntimeError('Failed to apply patch: {}'.format(ppath)) 
Jun 30 18:33:09 RuntimeError: Failed to apply patch: /var/lib/jenkins/workspace/xla/torch_patches/X10-serialization.diff 
Jun 30 18:33:09 =================== sccache compilation log =================== 
Jun 30 18:33:09 + cleanup 
Jun 30 18:33:09 + retcode=1 
Jun 30 18:33:09 + set +x 
Jun 30 18:33:09 =========== If your build fails, please take a look at the log above for possible reasons =========== 
Jun 30 18:33:09 Compile requests                 1 
Jun 30 18:33:09 Compile requests executed        0 
Jun 30 18:33:09 Cache hits                       0 
Jun 30 18:33:09 Cache misses                     0 
Jun 30 18:33:09 Cache timeouts                   0 

See CircleCI build pytorch_linux_xenial_cuda10_2_cudnn7_py3_gcc7_test (6/6)

Step: "Run tests" (full log | diagnosis details | 🔁 rerun)

Jun 30 20:42:57 [E request_callback_impl.cpp:168] Received error while processing request type 2: RuntimeError: Can not pickle torch.futures.Future
Jun 30 20:42:57 At: 
Jun 30 20:42:57   /opt/conda/lib/python3.6/site-packages/torch/distributed/rpc/internal.py(93): serialize 
Jun 30 20:42:57   /opt/conda/lib/python3.6/site-packages/torch/distributed/rpc/internal.py(145): serialize 
Jun 30 20:42:57  
Jun 30 20:42:57 [E request_callback_impl.cpp:168] Received error while processing request type 2: RuntimeError: Can not pickle torch.futures.Future 
Jun 30 20:42:57  
Jun 30 20:42:57 At: 
Jun 30 20:42:57   /opt/conda/lib/python3.6/site-packages/torch/distributed/rpc/internal.py(93): serialize 
Jun 30 20:42:57   /opt/conda/lib/python3.6/site-packages/torch/distributed/rpc/internal.py(145): serialize 
Jun 30 20:42:57  
Jun 30 20:42:57 [E request_callback_impl.cpp:168] Received error while processing request type 2: RuntimeError: Can not pickle torch.futures.Future 
Jun 30 20:42:57  
Jun 30 20:42:57 At: 
Jun 30 20:42:57   /opt/conda/lib/python3.6/site-packages/torch/distributed/rpc/internal.py(93): serialize 
Jun 30 20:42:57   /opt/conda/lib/python3.6/site-packages/torch/distributed/rpc/internal.py(145): serialize 
Jun 30 20:42:57  
Jun 30 20:42:57 [W tensorpipe_agent.cpp:491] RPC agent for worker3 encountered error when reading incoming request from worker0: EOF: end of file (this is expected to happen during shutdown) 
Jun 30 20:42:57 [W tensorpipe_agent.cpp:491] RPC agent for worker0 encountered error when reading incoming request from worker1: EOF: end of file (this is expected to happen during shutdown) 
Jun 30 20:42:57 [W tensorpipe_agent.cpp:491] RPC agent for worker1 encountered error when reading incoming request from worker0: EOF: end of file (this is expected to happen during shutdown) 
Jun 30 20:42:57 ok (0.813s) 
Jun 30 20:42:58   test_return_future_remote (__main__.TensorPipeAgentRpcTestWithSpawn) ... [W tensorpipe_agent.cpp:491] RPC agent for worker2 encountered error when reading incoming request from worker0: EOF: end of file (this is expected to happen during shutdown) 

This comment was automatically generated by Dr. CI (expand for details). Follow this link to opt out of these comments for your Pull Requests.

Please report bugs/suggestions on the GitHub issue tracker or post in the (internal) Dr. CI Users group.

See how this bot performed.

This comment has been revised 5 times.

@malfet malfet merged commit 41816dc into pytorch:release/1.6 Jul 1, 2020

3 participants