PyTorch 2.0 doesn't work with the onnxruntime backend and torch2trt backend #90352
Description
🐛 Describe the bug
When I use PyTorch 2.0 to compile a ResNet-18 with the onnxruntime backend on GPU, or with the torch2trt backend, running the compiled model raises an error.
Code:
import torch

model = torch.hub.load('pytorch/vision:v0.10.0', 'resnet18', pretrained=True)
device = torch.device('cuda')
model.to(device).eval()
model = torch.compile(model, backend='onnxrt')
input_data = torch.randn((6, 3, 224, 224))
input_data = input_data.to(device)
output_data = model(input_data)
Error:
torch._dynamo.exc.BackendCompilerFailed: onnxrt raised Exception: Invoking operators with non-Fake Tensor inputs in FakeTensorMode is not yet supported. Please convert all Tensors to FakeTensors first. Found in aten.convolution.default(*(FakeTensor(FakeTensor(..., device='meta', size=(6, 3, 224, 224)), cuda:0), Parameter containing:
tensor([...], device='cuda:0', requires_grad=True), None, [2, 2], [3, 3], [1, 1], False, [0, 0], 1), **{})
(full printout of the conv1 weight tensor truncated)
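The mismatch the error describes (a tracing mode that only accepts fake, data-free tensors, yet is handed a real CUDA parameter) can be sketched with a small torch-free analogy. The `Fake` class and `traced_conv` function here are illustrative only, not PyTorch internals:

```python
# Toy analogy (NOT PyTorch internals): a tracer that only accepts
# "fake" placeholder values, mirroring how FakeTensorMode rejects the
# real CUDA weight Parameter shown in the error above.
class Fake:
    """Placeholder carrying only a shape, no data."""
    def __init__(self, shape):
        self.shape = tuple(shape)

def traced_conv(inp, weight):
    # Every operand must be a Fake; a real value (standing in for the
    # module's CUDA weight tensor) trips the same class of error the
    # onnxrt / torch2trt backends raise.
    for operand in (inp, weight):
        if not isinstance(operand, Fake):
            raise TypeError(
                "Invoking operators with non-Fake inputs in fake mode "
                "is not supported: %r" % (operand,))
    # Propagate only shape metadata, as fake tracing does.
    return Fake((inp.shape[0], weight.shape[0]))

out = traced_conv(Fake((6, 3, 224, 224)), Fake((64, 3, 7, 7)))
print(out.shape)  # (6, 64)
```

The fix on the PyTorch side is for the backend bridge to convert the module's parameters to fake tensors before tracing, rather than passing them through as real CUDA tensors.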
For the torch2trt backend:
Code:
import torch

model = torch.hub.load('pytorch/vision:v0.10.0', 'resnet18', pretrained=True)
device = torch.device('cuda')
model.to(device).eval()
model = torch.compile(model, backend='torch2trt')
input_data = torch.randn((6, 3, 224, 224))
input_data = input_data.to(device)
output_data = model(input_data)
Error:
torch._dynamo.exc.BackendCompilerFailed: torch2trt raised Exception: Invoking operators with non-Fake Tensor inputs in FakeTensorMode is not yet supported. Please convert all Tensors to FakeTensors first. Found in aten.convolution.default(*(FakeTensor(FakeTensor(..., device='meta', size=(6, 3, 224, 224)), cuda:0), Parameter containing:
tensor([...], device='cuda:0', requires_grad=True), None, [2, 2], [3, 3], [1, 1], False, [0, 0], 1), **{})
(full printout of the conv1 weight tensor truncated; identical to the onnxrt case)
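As a stopgap while these backends don't handle fake tensors, `torch.compile` also accepts a custom backend callable. The sketch below assumes only the documented custom-backend contract (a callable taking a `GraphModule` and example inputs and returning a callable); `eager_fallback` is a hypothetical name, and the example runs on CPU so it is independent of the GPU setup above:

```python
import torch

# Minimal custom Dynamo backend: receive the captured FX graph and
# example inputs, and simply fall back to eager execution of the graph.
def eager_fallback(gm: torch.fx.GraphModule, example_inputs):
    return gm.forward

# Small stand-in model (CPU) to show the compiled path still runs.
model = torch.nn.Conv2d(3, 8, kernel_size=3).eval()
compiled = torch.compile(model, backend=eager_fallback)

x = torch.randn(1, 3, 16, 16)
out = compiled(x)  # shape: (1, 8, 14, 14)
```

This doesn't give the TensorRT/onnxruntime speedup, but it confirms the Dynamo capture itself works and isolates the failure to the backend bridge.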
Versions
PyTorch version: 1.14.0.dev20221206+cu116
Is debug build: False
CUDA used to build PyTorch: 11.6
ROCM used to build PyTorch: N/A
OS: Ubuntu 20.04.3 LTS (x86_64)
GCC version: (Ubuntu 9.4.0-1ubuntu1~20.04.1) 9.4.0
Clang version: Could not collect
CMake version: version 3.25.0
Libc version: glibc-2.31
Python version: 3.10.8 (main, Nov 24 2022, 14:13:03) [GCC 11.2.0] (64-bit runtime)
Python platform: Linux-5.15.0-52-generic-x86_64-with-glibc2.31
Is CUDA available: True
CUDA runtime version: Could not collect
CUDA_MODULE_LOADING set to: LAZY
GPU models and configuration: GPU 0: NVIDIA GeForce RTX 3090
Nvidia driver version: 510.47.03
cuDNN version: Probably one of the following:
/usr/lib/x86_64-linux-gnu/libcudnn.so.8.2.4
/usr/lib/x86_64-linux-gnu/libcudnn_adv_infer.so.8.2.4
/usr/lib/x86_64-linux-gnu/libcudnn_adv_train.so.8.2.4
/usr/lib/x86_64-linux-gnu/libcudnn_cnn_infer.so.8.2.4
/usr/lib/x86_64-linux-gnu/libcudnn_cnn_train.so.8.2.4
/usr/lib/x86_64-linux-gnu/libcudnn_ops_infer.so.8.2.4
/usr/lib/x86_64-linux-gnu/libcudnn_ops_train.so.8.2.4
HIP runtime version: N/A
MIOpen runtime version: N/A
Is XNNPACK available: True
Versions of relevant libraries:
[pip3] numpy==1.24.0rc2
[pip3] torch==1.14.0.dev20221206+cu116
[pip3] torch2trt==0.4.0
[pip3] torchtriton==2.0.0+0d7e753227
[conda] numpy 1.24.0rc2 pypi_0 pypi
[conda] torch 1.14.0.dev20221206+cu116 pypi_0 pypi
[conda] torch2trt 0.4.0 pypi_0 pypi
[conda] torchtriton 2.0.0+0d7e753227 pypi_0 pypi