Description
Here is the code using ORTWrapper to run inference on device.
The model is a Mask R-CNN trained with mmdet.

Env:
torch 1.8
CUDA 11.1
onnx 1.8
```python
from mmdeploy.backend.onnxruntime import ORTWrapper
import torch

onnx_file = 'end2end.onnx'
# gpu
model = ORTWrapper(onnx_file, 'cuda:3', ['dets', 'labels', 'masks'])
inputs = dict(input=torch.randn(1, 3, 768, 1344, device='cuda:3'))
outputs = model(inputs)
print(outputs)
```
No matter whether it runs on host or device, it always fails on the Reshape_1588 node:

```
2021-12-31 13:20:33.759103638 [E:onnxruntime:, sequential_executor.cc:339 Execute] Non-zero status code returned while running Reshape node. Name:'Reshape_1588' Status Message: /onnxruntime_src/onnxruntime/core/providers/cpu/tensor/reshape_helper.h:42 onnxruntime::ReshapeHelper::ReshapeHelper(const onnxruntime::TensorShape&, std::vector&, bool) gsl::narrow_cast<int64_t>(input_shape.Size()) == size was false. The input tensor cannot be reshaped to the requested shape. Input shape:{0,3}, requested shape:{1,3,3}
```
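For what it's worth, the shape mismatch in the message can be reproduced in plain NumPy. This is only a sketch of the arithmetic behind the error, not the actual ONNX Runtime code path: a tensor of shape (0, 3) holds 0 elements, but the static target shape (1, 3, 3) requires 9, so the reshape cannot succeed. It suggests the node received an empty (zero-detection) tensor at runtime.

```python
import numpy as np

# Hypothetical stand-in for the input feeding Reshape_1588:
# shape (0, 3) means zero rows, i.e. 0 total elements.
dets = np.zeros((0, 3), dtype=np.float32)

# The requested shape {1, 3, 3} needs 1 * 3 * 3 = 9 elements,
# so this reshape fails just like the ONNX Runtime check does.
try:
    dets.reshape(1, 3, 3)
except ValueError as e:
    print('reshape failed:', e)
```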
But inference works fine using ./tools/test.py; I could not see any difference from calling ORTWrapper directly.