import itertools

import onnx

# Load a real model from onnx.hub and run strict shape inference on it.
model = onnx.hub.load('alexnet', opset=9)
onnx.checker.check_model(model)
model = onnx.shape_inference.infer_shapes(model, check_type=True, strict_mode=True)

# Map every intermediate and graph-output name to its inferred value_info.
value_infos = {
    vi.name: vi
    for vi in itertools.chain(model.graph.value_info, model.graph.output)
}
for node in model.graph.node:
    for i, output in enumerate(node.output):
        # Dropout's optional mask output may have no inferred shape; skip it.
        if node.op_type == "Dropout" and i != 0:
            continue
        assert output in value_infos
        tt = value_infos[output].type.tensor_type
        assert tt.elem_type != onnx.TensorProto.UNDEFINED
        # Every dimension must be a concrete dim_value (no symbolic dims).
        for dim in tt.shape.dim:
            assert dim.WhichOneof("value") == "dim_value"
System information
Latest
What is the problem that this feature solves?
Running pytest under the onnx repo pulls down large real models. It's a burden that onnx downloads the large models every time in a fresh environment.

Alternatives considered
Describe the feature
Will this influence the current api (Y/N)?
Y. Users cannot get real models from onnx.backend anymore.
Feature Area
test
Are you willing to contribute it (Y/N)
Yes
Notes
Any feedback is welcome.