convert_onnx_models_to_ort doesn't work on large_model onnx files (zip with tensor values stored externally) #14697

@josephrocca

Description

Describe the issue

I've got a ~3GB model produced using tf2onnx.convert.from_function(..., large_model=True, opset=16).

Here's what the large_model option does:

large_model: When set, creates a zip file containing the ONNX protobuf model and large tensor values stored externally. This allows for converting models whose size exceeds the 2 GB.

The file that is produced can't be converted with onnxruntime.tools.convert_onnx_models_to_ort; the tool fails with an INVALID_PROTOBUF error. I think it expects a 'normal' ONNX file (a single serialized protobuf), not the zip archive that large_model=True produces.
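A quick way to confirm this diagnosis is to check the file's magic bytes: a large_model output should start with the ZIP signature (`PK\x03\x04`), while a plain ONNX model starts with protobuf field bytes. A minimal sketch using only the standard library (the helper name `is_zip_archive` is my own, not part of any of these tools):

```python
import zipfile


def is_zip_archive(path: str) -> bool:
    """Return True if the file starts with the ZIP magic bytes (PK\x03\x04),
    which is what tf2onnx's large_model=True output looks like."""
    with open(path, "rb") as f:
        return f.read(4) == b"PK\x03\x04"
    # zipfile.is_zipfile(path) is a more thorough stdlib alternative
```

If this returns True for the model file, the converter's protobuf parser is being handed a zip archive, which would explain the "Protobuf parsing failed" message.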

To reproduce

  1. Download this: https://drive.google.com/file/d/1--6efTeiLoCCeTifDzABpHySmkJ4Bzkg/view?usp=sharing
  2. Run this: python -m onnxruntime.tools.convert_onnx_models_to_ort scheduler_loop_body.onnx

And you'll get this error:

```
Converting models with optimization style 'Fixed' and level 'all'
Converting optimized ONNX model /content/scheduler_loop_body.onnx to ORT format model /content/scheduler_loop_body.ort
Error converting /content/scheduler_loop_body.onnx: [ONNXRuntimeError] : 7 : INVALID_PROTOBUF : Load model from /content/scheduler_loop_body.onnx failed:Protobuf parsing failed.
Traceback (most recent call last):
  File "/usr/lib/python3.8/runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/usr/local/lib/python3.8/dist-packages/onnxruntime/tools/convert_onnx_models_to_ort.py", line 363, in <module>
    convert_onnx_models_to_ort()
  File "/usr/local/lib/python3.8/dist-packages/onnxruntime/tools/convert_onnx_models_to_ort.py", line 302, in convert_onnx_models_to_ort
    converted_models = _convert(
  File "/usr/local/lib/python3.8/dist-packages/onnxruntime/tools/convert_onnx_models_to_ort.py", line 159, in _convert
    _ = ort.InferenceSession(
  File "/usr/local/lib/python3.8/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 360, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/usr/local/lib/python3.8/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 397, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidProtobuf: [ONNXRuntimeError] : 7 : INVALID_PROTOBUF : Load model from /content/scheduler_loop_body.onnx failed:Protobuf parsing failed.
```
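A possible workaround sketch: unpack the zip so the model protobuf and its external tensor files sit next to each other on disk, then point the converter at the extracted .onnx. This is an assumption about the archive layout (tf2onnx typically names the proto `__MODEL_PROTO.onnx`, but the code below searches rather than hard-coding it), and `extract_large_model` is a hypothetical helper, not part of onnxruntime or tf2onnx:

```python
import zipfile
from pathlib import Path


def extract_large_model(zip_path: str, out_dir: str) -> str:
    """Unpack a tf2onnx large_model zip and return the path to the model proto.

    Assumes the archive contains one .onnx protobuf plus external tensor
    files; the extracted files must stay together so ONNX Runtime can
    resolve the external data references.
    """
    out = Path(out_dir)
    with zipfile.ZipFile(zip_path) as z:
        z.extractall(out)
    onnx_files = sorted(out.rglob("*.onnx"))
    if not onnx_files:
        raise FileNotFoundError(f"no .onnx file found in {zip_path}")
    return str(onnx_files[0])
```

After extraction, running `python -m onnxruntime.tools.convert_onnx_models_to_ort <extracted>.onnx` might get further, though I'd expect the 2 GB protobuf limit to remain a constraint if the tooling tries to re-inline the external tensors.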

Urgency

Blocks conversion of all models >2GB.

Platform

Linux

OS Version

Ubuntu 18.04

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

1.14.0

ONNX Runtime API

Python

Architecture

X64

Execution Provider

Default CPU

Execution Provider Library Version

No response

Labels

converter (related to ONNX converters), stale (issues that have not been addressed in a while; categorized by a bot)
