Support Knowledge Base

Is It Possible to Convert from Intermediate Representation (IR) Format Back to Open Neural Network Exchange* (ONNX*), or Other, File Format Using Model Optimizer?

Content Type: Product Information & Documentation   |   Article ID: 000058526   |   Last Reviewed: 08/03/2022

Description

An ONNX* model can be converted into IR format using the Model Optimizer tool. Unable to validate whether there is a way to convert from IR format back to ONNX* file format.

Resolution

The OpenVINO™ workflow does not support converting from IR format back to ONNX* or any other file format. Model Optimizer loads a model into memory, reads it, builds an internal representation of the model, optimizes it, and produces the IR format, which is the only format the Inference Engine accepts. The conversion is therefore one-way; keep the original ONNX* file if you need it later.
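For reference, the supported direction of the workflow (ONNX* to IR) can be sketched as the following Model Optimizer command-line invocation; the file and directory names are placeholders, not values from this article:

```shell
# Convert an ONNX* model to OpenVINO IR (produces model.xml and model.bin).
# "model.onnx" and "./ir_output" are example names for illustration only.
mo --input_model model.onnx --output_dir ./ir_output
```

The resulting .xml (network topology) and .bin (weights) pair is what the Inference Engine loads; there is no companion command to regenerate the original ONNX* file from them.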

Related Products

This article applies to 4 products.
Intel® Xeon Phi™ Processor Software
OpenVINO™ toolkit
Performance Libraries

Discontinued Products

Intel® Developer Cloud for the Edge