Article ID: 000093057 Content Type: Product Information & Documentation Last Reviewed: 12/20/2022

Is It Possible to Run TensorFlow* Lite Models with Intel® Neural Compute Stick 2 (Intel® NCS2)?

Summary

OpenVINO™ Model Optimizer does not support converting TensorFlow* Lite models to Intermediate Representation (IR).

Description

Unable to determine whether TensorFlow* Lite models can be used for inference on the Intel® NCS2.

Resolution

Only ONNX* models or Intermediate Representation (IR) files can be loaded directly into the OpenVINO™ Inference Engine (including the MYRIAD plugin used by the Intel® NCS2). Model Optimizer does not natively support converting .tflite files to IR format.
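The format rule above can be sketched as a small illustrative check. This is not part of the OpenVINO™ API; the helper function and extension list are purely hypothetical, reflecting only the article's statement that ONNX* and IR load directly while .tflite does not:

```python
import os

# Formats the article says the Inference Engine (including the MYRIAD
# plugin used by Intel NCS2) accepts directly, without Model Optimizer:
# ONNX models (.onnx) and IR files (.xml, paired with a .bin weights file).
DIRECTLY_LOADABLE = {".onnx", ".xml"}

def can_load_directly(model_path: str) -> bool:
    """Illustrative check: True if this file format can be fed to the
    Inference Engine without conversion, per the article."""
    ext = os.path.splitext(model_path)[1].lower()
    return ext in DIRECTLY_LOADABLE

# A .tflite file is neither ONNX nor IR, so it cannot be loaded directly,
# and Model Optimizer offers no .tflite-to-IR conversion path.
```

For example, `can_load_directly("model.onnx")` returns `True`, while `can_load_directly("model.tflite")` returns `False`.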

In summary, OpenVINO™ Model Optimizer does not support conversion of TensorFlow* Lite models to IR.

Related Products

This article applies to 1 product.