Article ID: 000088387 Content Type: Compatibility Last Reviewed: 05/20/2022

Why Can Caffe Models Be Used Directly with OpenVINO™ Toolkit Inference Engine API?

BUILT IN - ARTICLE INTRO SECOND COMPONENT
Summary

Supported frameworks for OpenVINO™ Toolkit Inference Engine API

Description
  • Inferenced Caffe model directly on Intel® Neural Compute Stick 2 (Intel® NCS2).
  • Unable to determine why Caffe model can be used directly with OpenVINO™ Toolkit Inference Engine API without converting to Intermediate Representation (IR).
Resolution

Only limited Caffe frameworks can be used directly with the OpenVINO™ Toolkit Inference Engine API.

Currently, the frameworks supported by OpenVINO™ Toolkit Inference Engine API are:

  • Intermediate Representation (IR)
  • ONNX

Related Products

This article applies to 2 products