Article ID: 000058516 Content Type: Error Messages Last Reviewed: 05/20/2022

Error: “Unsupported Layer Type FakeQuantize” When Inferencing INT8 Model with MYRIAD Plugin

Environment

Neural Compute Stick 2

Summary

The MYRIAD plugin does not support INT8 models.

Description

Ran inference on the face-detection-adas-0001 (INT8) model downloaded from the Open Model Zoo.

Inference results:

CPU plugin - successful

GPU plugin - successful

MYRIAD plugin - error: "Unsupported Layer Type FakeQuantize"
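
The behavior can be reproduced with a short script along the following lines. This is a minimal sketch using the OpenVINO Python API (assuming the 2022.1 API, openvino.runtime); the model path is a placeholder for the INT8 IR of face-detection-adas-0001 downloaded from the Open Model Zoo.

from openvino.runtime import Core

MODEL_XML = "face-detection-adas-0001.xml"  # hypothetical path to the INT8 IR

core = Core()
model = core.read_model(MODEL_XML)

for device in ("CPU", "GPU", "MYRIAD"):
    try:
        # Compiling the model for the target device surfaces unsupported layers
        core.compile_model(model, device_name=device)
        print(f"{device}: compiled successfully")
    except RuntimeError as err:
        # On MYRIAD this fails because the FakeQuantize layer is not supported
        print(f"{device}: {err}")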

Resolution

During quantization to INT8, FakeQuantize layers are inserted on the activations and weights of most layers. The MYRIAD plugin does not support the FakeQuantize layer and therefore cannot run inference on INT8 models.
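You can check ahead of time whether a device supports every layer in a model. The sketch below is one way to do this, assuming the OpenVINO 2022.1 Python API: Core.query_model reports which operations the MYRIAD plugin can execute, so unsupported layers such as FakeQuantize show up in the difference. The model path is a placeholder.

from openvino.runtime import Core

core = Core()
model = core.read_model("face-detection-adas-0001.xml")  # hypothetical path to the INT8 IR

# query_model maps each supported operation name to the device that can run it
supported = core.query_model(model=model, device_name="MYRIAD")
unsupported = {op.get_friendly_name() for op in model.get_ops()} - set(supported)

if unsupported:
    print("Layers not supported on MYRIAD:", sorted(unsupported))
else:
    print("All layers are supported on MYRIAD")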

Additional Information

Supported Layers

Supported Model Formats
