Article ID: 000059580 Content Type: Troubleshooting Last Reviewed: 03/07/2023

Unable to Get Mean Average Precision (mAP) Result When Running Accuracy_check in Post-training Optimization Tool (POT)

Summary

How to use the Accuracy Checker with the Post-training Optimization Tool (POT)

Description
  1. Ran the POT command (a sample POT configuration is sketched after this list):

    pot -c yolov4-tiny_voc.json --output-dir backup -e
    Output: INFO:app.run.detection_accuracy:0.0

  2. Ran the Accuracy Checker command accuracy_check -c yolov4-tiny_voc.yml -td CPU, which gave the following result:

    accuracy_checker WARNING: /opt/intel/openvino_2021/deployment_tools/open_model_zoo/tools/accuracy_checker/accuracy_checker/metrics/detection.py:201: UserWarning: No detections to compute mAP
    warnings.warn("No detections to compute mAP")

    map: 0.00%
    AP@0.5: 0.00%
    AP@0.5:0.05:95: 0.00%
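
For context, the yolov4-tiny_voc.json file passed to POT in step 1 normally contains model, engine, and compression sections, with the engine section pointing at the same Accuracy Checker YAML file used in step 2. The following is a minimal sketch of such a configuration; the model paths, preset, and subset size shown here are illustrative assumptions, not the exact values from this case:

    {
        "model": {
            "model_name": "yolov4-tiny",
            "model": "yolov4-tiny.xml",
            "weights": "yolov4-tiny.bin"
        },
        "engine": {
            "config": "yolov4-tiny_voc.yml"
        },
        "compression": {
            "algorithms": [
                {
                    "name": "DefaultQuantization",
                    "params": {
                        "preset": "performance",
                        "stat_subset_size": 300
                    }
                }
            ]
        }
    }

Because pot -e evaluates the model through the Accuracy Checker configuration referenced in the engine section, a dataset or metric mismatch in that YAML file produces the same 0.0 result in both tools.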

Resolution

The Visual Object Classes Challenge (VOC) dataset is not validated by Intel. Intel validated the accuracy of this model using the Common Objects in Context (COCO) dataset, as mentioned in the yolo-v4-tf documentation. Using coco_precision to calculate mAP on a non-COCO dataset therefore might not give correct results.

To avoid getting a 0.00% mAP value when running the accuracy checker, switch from the VOC dataset to the MS COCO dataset and use a metric such as detection_accuracy that works with the DetectionAnnotation representation. A sample configuration is sketched below.
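
The change is made in the dataset section of the Accuracy Checker YAML file. The following is a minimal sketch of yolov4-tiny_voc.yml rewritten for MS COCO; the dataset name, image directory, and annotation file are placeholder assumptions that must be adapted to your local setup, and the adapter parameters (anchors, output names, and so on) are omitted for brevity:

    models:
      - name: yolo-v4-tiny-tf
        launchers:
          - framework: dlsdk
            device: CPU
            adapter:
              type: yolo_v3   # adapter parameters (anchors, outputs, masks) omitted for brevity
        datasets:
          - name: ms_coco_detection_80_class_without_background
            data_source: val2017   # placeholder: directory with COCO validation images
            annotation_conversion:
              converter: mscoco_detection
              annotation_file: instances_val2017.json   # placeholder: COCO annotation file
            metrics:
              - type: map   # works with the DetectionAnnotation representation
                integral: 11point
              - type: coco_precision

Because the POT engine section references this same YAML file, both accuracy_check and pot -e will report the corrected metrics once the dataset and metric are consistent.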

Refer to How to Run Examples for the steps to perform an accuracy check on models.

Related Products

This article applies to 1 product.