How to use accuracy checker for POT
- Ran the POT command:
  pot -c yolov4-tiny_voc.json --output-dir backup -e
  Output: INFO:app.run.detection_accuracy:0.0
- Ran the Accuracy Checker command:
  accuracy_check -c yolov4-tiny_voc.yml -td CPU
  which gave the following result:
  accuracy_checker WARNING: /opt/intel/openvino_2021/deployment_tools/open_model_zoo/tools/accuracy_checker/accuracy_checker/metrics/detection.py:201: UserWarning: No detections to compute mAP
  warnings.warn("No detections to compute mAP")
  map: 0.00%
  AP@0.5: 0.00%
  AP@0.5:0.05:95: 0.00%
The Visual Object Classes Challenge (VOC) dataset is not validated by Intel. Intel validated the accuracy of this model using the Common Objects in Context (COCO) dataset, as mentioned in the yolo-v4-tf documentation. Using coco_precision to calculate mAP on a non-COCO dataset might not give correct results.
To avoid getting a 0.00% mAP value during Accuracy Checker execution, change the dataset from VOC to MS COCO and use different metrics, such as detection_accuracy, that work with the DetectionAnnotation representation.
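As a rough illustration, the datasets section of the Accuracy Checker config (yolov4-tiny_voc.yml) could be changed along these lines. This is a minimal sketch, not a verified config: the dataset name, data_source, and annotation_file paths are placeholders you must replace with your own, and the metric parameters shown are only the common defaults.

```yaml
# Hypothetical datasets section, switched from VOC to MS COCO.
# All paths below are assumed placeholders.
datasets:
  - name: ms_coco_detection
    data_source: val2017                # directory with COCO validation images
    annotation_conversion:
      converter: mscoco_detection       # Open Model Zoo converter for COCO annotations
      annotation_file: annotations/instances_val2017.json
    metrics:
      - type: map                       # VOC-style mAP computed over DetectionAnnotation
        overlap_threshold: 0.5
      - type: coco_precision            # COCO AP, the metric Intel validated for this model
```

After editing the config, rerun accuracy_check -c yolov4-tiny_voc.yml -td CPU to confirm the mAP is no longer 0.00%.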
Refer to How to Run Examples for steps to perform accuracy check on models.