OpenVINO™ Inference Time Increases When Running Multiple Processes
Content Type: Product Information & Documentation | Article ID: 000058227 | Last Reviewed: 03/06/2026
When two processes run inference on the same model concurrently, per-process inference time roughly doubles. This is expected behavior: by default, each OpenVINO™ process attempts to use all available CPU cores, so the two processes compete for the same compute resources. To reduce the contention, limit the CPU resources each process uses, for example by enabling CPU thread pinning and restricting the number of inference threads per process.
| Note | The ov::hint::enable_cpu_pinning property replaced the legacy CONFIG_KEY(CPU_BIND_THREAD) parameter starting from OpenVINO™ 2024.0. |
Refer to Performance Hints and Thread Scheduling for more information.
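As an illustration, the following is a minimal C++ sketch of compiling a model on CPU with thread pinning enabled and a per-process thread cap. The model path `model.xml` and the thread count of 4 are placeholders, not values from this article; choose a thread count so that concurrent processes together do not oversubscribe the machine.

```cpp
#include <openvino/openvino.hpp>

int main() {
    ov::Core core;

    // Pin inference threads to cores so the processes do not migrate
    // across, and contend for, the same cores. ov::hint::enable_cpu_pinning
    // replaces the legacy CONFIG_KEY(CPU_BIND_THREAD) parameter.
    // ov::inference_num_threads caps how many CPU threads this process
    // uses, leaving the remaining cores for the other process.
    auto compiled = core.compile_model(
        "model.xml", "CPU",                  // placeholder model path
        ov::hint::enable_cpu_pinning(true),
        ov::inference_num_threads(4));       // e.g. half the cores per process

    ov::InferRequest request = compiled.create_infer_request();
    // ... set input tensors and call request.infer() ...
    return 0;
}
```

Run each process with a different (non-overlapping) thread budget, or additionally apply OS-level affinity (e.g. `taskset` on Linux), so the two processes no longer share cores.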