How to Assess Hardware Capabilities and Fit for Edge Inference Solutions

ID 773776
Updated 1/4/2021
Version Latest

Intel® DevCloud for the Edge

Intel provides a wide range of CPUs, GPUs, and VPUs that can be used to build an edge inference system. With Intel® DevCloud for the Edge, developers can test a variety of hardware and software configurations and benchmark inference performance to identify the best fit for their workload before committing to hardware. Check out this video to learn how it works and how it can help you find the best hardware and software combination for your edge inference solutions.


A few important features to try out with Intel® DevCloud for the Edge:

  • Run sample inference applications with the OpenVINO™ toolkit and its inference engine on multiple hardware options and compare their performance. To experience how Intel® DevCloud for the Edge works, start with the sample applications.
  • Estimate the inference performance of your own deep learning models with the OpenVINO™ Benchmark Python* Tool. To see how it works, try the Benchmark App tutorial in the Advanced tutorials.
  • Learn to construct a media analytics pipeline using OpenVINO™ DL Streamer and run it on any available hardware with the DL Streamer tutorials.
  • Prepare, tune, and profile AI models for optimal performance using the OpenVINO™ Deep Learning Workbench.
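As a sketch of the benchmarking step above, a typical invocation of the OpenVINO™ Benchmark Python* Tool from a DevCloud session might look like the following. The model path and target device here are placeholders; substitute a model in OpenVINO IR format and whichever device (CPU, GPU, or a MYRIAD VPU) you want to evaluate:

```shell
# Benchmark a deep learning model on a chosen target device.
# -m      : model in OpenVINO IR format (.xml, with its .bin alongside)
# -d      : target device, e.g. CPU, GPU, or MYRIAD (VPU)
# -api    : async uses multiple concurrent inference requests for throughput
# -niter  : number of inference iterations to average over
python3 benchmark_app.py -m model.xml -d CPU -api async -niter 100
```

Running the same command with different `-d` values is the simplest way to compare the hardware options DevCloud exposes; the tool reports latency and throughput figures you can use to rank candidate devices.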

To register and get started with Intel® DevCloud for the Edge, go to the Intel® DevCloud for the Edge homepage.