Install the OpenVINO™ Toolkit for Raspbian* OS from a Docker* Image


Install & Setup



The Intel® Distribution of OpenVINO™ toolkit makes it simple to adopt and maintain your code. The runtime (inference engine) allows you to tune for performance by compiling the optimized network and managing inference operations on specific devices.

This guide provides users with steps for creating a Docker* image to install the OpenVINO™ toolkit for Raspbian* OS.

System requirements

Target operating system

  • Raspbian* Stretch, 32-bit
  • Raspbian* Buster, 32-bit

Host operating systems

  • Raspbian* Stretch, 32-bit
  • Raspbian* Buster, 32-bit


Hardware

  • Raspberry Pi* board with ARM* ARMv7-A CPU architecture. Check that uname -m returns armv7l.
    • Raspberry Pi* 3 Model B+
    • Raspberry Pi* 4 Model B
  • Intel® Neural Compute Stick 2
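The architecture requirement above can be verified with a short snippet (a sketch; the check itself is just `uname -m`, as noted in the list):

```shell
# Confirm the board reports the required 32-bit ARMv7-A architecture.
arch="$(uname -m)"
if [ "$arch" = "armv7l" ]; then
  echo "OK: armv7l detected, this board is supported"
else
  echo "WARNING: uname -m returned '$arch'; armv7l is required"
fi
```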



Use Docker's automated convenience scripts to install Docker*, as this is currently the only way to install this toolkit for Raspbian*. See the Docker documentation for more information.
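As a sketch, the convenience-script install typically looks like the following (get.docker.com is Docker's documented convenience-script endpoint; this snippet only downloads the installer and leaves running it, with sudo, to you, and it skips gracefully when offline):

```shell
# Download Docker's convenience install script if Docker is missing.
# Running the installer itself (sudo sh get-docker.sh) is left to the user.
if command -v docker >/dev/null 2>&1; then
  msg="Docker is already installed: $(docker --version)"
elif curl -fsSL --max-time 10 https://get.docker.com -o get-docker.sh 2>/dev/null; then
  msg="Downloaded get-docker.sh; run it with: sudo sh get-docker.sh"
else
  msg="Could not reach get.docker.com; check the network connection"
fi
echo "$msg"
```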

Building a Docker* image for Intel® Neural Compute Stick 2

Build image

To build a Docker* image, you'll need to create a Dockerfile that contains the defined variables and commands required to create an OpenVINO™ toolkit installation image.

Create your Dockerfile using the following example as a template.

  1. Create or go to a directory where you will create your Docker* image. This document creates a ~/docker directory.

    mkdir ~/docker && cd ~/docker

  2. Download the Dockerfile template (ZIP) from this guide or create your own Dockerfile using the contents of the template below.

    vi Dockerfile

    FROM balenalib/raspberrypi3:buster

    ARG INSTALL_DIR=/opt/intel/openvino
    ARG DOWNLOAD_LINK=<direct link to the OpenVINO toolkit package>
    ARG BIN_FILE=<direct link to the model .bin file>
    ARG WEIGHTS_FILE=<direct link to the model .xml file>
    ARG IMAGE_FILE=<direct link to a test image>

    RUN apt-get update && apt-get -y --allow-unauthenticated upgrade
    RUN apt-get update && apt-get install -y --no-install-recommends \
       apt-utils \
       automake \
       cmake \
       cpio \
       gcc \
       g++ \
       libatlas-base-dev \
       libstdc++6 \
       libtool \
       libusb-1.0-0-dev \
       lsb-release \
       make \
       python3-pip \
       python3-numpy \
       python3-scipy \
       libgtk-3-0 \
       pkg-config \
       libavcodec-dev \
       libavformat-dev \
       libswscale-dev \
       sudo \
       udev \
       unzip \
       vim \
       git \
       wget && \
       rm -rf /var/lib/apt/lists/*
    RUN mkdir -p $INSTALL_DIR && cd $INSTALL_DIR && \
       wget -c $DOWNLOAD_LINK && \
       tar xf l_openvino_toolkit_debian9_arm*.tgz --strip 1 -C $INSTALL_DIR
    # add USB rules
    RUN sudo usermod -a -G users "$(whoami)" && \
       /bin/bash -c "source $INSTALL_DIR/setupvars.sh && \
       sh $INSTALL_DIR/install_dependencies/install_NCS_udev_rules.sh"
    # build Object Detection sample
    RUN echo "source /opt/intel/openvino/setupvars.sh" >> ~/.bashrc && \
       mkdir /root/Downloads && \
       /bin/bash -c "source $INSTALL_DIR/setupvars.sh && \
       cd $INSTALL_DIR/samples/c/ && \
       ./build_samples.sh && \
       cd $INSTALL_DIR/samples/cpp/ && \
       ./build_samples.sh && \
       wget --no-check-certificate $BIN_FILE -O /root/Downloads/person-vehicle-bike-detection-crossroad-0078.bin && \
       wget --no-check-certificate $WEIGHTS_FILE -O /root/Downloads/person-vehicle-bike-detection-crossroad-0078.xml && \
       wget --no-check-certificate $IMAGE_FILE -O /root/Downloads/walk.jpg"



    You will need to replace the direct link to the OpenVINO™ toolkit package in the DOWNLOAD_LINK variable in the template above with the link to the latest version. To copy the link of the OpenVINO™ toolkit for Raspbian* OS package (i.e. l_openvino_toolkit_debian9_arm_*_x86_64.tgz), select the latest version available, right-click the package URL, and press Copy link address. Set the BIN_FILE, WEIGHTS_FILE, and IMAGE_FILE variables to direct links for the test model files and test image in the same way.

  3. To build a Docker* image for Intel® Movidius™ Neural Compute Stick or Intel® Neural Compute Stick 2, run the following command:

    docker build . -t <image_name>

    (for example, docker build . -t openvino-rpi)

Running and testing a Docker* image

Known limitations:

  • The Intel® Neural Compute Stick 2 device changes its VendorID and DeviceID during execution, so each time it appears to the host system as a brand-new device. This means it cannot be mounted as usual.
  • UDEV events are not forwarded to the container by default, so the container is not aware of device reconnection.
  • Only one device is supported per host.

Running Benchmark App

The application works with models in the OpenVINO IR (model.xml and model.bin) and ONNX (model.onnx) formats. Make sure to convert your models if necessary.
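Because an IR model is a pair of files, a quick pre-flight check can confirm both parts are present before benchmarking. A minimal sketch, using the file names downloaded by the Dockerfile above:

```shell
# An OpenVINO IR model is a pair of files: topology (.xml) + weights (.bin).
model="/root/Downloads/person-vehicle-bike-detection-crossroad-0078"
missing=0
for ext in xml bin; do
  if [ -f "$model.$ext" ]; then
    echo "found: $model.$ext"
  else
    echo "missing: $model.$ext"
    missing=$((missing + 1))
  fi
done
echo "$missing file(s) missing"
```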

  1. Use the following command to run the image on an Intel® Neural Compute Stick 2. It runs the container in interactive and privileged mode, sets the Docker network configuration to host, and mounts all devices into the container:

    docker run -it --privileged -v /dev:/dev --network=host <image_name> /bin/bash

    (for example, docker run -it --privileged -v /dev:/dev --network=host openvino-rpi /bin/bash)

  2. Use the following commands to run the benchmark app. First, go to the samples build directory:

    cd /root/inference_engine_cpp_samples_build/armv7l/Release/

  3. To run benchmarking with default options, use the following command, specifying the model and the path to the input image:

    ./benchmark_app -m ~/Downloads/person-vehicle-bike-detection-crossroad-0078.xml -i ~/Downloads/walk.jpg -d MYRIAD
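If the benchmark app reports that no MYRIAD device is available, it may help to check from inside the container that the stick is visible on the USB bus. A minimal sketch, assuming lsusb is installed (usbutils package); 03e7 is the Movidius USB vendor ID:

```shell
# Look for an Intel Movidius device (USB vendor ID 03e7) on the bus.
if lsusb 2>/dev/null | grep -qi "03e7"; then
  msg="Movidius device visible to the container"
else
  msg="no Movidius (03e7) device found; check the USB connection"
fi
echo "$msg"
```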

This completes the installation procedure for the OpenVINO™ toolkit for Raspbian* from a Docker* image.


This applies to 2022.2 release of OpenVINO™ toolkit.


Related topics
Building Open Model Zoo Demos on Raspberry Pi*
Workflow for Raspberry Pi*
Other OpenVINO™ toolkit code samples
OpenVINO™ toolkit Open Model Zoo
Optimize Networks for the Intel® Neural Compute Stick (Intel® NCS2) Device
Community Forum and Technical Support