3.3. Design Example Software Components
The design examples contain a sample software stack for the runtime flow, which consists of the following components:
- OpenVINO™ Toolkit (Inference Engine, Heterogeneous Plugin)
- FPGA AI Suite runtime plugin
- Vendor-provided FPGA board driver or OPAE driver (depending on the design example)
Each design example contains the source files and Makefiles needed to build the FPGA AI Suite runtime plugin. The OpenVINO™ components (and OPAE components, where used) are external and must be installed manually beforehand.
A separate flow compiles the AI network graph with the FPGA AI Suite compiler. This flow is shown as the Compilation Software Stack in the Software Stacks for FPGA AI Suite Inference figure that follows.
The output of the compilation flow is a single binary file, CompiledNetwork.bin, that contains the compiled network partitions for the FPGA and CPU devices along with the network weights. The network is compiled for a specific FPGA AI Suite architecture and batch size. The binary file is created on disk only in the ahead-of-time (AOT) flow; in the just-in-time (JIT) flow, the compiled object remains in memory only.
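The difference between the two flows is visible from the runtime side. The following is a minimal sketch using the standard OpenVINO™ Inference Engine C++ API, not the design example's own code; the file names (CompiledNetwork.bin, model.xml, model.bin) and the use of ImportNetwork for the AOT binary are illustrative assumptions.

```cpp
#include <inference_engine.hpp>
#include <string>

int main() {
    InferenceEngine::Core core;
    const std::string device = "HETERO:FPGA,CPU";

    // Ahead-of-time (AOT) flow: the FPGA AI Suite compiler has already
    // written CompiledNetwork.bin to disk, so the runtime only imports it.
    // (File name and import path are illustrative assumptions.)
    InferenceEngine::ExecutableNetwork aotNetwork =
        core.ImportNetwork("CompiledNetwork.bin", device);

    // Just-in-time (JIT) flow: the runtime reads the OpenVINO IR and the
    // plugin compiles the graph at load time; the compiled object stays
    // in memory and nothing is written to disk.
    InferenceEngine::CNNNetwork network =
        core.ReadNetwork("model.xml", "model.bin");
    InferenceEngine::ExecutableNetwork jitNetwork =
        core.LoadNetwork(network, device);

    return 0;
}
```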
An Architecture File describes the FPGA AI Suite IP architecture to the compiler. You must specify the same Architecture File to the FPGA AI Suite compiler and to the FPGA AI Suite design example utility (dla_build_example_design.py).
The runtime flow accepts the CompiledNetwork.bin file as the input network along with the image data files.
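As a rough illustration of that runtime input path, the following hedged sketch feeds a single preprocessed image buffer into an already-loaded network using the OpenVINO™ Inference Engine C++ API. The helper name runSingleInference and the float-precision input are hypothetical; the design example provides its own runtime application for this step.

```cpp
#include <inference_engine.hpp>
#include <algorithm>
#include <string>
#include <vector>

// Hypothetical helper: push one preprocessed image into the network and read
// back the raw output scores. Assumes execNetwork was created as shown earlier
// and that the image has already been decoded, resized, and laid out to match
// the network's input shape and precision.
void runSingleInference(InferenceEngine::ExecutableNetwork& execNetwork,
                        const std::vector<float>& imageData) {
    InferenceEngine::InferRequest request = execNetwork.CreateInferRequest();

    // Copy the image into the input blob (a real application would also
    // handle batching and layout conversion).
    const std::string inputName = execNetwork.GetInputsInfo().begin()->first;
    InferenceEngine::Blob::Ptr inputBlob = request.GetBlob(inputName);
    auto minput = InferenceEngine::as<InferenceEngine::MemoryBlob>(inputBlob);
    auto mappedInput = minput->wmap();
    std::copy(imageData.begin(), imageData.end(), mappedInput.as<float*>());

    // Run synchronous inference on the device(s) selected at load time.
    request.Infer();

    // Read back the first output blob.
    const std::string outputName = execNetwork.GetOutputsInfo().begin()->first;
    InferenceEngine::Blob::Ptr outputBlob = request.GetBlob(outputName);
    auto moutput = InferenceEngine::as<InferenceEngine::MemoryBlob>(outputBlob);
    auto mappedOutput = moutput->rmap();
    const float* scores = mappedOutput.as<const float*>();
    (void)scores;  // Post-processing would go here.
}
```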
The runtime stack cannot program the FPGA with a bitstream. To build a bitstream and program the FPGA device, complete the following steps:
- Compile the design example.
- Program the device with the bitstream.
Instructions for these steps are provided in the sections for each design example.
To run inference through the OpenVINO™ Toolkit on the FPGA, set the OpenVINO™ device configuration flag (used by the Heterogeneous Plugin) to FPGA or HETERO:FPGA,CPU.
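With the Inference Engine C++ API, this device string is passed when the network is loaded. The following is a minimal sketch; the model file name is illustrative, and it assumes the FPGA AI Suite runtime plugin has registered the FPGA device name with OpenVINO™.

```cpp
#include <inference_engine.hpp>

int main() {
    InferenceEngine::Core core;
    InferenceEngine::CNNNetwork network = core.ReadNetwork("model.xml");

    // Target only the FPGA device exposed by the FPGA AI Suite runtime plugin.
    auto fpgaOnly = core.LoadNetwork(network, "FPGA");

    // Or use the Heterogeneous Plugin so that layers not supported by the
    // FPGA plugin fall back to the CPU plugin.
    auto fpgaWithFallback = core.LoadNetwork(network, "HETERO:FPGA,CPU");

    return 0;
}
```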