FPGA AI Suite: Design Examples User Guide

ID 848957
Date 4/22/2025
Public



20.2. [SOC] Compiling Exported Graphs Through the FPGA AI Suite

The network, as described by the .xml and .bin files that the Model Optimizer creates, is compiled for a specific FPGA AI Suite architecture (.arch) file by using the FPGA AI Suite compiler.

The FPGA AI Suite compiler compiles the network and exports it to a .bin file with the format required by the OpenVINO™ Inference Engine. For instructions on how to compile the .xml and .bin files into an AOT file suitable for use with the FPGA AI Suite IP, refer to [SOC] Compiling the Graphs.
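As a rough sketch, a compilation command might resemble the following. The model file names, output path, and the `$COREDLA_ROOT` environment variable are placeholders, and the exact set of compiler flags depends on your FPGA AI Suite release; consult the FPGA AI Suite Compiler Reference Manual for the authoritative option list.

```shell
# Sketch: compile Model Optimizer output (.xml/.bin) against a target
# architecture file and emit a .bin in the format the OpenVINO Inference
# Engine expects. All file names below are illustrative placeholders.
dla_compiler \
    --march "$COREDLA_ROOT/example_architectures/AGX7_Performance.arch" \
    --network-file resnet50.xml \
    --foutput-format=open_vino_hetero \
    --o resnet50_compiled.bin
```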

The .bin file created by the compiler contains the compiled network parameters for all target devices (FPGA, CPU, or both), along with the weights and biases. The inference application imports this file at runtime.

The FPGA AI Suite compiler can also compile the graph to provide estimated area or performance metrics for a given architecture file, or to produce an optimized architecture file.
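The estimation flows can be driven from the same command line. The sketch below assumes analysis flags of the form shown; flag names and availability may differ between FPGA AI Suite releases, so verify them against the FPGA AI Suite Compiler Reference Manual before use.

```shell
# Sketch: estimate performance and area for a graph compiled against a
# given architecture file, without programming a device. The model name
# and $COREDLA_ROOT path are illustrative placeholders.
dla_compiler \
    --march "$COREDLA_ROOT/example_architectures/A10_Performance.arch" \
    --network-file resnet50.xml \
    --fanalyze-performance \
    --fanalyze-area
```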

For the demonstration SD card, the FPGA bitstream was built with one of the following IP architecture configuration files. Use the architecture file that matches your development kit when compiling the OpenVINO™ model:
  • Agilex™ 7 FPGA I-Series Transceiver-SoC Development Kit
    AGX7_Performance.arch
  • Arria® 10 SX SoC FPGA Development Kit
    A10_Performance.arch

For more details about the FPGA AI Suite compiler, refer to the FPGA AI Suite Compiler Reference Manual.