FPGA AI Suite Handbook

10.2. Interfacing the FPGA AI Suite IP in an FPGA Design for a Typical System

In a typical FPGA AI Suite IP system, a host processor (either a PCIe* host or an embedded host) serves as the control path to the IP and performs the following actions (see the sketch after this list):
  • Orchestrates the input
  • Sets up the FPGA AI Suite IP
  • Reads the output of the FPGA AI Suite IP
  • Executes the Inference Engine runtime software, which imports and loads the compiled network parameters, weights, and biases contained in the FPGA AI Suite runtime binary (.bin file)
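
The following minimal C sketch illustrates the shape of this host-side control-path sequence only. The register offsets, the csr_write()/csr_read() helpers, and the address values are assumptions made for illustration; they are not the actual FPGA AI Suite CSR map or runtime API. A small in-memory model stands in for the IP so that the sketch is self-contained.

/*
 * Illustrative host-side control-path sequence. All register offsets and
 * helper functions are hypothetical; consult the IP register map and the
 * Inference Engine runtime documentation for the real interface.
 */
#include <stdint.h>
#include <stdio.h>

/* Hypothetical register offsets. */
#define IP_CSR_GRAPH_BASE   0x10u  /* base address of compiled network (.bin contents) */
#define IP_CSR_OUTPUT_BASE  0x14u  /* base address of the output buffer                 */
#define IP_CSR_START        0x18u  /* write 1 to start an inference                     */
#define IP_CSR_STATUS       0x1Cu  /* bit 0 set by the IP when inference is complete    */

/* Stand-in for memory-mapped CSR access; a real system maps the IP CSRs. */
static uint32_t csr_space[0x40];

static void csr_write(uint32_t offset, uint32_t value)
{
    csr_space[offset / 4] = value;
    if (offset == IP_CSR_START && value == 1)
        csr_space[IP_CSR_STATUS / 4] = 1;  /* the real IP sets this when done */
}

static uint32_t csr_read(uint32_t offset)
{
    return csr_space[offset / 4];
}

/* Minimal control-path flow: set up the IP, start inference, wait, read out. */
int main(void)
{
    uint32_t graph_base  = 0x80000000u;  /* assumed: .bin contents already loaded here */
    uint32_t output_base = 0x90000000u;  /* assumed: buffer reserved for results       */

    csr_write(IP_CSR_GRAPH_BASE, graph_base);    /* set up the FPGA AI Suite IP  */
    csr_write(IP_CSR_OUTPUT_BASE, output_base);  /* tell it where to put results */
    csr_write(IP_CSR_START, 1);                  /* start one inference          */

    while ((csr_read(IP_CSR_STATUS) & 1u) == 0)
        ;  /* poll for completion; a real driver would use an interrupt */

    printf("inference complete; host reads output at 0x%08x\n", output_base);
    return 0;
}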
Typically, a memory (either an external DDR interface or on-chip memory) stores or buffers the input vectors for the FPGA AI Suite IP. The FPGA AI Suite IP presents a DMA interface that the control-path processor can program to manage and orchestrate the inference process at run time.
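
As a rough illustration of this run-time orchestration, the sketch below shows one way a control-path processor might program a DMA transfer for a buffered input vector. The descriptor layout, register offsets, and helper functions are assumptions made for the example and do not describe the actual FPGA AI Suite DMA interface.

/*
 * Illustrative DMA programming sketch. The csr_write64()/csr_write32()
 * functions stand in for memory-mapped writes and simply log what a real
 * driver would write to the (hypothetical) DMA registers.
 */
#include <stdint.h>
#include <stdio.h>

/* Hypothetical DMA register offsets. */
#define DMA_CSR_SRC  0x100u  /* source: input buffer in DDR or on-chip memory */
#define DMA_CSR_DST  0x108u  /* destination: staging area read by the IP      */
#define DMA_CSR_LEN  0x110u  /* transfer length in bytes                      */
#define DMA_CSR_GO   0x114u  /* write 1 to start the transfer                 */

/* Assumed descriptor for one input vector. */
struct dma_descriptor {
    uint64_t src_addr;
    uint64_t dst_addr;
    uint32_t length;
};

static void csr_write64(uint32_t offset, uint64_t value)
{
    printf("CSR[0x%03x] <= 0x%016llx\n", offset, (unsigned long long)value);
}

static void csr_write32(uint32_t offset, uint32_t value)
{
    printf("CSR[0x%03x] <= 0x%08x\n", offset, value);
}

/* Queue one buffered input vector by programming the DMA engine. */
static void queue_input(const struct dma_descriptor *d)
{
    csr_write64(DMA_CSR_SRC, d->src_addr);
    csr_write64(DMA_CSR_DST, d->dst_addr);
    csr_write32(DMA_CSR_LEN, d->length);
    csr_write32(DMA_CSR_GO, 1);
}

int main(void)
{
    /* Example: a 224x224x3 8-bit input vector buffered in DDR (assumed addresses). */
    struct dma_descriptor d = { 0x80100000ull, 0x00040000ull, 224 * 224 * 3 };
    queue_input(&d);
    return 0;
}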
Figure 26.  FPGA AI Suite System Architecture