1. Using the FPGA AI Suite Docker Image Overview
| Updated for: |
| --- |
| Intel® FPGA AI Suite 2025.1 |
This application note is intended for AI and FPGA developers who want to install and run the FPGA AI Suite Docker* image with a Docker* client running on a Microsoft* Windows* system. The containerized FPGA AI Suite enables quick and easy access to the various tools in the FPGA AI Suite.
This document also includes a quick-start tutorial that results in a ResNet-50 network running inference on an SoC FPGA device. The tutorial covers compiling an architecture with the FPGA AI Suite compiler, generating an IP file, programming an SoC FPGA device, and running an example inference program on the device.
The quick-start tutorial included in this application note demonstrates the ease of use and ease of deployment of a deep learning model to the edge. The tutorial presents the Memory-to-Memory (M2M) execution model, which provides a dla_benchmark interface to the inference engine, similar to the PCIe-based design examples. It also introduces the Streaming-to-Memory (S2M) execution model. The S2M model uses the Arm* CPU on the SoC FPGA as the streaming data source, streaming data to a layout transform on the FPGA device. The S2M example design illustrates a recommended system architecture for any streaming source.
About the FPGA AI Suite Documentation Library
| Title and Description | Link |
| --- | --- |
| Release Notes: Provides late-breaking information about the FPGA AI Suite, including new features, important bug fixes, and known issues. | Link |
| Getting Started Guide: Get up and running with the FPGA AI Suite by learning how to initialize your compiler environment and reviewing the various design examples and tutorials provided with the FPGA AI Suite. | Link |
| AN 1008: Using the FPGA AI Suite Docker* Image: Describes how to install and run the FPGA AI Suite Docker* image with a Docker* client running on a Microsoft* Windows* system. The containerized FPGA AI Suite enables quick and easy access to the various tools in the FPGA AI Suite. | Link |
| IP Reference Manual: Provides an overview of the FPGA AI Suite IP and the parameters you can set to customize it. This document also covers the FPGA AI Suite IP generation utility. | Link |
| Compiler Reference Manual: Describes the use modes of the graph compiler (dla_compiler). It also provides details about the compiler command options and the format of compilation inputs and outputs. | Link |
| Design Examples User Guide: Describes the design and implementation for accelerating AI inference using the FPGA AI Suite, Intel® Distribution of OpenVINO™ toolkit, and various development boards (depending on the design example). | Link |