AN 993: Using Custom Models with Intel® FPGA AI Suite

ID 777190
Date 5/01/2023
Public

1. Introduction

This application note outlines how to take a custom or unsupported model from a supported framework and use it with the OpenVINO™ toolkit and the Intel® FPGA AI Suite. The document briefly covers supported frameworks, supported layers, and common issues encountered when using a custom or unsupported model.

This document also provides step-by-step examples based on two different models. The first example uses ResNet18 with its last fully connected layer removed. The second example shows the addition of supported layers to an MLP (multilayer perceptron) model. A brief sketch of the first modification follows.
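
The following is a minimal sketch of the ResNet18 modification, assuming PyTorch and torchvision are available and that the model is exported to ONNX before conversion with the OpenVINO Model Optimizer; the file name and export settings are illustrative only, and the exact steps are covered in the step-by-step sections of this document.

# Minimal sketch (assumes PyTorch and torchvision are installed).
# Loads ResNet18, replaces its final fully connected layer with an
# identity op, and exports the truncated model to ONNX so that it can
# be converted by the OpenVINO Model Optimizer.
# The output file name "resnet18_no_fc.onnx" is illustrative.
import torch
import torch.nn as nn
from torchvision.models import resnet18

model = resnet18()            # pretrained weights are optional for this sketch
model.fc = nn.Identity()      # remove the last fully connected layer
model.eval()

dummy_input = torch.randn(1, 3, 224, 224)
torch.onnx.export(
    model,
    dummy_input,
    "resnet18_no_fc.onnx",
    input_names=["input"],
    output_names=["output"],
)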

About the Intel® FPGA AI Suite Documentation Library

Documentation for the Intel® FPGA AI Suite is split across a few publications. Use the following table to find the publication that contains the Intel® FPGA AI Suite information that you are looking for:
Table 1.   Intel® FPGA AI Suite Documentation Library
Title and Description

Release Notes
Provides late-breaking information about the Intel® FPGA AI Suite, including new features, important bug fixes, and known issues.

Getting Started Guide
Get up and running with the Intel® FPGA AI Suite by learning how to initialize your compiler environment and reviewing the various design examples and tutorials provided with the Intel® FPGA AI Suite.

IP Reference Manual
Provides an overview of the Intel® FPGA AI Suite IP and the parameters you can set to customize it. This document also covers the Intel® FPGA AI Suite IP generation utility.

Compiler Reference Manual
Describes the use modes of the graph compiler (dla_compiler). It also provides details about the compiler command options and the format of compilation inputs and outputs.

PCIe-based Design Example User Guide
Describes the design and implementation for accelerating AI inference using the Intel® FPGA AI Suite, Intel® Distribution of OpenVINO™ Toolkit, and an Intel® PAC with Intel® Arria® 10 GX FPGA or a Terasic* DE10-Agilex Development Board.

SoC-based Design Example User Guide
Describes the design and implementation for accelerating AI inference using the Intel® FPGA AI Suite, Intel® Distribution of OpenVINO™ Toolkit, and an Intel® Arria® 10 SX SoC FPGA Development Kit.