Intel® FPGA AI Suite: SoC Design Example User Guide

ID 768979
Date 9/06/2023
Public



6.3.3.2. Streaming System Inference Job Management

In a memory-to-memory (M2M) system, the host CPU pushes jobs into the Intel® FPGA AI Suite IP job queue by writing to the IP registers. In the streaming configuration, this task is offloaded to the Nios® system and must be coordinated with the writing of the input buffers.
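For illustration, the following minimal sketch contrasts the two submission paths from the host's point of view. The register offsets, the write_ip_register and mailbox_send helpers, and the job descriptor layout are hypothetical placeholders chosen for this sketch; they do not reflect the documented Intel® FPGA AI Suite register map or runtime API.

#include <stddef.h>
#include <stdint.h>

/* Hypothetical job descriptor; the real job parameters are defined by the
 * Intel FPGA AI Suite IP configuration. */
typedef struct {
    uint64_t input_addr;   /* input buffer address (used only in the M2M path) */
    uint64_t output_addr;  /* output buffer address */
} job_desc_t;

/* Placeholder low-level helpers, stubbed so the sketch compiles. */
static void write_ip_register(uint32_t offset, uint64_t value) { (void)offset; (void)value; }
static void mailbox_send(const void *msg, size_t len) { (void)msg; (void)len; }

/* Hypothetical register offsets for the IP job queue. */
enum { JOB_INPUT_ADDR_REG = 0x00, JOB_OUTPUT_ADDR_REG = 0x08, JOB_DOORBELL_REG = 0x10 };

/* M2M: the host CPU enqueues the job by writing the IP registers directly. */
void submit_job_m2m(const job_desc_t *job)
{
    write_ip_register(JOB_INPUT_ADDR_REG, job->input_addr);
    write_ip_register(JOB_OUTPUT_ADDR_REG, job->output_addr);
    write_ip_register(JOB_DOORBELL_REG, 1u);
}

/* Streaming: the host only posts the job to the mailbox; the Nios system
 * pushes it into the IP once the corresponding input data has streamed in. */
void submit_job_streaming(const job_desc_t *job)
{
    mailbox_send(job, sizeof(*job));
}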

In streaming mode, the host application pushes the job queue entry to the mailbox instead of managing the IP job queue directly. The Nios® processor receives the job queue entry from the mailbox. After an input buffer is written, the mSGDMA interrupts the Nios® processor, which then pushes one job into the Intel® FPGA AI Suite IP.

For every buffer stored by the mSGDMA, the Nios® processor attempts to start another job.
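This coordination can be summarized with a small firmware sketch, assuming interrupt-driven handling on the Nios® processor. The function and variable names (mailbox_rx_isr, msgdma_done_isr, coredla_ip_start_job, and the two counters) are hypothetical placeholders rather than actual Intel® FPGA AI Suite or Nios® HAL APIs.

#include <stdint.h>

/* Hypothetical placeholder for the register writes that enqueue one job
 * into the Intel FPGA AI Suite IP job queue. */
static void coredla_ip_start_job(void)
{
    /* MMIO writes to the IP job queue registers would go here. */
}

/* Counters shared between interrupt handlers. A real implementation would
 * guard them against concurrent access (for example, by masking interrupts). */
static volatile uint32_t pending_job_entries; /* job entries received from the host mailbox */
static volatile uint32_t buffers_ready;       /* input buffers completed by the mSGDMA */

/* Push at most one job whenever a mailbox job entry and a completed
 * input buffer are both available. */
static void try_start_job(void)
{
    if (pending_job_entries > 0 && buffers_ready > 0) {
        pending_job_entries--;
        buffers_ready--;
        coredla_ip_start_job();
    }
}

/* Called when the host pushes a job queue entry into the mailbox. */
void mailbox_rx_isr(void)
{
    pending_job_entries++;
    try_start_job();
}

/* Called when the mSGDMA finishes writing one input buffer. */
void msgdma_done_isr(void)
{
    buffers_ready++;
    try_start_job();   /* one completed buffer allows one more job to start */
}

Pairing each job start with a completed input buffer in this way keeps the IP from beginning inference on data that has not fully arrived, which is the coordination requirement described above.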

For more details about the Nios® V Stream Controller and the mailbox communication protocol, refer to Streaming-to-Memory (S2M) Streaming Demonstration.