Build and Run a Sample Using the Command Line
Build and Run a Sample Project
PyTorch Hello World
This sample shows how to train a PyTorch model and run inference with the Intel® Deep Neural Network Library (Intel® DNNL) enabled.
Clone PyTorch_HelloWorld, then follow the directions in README.md to build and run the sample.
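The sample itself uses PyTorch, but the train-then-infer workflow it walks through can be sketched in plain Python. This is an illustration only, not the sample's actual code: it fits a linear model y = w*x + b by gradient descent, then runs inference with the trained parameters.

```python
# Illustrative sketch of the two phases the sample demonstrates:
# training a model, then running inference with it. Plain Python,
# no PyTorch; the real sample uses DNNL-accelerated PyTorch kernels.

def train(data, lr=0.01, epochs=500):
    """Fit y = w*x + b by gradient descent on mean squared error."""
    w, b = 0.0, 0.0
    n = len(data)
    for _ in range(epochs):
        grad_w = sum(2 * (w * x + b - y) * x for x, y in data) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in data) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def infer(w, b, x):
    """Inference: apply the trained parameters to a new input."""
    return w * x + b

# Synthetic data drawn from y = 2x + 1.
data = [(x, 2.0 * x + 1.0) for x in range(-5, 6)]
w, b = train(data)
print(round(infer(w, b, 3.0), 2))  # close to 2*3 + 1 = 7
```

In the actual sample, the same two phases appear as a PyTorch training loop and a forward pass, with the heavy tensor math dispatched to Intel DNNL primitives.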
TensorFlow Hello World
This sample shows how TensorFlow, optimized for Intel hardware, enables Intel® DNNL calls by default. It implements an example neural network with one convolution layer and one ReLU layer.
Clone TensorFlow_HelloWorld, then follow the directions in README.md to build and run the sample.
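The shape of the network the sample builds, one convolution layer followed by one ReLU layer, can be sketched without TensorFlow. This is an assumption-laden illustration of the layer structure only; the real sample uses TensorFlow ops, which dispatch to Intel DNNL primitives.

```python
# Sketch of the network described above: one convolution layer
# followed by one ReLU layer, in plain Python for clarity.

def conv1d(signal, kernel):
    """Valid-mode 1-D convolution (cross-correlation, as in TensorFlow)."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

def relu(xs):
    """Element-wise rectified linear unit: max(0, x)."""
    return [max(0.0, x) for x in xs]

# Convolution layer, then ReLU layer.
features = relu(conv1d([1.0, -2.0, 3.0, -4.0, 5.0], [1.0, 0.0, -1.0]))
print(features)  # → [0.0, 2.0, 0.0]
```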
Intel® Distribution of Modin* Getting Started
This Getting Started sample code shows how to perform distributed Pandas operations using the Modin* package.
To get the Intel® Distribution of Modin*, you must install the AI Kit using the Conda* package manager. After the AI Kit is installed, clone Intel® Distribution of Modin* Getting Started, then follow the directions in README.md to build and run the sample.
Model Zoo for Intel® Architecture
Model Zoo for Intel® Architecture is included in your installation of the Intel® oneAPI AI Analytics Toolkit, typically at
/opt/intel/oneapi/modelzoo/latest/models. Instructions for navigating the zoo, using the samples, and running the benchmarks are here: https://github.com/IntelAI/models/blob/v2.4.0/docs/general/tensorflow/AIKit.md#navigate-to-the-model-zoo
Intel® Neural Compressor
Intel® Neural Compressor is an open-source Python* library designed to help you quickly deploy low-precision inference solutions on popular deep-learning frameworks such as TensorFlow*, PyTorch*, MXNet*, and ONNX* (Open Neural Network Exchange) runtime.
Clone neural-compressor, then follow the directions in README.md to build and run the sample.
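To make the "low-precision inference" idea concrete, here is a minimal sketch, in plain Python, of post-training symmetric int8 quantization: the core technique that Intel® Neural Compressor automates (along with calibration, accuracy-aware tuning, and framework integration, none of which appear in this toy version).

```python
# Minimal sketch of the idea behind low-precision inference:
# quantize float weights to int8 with a shared symmetric scale,
# then dequantize to check the error. Illustration only; the
# library handles this per-tensor/per-channel with calibration.

def quantize_int8(values):
    """Map floats to int8 range [-127, 127] with one symmetric scale."""
    scale = max(abs(v) for v in values) / 127.0 or 1.0
    q = [max(-127, min(127, round(v / scale))) for v in values]
    return q, scale

def dequantize(q, scale):
    """Recover approximate floats from the int8 values."""
    return [v * scale for v in q]

weights = [0.31, -1.27, 0.05, 0.92]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(q, round(max_err, 4))
```

The payoff in a real deployment is that the int8 representation is 4x smaller than float32 and maps onto fast low-precision hardware instructions, at the cost of the small rounding error measured above.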
For more samples, browse the full GitHub repository.
Build Your Own Project
Set up the oneAPI environment, then activate the Conda* environment for the framework you want to build against:

. <install_dir>/setvars.sh
conda activate tensorflow
conda deactivate
conda activate pytorch
conda activate root