This document provides links to step-by-step instructions for running optimized, open-source deep learning training and inference workloads with the TensorFlow* framework on 4th Generation Intel® Xeon® Scalable processors, using Model Zoo docker containers.
Note: The containers below are based on a pre-production build of Intel® Optimization for TensorFlow*. They are provided for customer preview only and are not intended for production use.
The table below links to instructions for running each use case in a docker container. The model scripts run on Linux.
| Model | Mode | Dataset |
| ----- | ---- | ------- |
| ResNet 50v1.5 | Training | ImageNet 2012 |
| ResNet 50v1.5 | Inference | ImageNet 2012 |
| MobileNet V1* | Inference | ImageNet 2012 |
| 3D U-Net MLPerf* | Inference | BRATS 2019 |
| BERT large | Training | SQuAD and MRPC |
| Transformer_LT_mlperf* | Training | WMT English-German dataset |
| Transformer_LT_mlperf* | Inference | WMT English-German dataset |
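As a rough sketch of what the linked per-model instructions typically involve, the general workflow is to pull the workload image, mount the dataset and an output directory, and launch the model's quickstart script. The image name, environment variable names, and script path below are placeholders for illustration only; the exact values for each model are given in its linked instructions.

```shell
# Pull the workload container (placeholder image name, not a real tag):
docker pull <registry>/<workload-image>:<tag>

# Run the workload, mounting the prepared dataset and an output directory.
# DATASET_DIR/OUTPUT_DIR are assumed variable names; check the model's README.
docker run --rm \
  --env DATASET_DIR=/workspace/dataset \
  --env OUTPUT_DIR=/workspace/output \
  --volume /path/to/dataset:/workspace/dataset \
  --volume /path/to/output:/workspace/output \
  <registry>/<workload-image>:<tag> \
  /bin/bash quickstart/<script>.sh   # placeholder script path
```

The `--volume` mounts make the host dataset visible inside the container, and `--rm` removes the container after the run so only the mounted output directory persists.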
Documentation and Sources
Get Started Code Sources
LEGAL NOTICE: By accessing, downloading or using this software and any required dependent software (the “Software Package”), you agree to the terms and conditions of the software license agreements for the Software Package, which may also include notices, disclaimers, or license terms for third party software included with the Software Package. Please refer to the license file for additional details.