Description
This document provides links to step-by-step instructions for using Model Zoo Docker containers to run optimized, open-source deep learning training and inference workloads with the TensorFlow* framework on 4th Generation Intel® Xeon® Scalable processors.
Note: The containers below are based on a pre-production build of Intel® Optimization for TensorFlow*. They are provided for customer preview only and are not intended for production use.
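A quick way to confirm which TensorFlow build is active inside a container is a short sanity check such as the one below. This is a minimal sketch: it assumes only that TensorFlow is importable, and the `TF_ENABLE_ONEDNN_OPTS` variable and the oneDNN log message are common conventions that may vary between builds.

```python
# check_tf_build.py: minimal sanity check for the TensorFlow build inside a container.
# Assumption: only that TensorFlow is importable. The TF_ENABLE_ONEDNN_OPTS variable
# and the oneDNN log line are conventions that can differ between builds.
import os
import tensorflow as tf

print("TensorFlow version:", tf.__version__)
print("TF_ENABLE_ONEDNN_OPTS:", os.environ.get("TF_ENABLE_ONEDNN_OPTS", "<unset>"))

# Run a small op; builds with oneDNN enabled typically log
# "oneDNN custom operations are on" at import or on first execution.
a = tf.random.uniform((1024, 1024))
b = tf.random.uniform((1024, 1024))
print("matmul OK, result shape:", tf.linalg.matmul(a, b).shape)
```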
Use cases
The tables below link to instructions for running each use case with Docker containers. The model scripts run on Linux.
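As a general pattern, each linked document has you pull a workload image and run its quickstart script with the dataset and output directories mounted into the container. The sketch below illustrates that pattern using the Docker SDK for Python; the image tag, script path, and environment variable names (`DATASET_DIR`, `OUTPUT_DIR`, `PRECISION`) are placeholders for illustration only, so use the exact values given in the per-model documentation.

```python
# run_workload.py: hypothetical sketch of launching a workload container with the
# Docker SDK for Python (pip install docker). The image tag, entrypoint script, and
# environment variable names are placeholders; take the real values from the
# per-model documentation linked in the tables below.
import docker

client = docker.from_env()

logs = client.containers.run(
    image="intel/image-recognition:tf-resnet50v1-5-inference",  # placeholder image tag
    command="/bin/bash quickstart/inference.sh",                # placeholder quickstart script
    environment={
        "DATASET_DIR": "/workspace/dataset",  # assumed convention for the dataset mount point
        "OUTPUT_DIR": "/workspace/output",    # assumed convention for logs and results
        "PRECISION": "fp32",                  # assumed convention for selecting precision
    },
    volumes={
        "/data/imagenet": {"bind": "/workspace/dataset", "mode": "ro"},
        "/tmp/output": {"bind": "/workspace/output", "mode": "rw"},
    },
    remove=True,
)
print(logs.decode("utf-8"))
```

An equivalent `docker run` command line with `--env` and `--volume` flags follows the same shape; the per-model documentation gives the exact invocation for each workload.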
Image Recognition
Model | Documentation Link | Dataset |
---|---|---|
ResNet 50v1.5 | Training | ImageNet 2012 |
ResNet 50v1.5 | Inference | ImageNet 2012 |
MobileNet V1* | Inference | ImageNet 2012 |
Image Segmentation
Model | Documentation Link | Dataset |
---|---|---|
3D U-Net MLPerf* | Inference | BRATS 2019 |
Object Detection
Model | Documentation Link | Dataset |
---|---|---|
SSD-ResNet34 | Training | COCO 2017 |
SSD-ResNet34 | Inference | COCO 2017 |
SSD-MobileNet* | Inference | COCO 2017 |
Language Modeling
Model | Documentation Link | Dataset |
---|---|---|
BERT large | Training | SQuAD and MRPC |
BERT large | Inference | SQuAD |
Language Translation
Model | Documentation Link | Dataset |
---|---|---|
Transformer_LT_mlperf* | Training | WMT English-German dataset |
Transformer_LT_mlperf* | Inference | WMT English-German dataset |
Recommendation
Model | Documentation Link | Dataset |
---|---|---|
DIEN | Training | DIEN dataset |
DIEN | Inference | DIEN dataset |
Documentation and Sources
Get Started Code Sources
License Agreement
LEGAL NOTICE: By accessing, downloading or using this software and any required dependent software (the “Software Package”), you agree to the terms and conditions of the software license agreements for the Software Package, which may also include notices, disclaimers, or license terms for third party software included with the Software Package. Please refer to the license file for additional details.