Wide & Deep FP32 Inference TensorFlow* Container

ID 679272
Updated 6/15/2022
Version Latest
Public

Pull Command

docker pull intel/recommendation:tf-latest-wide-deep-fp32-inference

Description

This document has instructions for running Wide & Deep FP32 inference using Intel® Optimizations for TensorFlow*.

Download and preprocess the income census dataset using the following Python* script, which is a standalone version of census_dataset.py. Note that the script requires the requests module, which you can install with pip install requests. The dataset will be downloaded to the directory specified with --data_dir. If you are behind a proxy, you can pass the proxy URLs using the --http_proxy and --https_proxy arguments, as shown in the example after the commands below.

git clone https://github.com/IntelAI/models.git
cd models
python ./benchmarks/recommendation/tensorflow/wide_deep/inference/fp32/data_download.py --data_dir /home/<user>/widedeep_dataset
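
For example, to run the same download through a proxy, pass the proxy URLs on the command line (the proxy host and port below are placeholders for your environment):

python ./benchmarks/recommendation/tensorflow/wide_deep/inference/fp32/data_download.py \
  --data_dir /home/<user>/widedeep_dataset \
  --http_proxy http://proxy.example.com:8080 \
  --https_proxy https://proxy.example.com:8080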

Quick Start Scripts

Script name            Description
fp32_inference_online  Runs wide & deep model inference in online mode (batch size = 1)
fp32_inference_batch   Runs wide & deep model inference in batch mode (batch size = 1024)

Docker*

When running in Docker, the Wide & Deep FP32 inference container includes the model package and the TensorFlow models source repo, which are needed to run inference. To run the quickstart scripts, provide volume mounts for the dataset directory and an output directory where log files will be written.

DATASET_DIR=<path to the Wide & Deep dataset directory>
OUTPUT_DIR=<directory where log files will be written>

docker run \
--env DATASET_DIR=${DATASET_DIR} \
--env OUTPUT_DIR=${OUTPUT_DIR} \
--env http_proxy=${http_proxy} \
--env https_proxy=${https_proxy} \
--volume ${DATASET_DIR}:${DATASET_DIR} \
--volume ${OUTPUT_DIR}:${OUTPUT_DIR} \
--privileged --init -t \
intel/recommendation:tf-latest-wide-deep-fp32-inference \
/bin/bash quickstart/<script name>.sh
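
As a concrete example, the following runs the online inference quickstart script (batch size = 1), assuming the dataset was downloaded to /home/<user>/widedeep_dataset and using a placeholder output directory:

DATASET_DIR=/home/<user>/widedeep_dataset
OUTPUT_DIR=/home/<user>/widedeep_output

docker run \
--env DATASET_DIR=${DATASET_DIR} \
--env OUTPUT_DIR=${OUTPUT_DIR} \
--env http_proxy=${http_proxy} \
--env https_proxy=${https_proxy} \
--volume ${DATASET_DIR}:${DATASET_DIR} \
--volume ${OUTPUT_DIR}:${OUTPUT_DIR} \
--privileged --init -t \
intel/recommendation:tf-latest-wide-deep-fp32-inference \
/bin/bash quickstart/fp32_inference_online.sh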

Documentation and Sources

Get Started
Docker* Repository
Main GitHub*
Readme
Release Notes
Get Started Guide

Code Sources
Dockerfile
Report Issue

License Agreement

LEGAL NOTICE: By accessing, downloading or using this software and any required dependent software (the “Software Package”), you agree to the terms and conditions of the software license agreements for the Software Package, which may also include notices, disclaimers, or license terms for third party software included with the Software Package. Please refer to the license file for additional details.


Related Containers and Solutions

Wide & Deep FP32 Inference TensorFlow* Model Package
