Wide & Deep FP32 Inference TensorFlow* Model Package

ID 672120
Updated 6/15/2022
Version Latest
Public

Download Command

wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_3_0/wide-deep-fp32-inference.tar.gz
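
After the download completes, you can optionally list the archive contents before extracting; a quick check using standard tar flags:

# Peek at the first few entries without extracting the archive
tar -tzf wide-deep-fp32-inference.tar.gz | head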

Description

This document has instructions to run Wide & Deep FP32 inference using Intel® Optimizations for TensorFlow*.

Download and preprocess the income census data by running the following Python* script, which is a standalone version of census_dataset.py. Note that the script requires the requests module; you can install it with pip install requests. The dataset is downloaded into the directory given by the --data_dir argument. If you are behind a proxy, pass proxy URLs using the --http_proxy and --https_proxy arguments (see the example after the commands below).

git clone https://github.com/IntelAI/models.git
cd models
python ./benchmarks/recommendation/tensorflow/wide_deep/inference/fp32/data_download.py --data_dir /home/<user>/widedeep_dataset
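
For example, a proxied download might look like the following; the proxy host and port are placeholders (assumptions, not values from this document):

# Hypothetical proxy endpoints -- replace with your own
python ./benchmarks/recommendation/tensorflow/wide_deep/inference/fp32/data_download.py \
    --data_dir /home/<user>/widedeep_dataset \
    --http_proxy http://proxy.example.com:8080 \
    --https_proxy http://proxy.example.com:8080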

Quick Start Scripts

Script name              Description
fp32_inference_online    Runs Wide & Deep model inference in online mode (batch size = 1)
fp32_inference_batch     Runs Wide & Deep model inference in batch mode (batch size = 1024)

Bare Metal

To run on bare metal, set up your environment using the following steps:

  1. Download and untar the Wide & Deep FP32 inference model package:

    wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_3_0/wide-deep-fp32-inference.tar.gz
    tar -xvf wide-deep-fp32-inference.tar.gz
    
  2. Clone tensorflow/models as tensorflow-models:

    # We use a branch based on an older version of the tensorflow/models repo,
    # since we need the logging utilities on that branch, which were removed
    # from the latest master.
    git clone https://github.com/tensorflow/models.git tensorflow-models
    cd tensorflow-models
    git fetch origin pull/7461/head:wide-deep-tf2  
    git checkout wide-deep-tf2 
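
    # Optional sanity check (not in the original steps): confirm the checkout
    # succeeded before proceeding (git branch --show-current needs git 2.22+)
    git branch --show-current   # should print: wide-deep-tf2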
    
  3. Once your environment is set up, navigate back to the directory that contains the Wide & Deep FP32 inference model package, set the environment variables pointing to your dataset and output directories, and then run a quickstart script.

    DATASET_DIR=<path to the Wide & Deep dataset directory>
    OUTPUT_DIR=<directory where log files will be written>
    TF_MODEL_SOURCE_DIR=<path to tensorflow-models>
    
    quickstart/<script name>.sh
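
     For example, a complete batch-mode run might look like the following; the paths are placeholders (assumptions, not shipped defaults), and the variables are exported so the quickstart script can read them:

    # Example values only -- substitute your real paths
    export DATASET_DIR=/home/<user>/widedeep_dataset
    export OUTPUT_DIR=/home/<user>/widedeep_logs
    export TF_MODEL_SOURCE_DIR=/home/<user>/tensorflow-models

    quickstart/fp32_inference_batch.sh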

Documentation and Sources

Get Started
Main GitHub*
Readme
Release Notes
Get Started Guide

Code Sources
Report Issue

License Agreement

LEGAL NOTICE: By accessing, downloading or using this software and any required dependent software (the “Software Package”), you agree to the terms and conditions of the software license agreements for the Software Package, which may also include notices, disclaimers, or license terms for third party software included with the Software Package. Please refer to the license file for additional details.


Related Containers and Solutions

Wide & Deep FP32 Inference TensorFlow* Container

View All Containers and Solutions