This paper describes how to build and install TensorFlow* Serving, a high-performance serving system for machine learning models, designed for production environments.
The installation guidelines in this document are distilled from information available on the TensorFlow Serving GitHub website. The steps below give a quick overview of the installation process; however, because third-party information is subject to change over time, it is recommended that you also review the information provided on the TensorFlow Serving website.
Important: The step-by-step guidelines provided below assume the reader has already completed the Intel® Optimization for TensorFlow* Installation Guide, which includes the steps to install the Bazel* build tool and some of the other required dependencies not covered here.
Begin by installing the Google Protocol RPC* library (gRPC*), a framework for implementing remote procedure call (RPC) services.
sudo pip install grpcio
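As a quick sanity check (not part of the official instructions), you can confirm that the gRPC module imports cleanly after installation:

```shell
# Verify that the grpcio package is importable and print its version.
python -c "import grpc; print(grpc.__version__)"
```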
Next, ensure the other TensorFlow Serving dependencies are installed by issuing the following command:
sudo apt-get update && sudo apt-get install -y \
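The package list following the line-continuation character above was lost from this document. For reference, the TensorFlow Serving setup guide of the same era listed dependencies along these lines; treat this list as an assumption and verify it against the current guide, since package names change between Ubuntu releases:

```shell
# Dependency list as given in the TensorFlow Serving setup guide of this
# period; some package names (e.g., libpng12-dev, python-pip) differ on
# newer distributions.
sudo apt-get update && sudo apt-get install -y \
        build-essential \
        curl \
        libcurl3-dev \
        git \
        libfreetype6-dev \
        libpng12-dev \
        libzmq3-dev \
        pkg-config \
        python-dev \
        python-numpy \
        python-pip \
        software-properties-common \
        swig \
        zip \
        zlib1g-dev
```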
Installing TensorFlow* Serving
Clone TensorFlow Serving from the GitHub repository by issuing the following command:
git clone --recurse-submodules https://github.com/tensorflow/serving
The serving/tensorflow directory created during the cloning process contains a script named “configure” that must be executed to set installation paths, dependencies, and other build configuration options. For TensorFlow optimized on Intel architecture, this script also lets you set up Intel® Math Kernel Library (Intel® MKL) related environment settings. Issue the following commands:
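The exact commands were not preserved in this document; a typical invocation, based on the TensorFlow Serving setup instructions of the time (verify the paths against your clone), looks like this:

```shell
# Change into the bundled TensorFlow source tree, run its interactive
# configure script, then return to the serving root before building.
cd serving/tensorflow
./configure
cd ..
```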
Important: Select ‘Y’ to build TensorFlow with Intel MKL support, and ‘Y’ to download the Intel MKL libraries from the web. Select the default settings for the other configuration parameters.
After configuration completes, build TensorFlow Serving with Intel MKL optimizations enabled:
bazel build --config=mkl --copt="-DEIGEN_USE_VML" tensorflow_serving/...
Testing the Installation
Test the TensorFlow Serving installation by issuing the following command:
bazel test tensorflow_serving/...
If the installation succeeded, you should see results similar to those shown in Figure 1.
Figure 1. TensorFlow Serving installation test results.
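The full test suite can take a long time to run. If you only want to spot-check a single component, Bazel lets you name an individual target; the target below is illustrative, so first list the available ones with `bazel query`:

```shell
# List the test targets in the repository, then run one of them.
# The basic_manager_test target name is an assumption based on the
# TensorFlow Serving source layout; substitute a target from the query output.
bazel query 'tests(tensorflow_serving/...)'
bazel test tensorflow_serving/core:basic_manager_test
```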
The next article describes how to train and save a TensorFlow model, host the model in TensorFlow Serving, and use the model for inference in a client-side application.
Product and Performance Information
Performance varies by use, configuration and other factors. Learn more at www.Intel.com/PerformanceIndex.