3D U-Net FP32 Inference for TensorFlow* Model Package

Published: 02/09/2021  

Last Updated: 06/15/2022

Download Command

wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_3_0/3d-unet-fp32-inference.tar.gz


This document has instructions for running 3D U-Net FP32 inference using Intel® Optimizations for TensorFlow*.

Follow the instructions at the 3D U-Net repository for downloading and preprocessing the BraTS dataset. The directory that contains the preprocessed dataset files will be passed to the launch script when running the benchmarking script.


The following instructions are based on BraTS2018 dataset preprocessing steps in the 3D U-Net repository.

  1. Download the BraTS 2018 dataset. Follow the steps to register and request the training and validation data of the BraTS 2018 challenge.

  2. Create a virtual environment and install the dependencies:

    # create a python3.6 based venv
    virtualenv --python=python3.6 brats18_env
    . brats18_env/bin/activate
    # install dependencies
    pip install intel-tensorflow==1.15.2
    pip install SimpleITK==1.2.0
    pip install keras==2.2.4
    pip install nilearn==0.6.2
    pip install tables==3.4.4
    pip install nibabel==2.3.3
    pip install nipype==1.7.0
    pip install numpy==1.16.3

    Install ANTs N4BiasFieldCorrection and add the location of the ANTs binaries to the PATH environment variable:

    wget https://github.com/ANTsX/ANTs/releases/download/v2.1.0/Linux_Debian_jessie_x64.tar.bz2
    tar xvjf Linux_Debian_jessie_x64.tar.bz2
    cd debian_jessie
    export PATH=${PATH}:$(pwd)
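
    Before moving on, it is worth confirming that the preprocessing will actually find the ANTs binary, since it is invoked by name (via nipype's ANTs interface). A minimal sketch of such a check:

    ```shell
    # Fail-soft sanity check: report whether N4BiasFieldCorrection is on PATH.
    if command -v N4BiasFieldCorrection >/dev/null 2>&1; then
        echo "ANTs OK: $(command -v N4BiasFieldCorrection)"
    else
        echo "WARNING: N4BiasFieldCorrection not found on PATH" >&2
    fi
    ```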
  3. Clone the 3D U-Net repository, and run the script for the dataset preprocessing:

    git clone https://github.com/ellisdg/3DUnetCNN.git
    cd 3DUnetCNN
    git checkout update_to_brats18
    # add the repository directory to the PYTHONPATH system variable
    export PYTHONPATH=${PYTHONPATH}:$(pwd)

    After downloading the dataset file MICCAI_BraTS_2018_Data_Training.zip (from step 1), place the unzipped folders in the brats/data/original directory.

    # extract the dataset
    mkdir -p brats/data/original && cd brats
    unzip MICCAI_BraTS_2018_Data_Training.zip -d data/original
    # import the conversion function and run the preprocessing in a Python shell:
    python
    >>> from preprocess import convert_brats_data
    >>> convert_brats_data("data/original", "data/preprocessed")
    # exit the Python shell, then run training with the original U-Net model
    # to get `validation_ids.pkl` created in the `brats` directory:
    python train.py

After training finishes, set an environment variable to the path that contains the preprocessed dataset file validation_ids.pkl:

export DATASET_DIR=/home/<user>/3DUnetCNN/brats
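
A quick way to confirm the preprocessing produced what the quickstart script expects is to check for validation_ids.pkl under the dataset directory. A minimal sketch (the helper name is ours, not part of the model package):

```python
import os

def check_dataset_dir(dataset_dir):
    """Return True if the preprocessed BraTS directory contains validation_ids.pkl."""
    return os.path.isfile(os.path.join(dataset_dir, "validation_ids.pkl"))

# example: validate whatever DATASET_DIR currently points at
dataset_dir = os.environ.get("DATASET_DIR", "")
print(check_dataset_dir(dataset_dir))
```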

Quick Start Scripts

Script name    | Description
fp32_inference | Runs inference with a batch size of 1 using the BraTS dataset and a pretrained model

Bare Metal

To run on bare metal, the following prerequisites must be installed in your environment:

  • Python 3
  • intel-tensorflow==1.15.2
  • numactl
  • Keras==2.2.4
  • numpy==1.16.1
  • nilearn==0.6.2
  • tables==3.4.4
  • nibabel==2.3.3
  • SimpleITK==1.2.0

Follow the instructions above for downloading the BraTS dataset.

  1. Download the pretrained model from the 3DUnetCNN repo. In this example, we are using the "Original U-Net" model, trained using the BraTS 2017 data.

  2. Download and untar the model package. Set the DATASET_DIR, PRETRAINED_MODEL, and OUTPUT_DIR (where log files will be written) environment variables, and then run the quickstart script.

    wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v2_3_0/3d-unet-fp32-inference.tar.gz
    tar -xzf 3d-unet-fp32-inference.tar.gz
    cd 3d-unet-fp32-inference
    export DATASET_DIR=<path to the BraTS dataset>
    export PRETRAINED_MODEL=<Path to the downloaded tumor_segmentation_model.h5 file>
    export OUTPUT_DIR=<directory where log files will be written>
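
    With the variables set, the quickstart script from the table above can be invoked. A sketch, assuming the extracted package keeps its quickstart scripts under quickstart/ (the usual model-package layout):

    ```shell
    # Run FP32 inference; guard so this is a no-op outside the extracted package.
    if [ -x ./quickstart/fp32_inference.sh ]; then
        ./quickstart/fp32_inference.sh
    else
        echo "run this from the extracted 3d-unet-fp32-inference directory" >&2
    fi
    ```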

Documentation and Sources

Get Started

  • Main GitHub
  • Release Notes
  • Get Started Guide

Code Sources

  • Report Issue

License Agreement

LEGAL NOTICE: By accessing, downloading or using this software and any required dependent software (the “Software Package”), you agree to the terms and conditions of the software license agreements for the Software Package, which may also include notices, disclaimers, or license terms for third party software included with the Software Package. Please refer to the license file for additional details.

Related Containers and Solutions

3D UNet FP32 Inference TensorFlow* Container


Product and Performance Information


Performance varies by use, configuration and other factors. Learn more at www.Intel.com/PerformanceIndex.