Pull Command
```
docker pull intel/image-segmentation:tf-latest-3d-unet-fp32-inference
```
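Not part of the original instructions, but as a quick sanity check you can confirm the image is available locally once the pull completes:

```
# list local images for this repository; the tag should appear after the pull
docker images intel/image-segmentation
```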
Description
This document has instructions for running 3D U-Net FP32 inference using Intel® Optimizations for TensorFlow*.
Dataset
The following instructions are based on BraTS2018 dataset preprocessing steps in the 3D U-Net repository.
- Download the BraTS2018 dataset. Please follow the steps to register and request the training and the validation data of the BraTS 2018 challenge.
- Create a virtual environment and install the dependencies:

```
# create a python3.6 based venv
virtualenv --python=python3.6 brats18_env
. brats18_env/bin/activate

# install dependencies
pip install intel-tensorflow==1.15.2
pip install SimpleITK===1.2.0
pip install keras==2.2.4
pip install nilearn==0.6.2
pip install tables==3.4.4
pip install nibabel==2.3.3
pip install nipype==1.7.0
pip install numpy==1.16.3
```
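As an optional sanity check (not in the original steps), confirm that the pinned TensorFlow and Keras versions resolve inside the virtual environment:

```
# print the installed framework versions; expect 1.15.2 and 2.2.4
python -c "import tensorflow as tf; print(tf.__version__)"
python -c "import keras; print(keras.__version__)"
```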
- Install ANTs N4BiasFieldCorrection and add the location of the ANTs binaries to the PATH environment variable:

```
wget https://github.com/ANTsX/ANTs/releases/download/v2.1.0/Linux_Debian_jessie_x64.tar.bz2
tar xvjf Linux_Debian_jessie_x64.tar.bz2
cd debian_jessie
export PATH=${PATH}:$(pwd)
```
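To verify the binaries are reachable on the PATH (an optional check, not in the original steps):

```
# confirm the shell can resolve the ANTs tool used during preprocessing
which N4BiasFieldCorrection
```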
- Clone the 3D U-Net repository, and run the script for the dataset preprocessing:

```
git clone https://github.com/ellisdg/3DUnetCNN.git
cd 3DUnetCNN
git checkout update_to_brats18

# add the repository directory to the PYTHONPATH system variable
export PYTHONPATH=${PWD}:$PYTHONPATH
```
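You can optionally confirm the repository is importable from the current environment. This sketch assumes the repo's top-level `unet3d` package, which the `brats` scripts import:

```
# exits silently if PYTHONPATH is set correctly
python -c "import unet3d"
```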
After downloading the dataset file MICCAI_BraTS_2018_Data_Training.zip (from step 1), place the unzipped folders in the brats/data/original directory.

```
# extract the dataset
mkdir -p brats/data/original && cd brats
unzip MICCAI_BraTS_2018_Data_Training.zip -d data/original

# import the conversion function and run the preprocessing:
python
>>> from preprocess import convert_brats_data
>>> convert_brats_data("data/original", "data/preprocessed")

# run training using the original UNet model to get `validation_ids.pkl` created in the `brats` directory
python train.py
```
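If you prefer to avoid the interactive Python session shown above, the same conversion can be run non-interactively (a sketch using the same `convert_brats_data` function and paths):

```
# equivalent one-liner for the preprocessing step
python -c 'from preprocess import convert_brats_data; convert_brats_data("data/original", "data/preprocessed")'
```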
After training finishes, set an environment variable to the path that contains the preprocessed dataset file validation_ids.pkl:

```
export DATASET_DIR=/home/<user>/3DUnetCNN/brats
```
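Before moving on, you may want to check that the expected file is actually in place (optional, not part of the original steps):

```
# the quickstart script expects validation_ids.pkl under DATASET_DIR
ls ${DATASET_DIR}/validation_ids.pkl
```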
Quick Start Scripts
| Script name | Description |
|---|---|
| `fp32_inference` | Runs inference with a batch size of 1 using the BraTS dataset and a pretrained model |
Docker
The model container includes the scripts and libraries needed to run 3D U-Net FP32 inference. Prior to running the model in Docker, follow the instructions above for downloading the BraTS dataset.
- Download the pretrained model from the 3DUnetCNN repo. In this example, we are using the "Original U-Net" model, trained using the BraTS 2017 data.
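A hypothetical example of staging the downloaded file (the directory name and download location here are arbitrary choices; the actual download link is in the 3DUnetCNN repo):

```
# stage the pretrained weights where the docker run below will mount them;
# tumor_segmentation_model.h5 is the filename the container expects
mkdir -p ${HOME}/3dunet_pretrained
mv ~/Downloads/tumor_segmentation_model.h5 ${HOME}/3dunet_pretrained/
```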
- To run one of the quickstart scripts using the model container, you'll need to provide volume mounts for the dataset, the directory where the pretrained model has been downloaded, and an output directory.

```
DATASET_DIR=<path to the BraTS dataset>
PRETRAINED_MODEL_DIR=<directory where the pretrained model has been downloaded>
OUTPUT_DIR=<directory where log files will be written>

docker run \
  --env DATASET_DIR=${DATASET_DIR} \
  --env OUTPUT_DIR=${OUTPUT_DIR} \
  --env PRETRAINED_MODEL=${PRETRAINED_MODEL_DIR}/tumor_segmentation_model.h5 \
  --env http_proxy=${http_proxy} \
  --env https_proxy=${https_proxy} \
  --volume ${DATASET_DIR}:${DATASET_DIR} \
  --volume ${OUTPUT_DIR}:${OUTPUT_DIR} \
  --volume ${PRETRAINED_MODEL_DIR}:${PRETRAINED_MODEL_DIR} \
  --privileged --init -t \
  intel/image-segmentation:tf-latest-3d-unet-fp32-inference \
  /bin/bash quickstart/fp32_inference.sh
```
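Once the container exits, log files land in the output directory you mounted. The exact log filename is produced by the quickstart script, so treat this as a sketch for inspecting the most recent one:

```
# show the newest file written to the output directory
ls -t ${OUTPUT_DIR} | head -1
```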
Documentation and Sources

Get Started
- Docker Repo
- Main GitHub
- Readme
- Release Notes
- Get Started Guide

Code Sources
- Dockerfile
- Report Issue
License Agreement
LEGAL NOTICE: By accessing, downloading or using this software and any required dependent software (the “Software Package”), you agree to the terms and conditions of the software license agreements for the Software Package, which may also include notices, disclaimers, or license terms for third party software included with the Software Package. Please refer to the license file for additional details.