Enabling Intel® MKL in PETSc applications

PETSc (Portable, Extensible Toolkit for Scientific Computation) is an open-source suite of data structures and routines for the parallel solution of scientific applications modeled by partial differential equations. Starting with release 3.8, PETSc users can benefit from enabling Intel® MKL sparse linear operations inside their applications. The latest version of PETSc provides analogues of the AIJ and BAIJ matrix formats that call Intel® MKL Sparse BLAS Inspector-Executor kernels for matrix-vector multiplication.

The Inspector-Executor Sparse BLAS API in Intel® MKL is a two-stage approach that divides sparse operations into analysis and execution steps. During the initial analysis stage, the API inspects the matrix sparsity pattern, applies matrix structure changes, and converts the matrix to an internal format. The internal format is chosen based on the sparsity pattern to enable better parallelism and vectorization at the execution stage. In the execution stage, subsequent routine calls reuse this information to improve performance.
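
For illustration, here is a minimal standalone C sketch of the two stages using MKL's Inspector-Executor Sparse BLAS routines directly (independent of PETSc); the 3x3 CSR matrix and the hint of 100 expected calls are purely illustrative:

  /* y = A*x with the Inspector-Executor Sparse BLAS API */
  #include <stdio.h>
  #include "mkl_spblas.h"

  int main(void)
  {
      /* 3x3 matrix in 0-based CSR:
         [ 1 0 2 ]
         [ 0 3 0 ]
         [ 4 0 5 ] */
      MKL_INT rows_start[] = {0, 2, 3};
      MKL_INT rows_end[]   = {2, 3, 5};
      MKL_INT col_indx[]   = {0, 2, 1, 0, 2};
      double  values[]     = {1.0, 2.0, 3.0, 4.0, 5.0};
      double  x[3] = {1.0, 1.0, 1.0}, y[3];

      sparse_matrix_t A;
      struct matrix_descr descr;
      descr.type = SPARSE_MATRIX_TYPE_GENERAL;

      /* Analysis (inspector) stage: create the handle, pass usage hints,
         and let MKL convert the matrix to its internal format. */
      mkl_sparse_d_create_csr(&A, SPARSE_INDEX_BASE_ZERO, 3, 3,
                              rows_start, rows_end, col_indx, values);
      mkl_sparse_set_mv_hint(A, SPARSE_OPERATION_NON_TRANSPOSE, descr, 100);
      mkl_sparse_optimize(A);

      /* Execution (executor) stage: repeated calls reuse the optimized data. */
      mkl_sparse_d_mv(SPARSE_OPERATION_NON_TRANSPOSE, 1.0, A, descr, x, 0.0, y);

      printf("y = %g %g %g\n", y[0], y[1], y[2]);
      mkl_sparse_destroy(A);
      return 0;
  }

PETSc's aijmkl/baijmkl matrix types perform this inspector/executor sequence internally, so application code does not call these routines directly.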

With this update, PETSc users can easily switch to the MKL Sparse BLAS Inspector-Executor API for sparse linear algebra operations and gain a performance benefit for most PETSc solvers.

The following MKL functionality is currently supported in PETSc:

  • MKL BLAS/LAPACK for basic linear algebra operations

  • MKL PARDISO and the Parallel Direct Sparse Solver for Clusters as direct solvers

  • MKL Sparse BLAS IE for AIJ and BAIJ matrix operations (PETSc 3.8 and later)

How to use PETSc with MKL:

  • Configure PETSc with MKL Sparse BLAS by adding --with-blas-lapack-dir=/path/to/mkl to the configuration line. To use MKL PARDISO, PETSc should also be configured with --with-mkl_pardiso-dir=/path/to/mkl

Ex: ./configure --with-blas-lapack-dir=/path/to/mkl --with-mkl_pardiso-dir=/path/to/mkl

More information can be found in the PETSc installation documentation.

  • When running a PETSc application, pass "-mat_type aijmkl" or "-mat_type baijmkl" to the executable, or set the matrix type with a MatSetType(A,MATAIJMKL) / MatSetType(A,MATBAIJMKL) call in the source code. This enables MKL Sparse BLAS IE as the default package for all sparse matrix operations.

Ex: ./ex100 -mat_type aijmkl

Or in the source code:

  ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
  ierr = MatSetType(A,MATAIJMKL);CHKERRQ(ierr);
  ierr = MatSetUp(A);CHKERRQ(ierr);
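
Alternatively, if the application calls MatSetFromOptions() (the standard PETSc pattern; setting the matrix sizes is elided here, as in the snippet above), the matrix type can be chosen at run time with -mat_type aijmkl and no source changes are needed:

  ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);  /* picks up -mat_type aijmkl/baijmkl */
  ierr = MatSetUp(A);CHKERRQ(ierr);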

  • To run a PETSc application with MKL PARDISO as a direct solver, run the code with -pc_type lu -pc_factor_mat_solver_package mkl_pardiso.
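
Ex (using the same example executable as above): ./ex100 -pc_type lu -pc_factor_mat_solver_package mkl_pardiso

The same choice can be made in source code through the KSP/PC objects; a minimal sketch, assuming a linear solve with KSP and the PETSc 3.8 API (PCFactorSetMatSolverPackage was renamed to PCFactorSetMatSolverType in later PETSc releases):

  ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
  ierr = PCSetType(pc,PCLU);CHKERRQ(ierr);                               /* LU factorization */
  ierr = PCFactorSetMatSolverPackage(pc,MATSOLVERMKL_PARDISO);CHKERRQ(ierr);  /* use MKL PARDISO */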

For more information, see the PETSc examples.