Develop fast, high-performance Python* code with this set of essential computational packages, including NumPy, SciPy, and more.
Take advantage of the most popular and fastest-growing programming language, with underlying instruction sets optimized for Intel® architectures.
Achieve near-native performance through acceleration of core Python numerical and scientific packages that are built using Intel® Performance Libraries.
Achieve efficient multithreading, vectorization, and memory management, and scale scientific computations across a cluster.
Core packages include Numba, NumPy, SciPy, and more.
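As an illustration of the kind of workload these optimized builds accelerate, here is a small vectorized computation using only the stock NumPy API; in the Intel Distribution for Python, the same calls dispatch to optimized, multithreaded math-library kernels with no code changes:

```python
import numpy as np

# Stock NumPy code: in the optimized distribution, these same calls
# are backed by multithreaded BLAS/LAPACK routines.
rng = np.random.default_rng(0)
a = rng.random((500, 500))
b = rng.random((500, 500))

c = a @ b                               # BLAS matrix multiply
eigvals = np.linalg.eigvalsh(c @ c.T)   # LAPACK symmetric eigensolver
```

Because the acceleration lives beneath the NumPy API, the snippet is identical whether it runs against a stock or an optimized build.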
Develop for Accelerated Compute
Data Parallel Python technologies enable a standards-based development model for accelerated computing across XPUs that interoperates with the Python ecosystem without using low-level proprietary programming APIs.
The Data Parallel Control library provides data and device management across Intel® XPUs.
The Data Parallel NumPy library is a drop-in replacement for NumPy that enables execution across Intel XPUs.
The Extension for Numba adds compiler support for Intel XPUs.
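Because Data Parallel NumPy mirrors the NumPy API, porting is largely an import change. The sketch below uses stock NumPy so it runs anywhere; with the dpnp package installed, swapping the import as noted in the comment executes the same array operations on the selected XPU:

```python
import numpy as np  # with dpnp installed: `import dpnp as np` runs this on an XPU

x = np.linspace(0.0, 2.0 * np.pi, 1_000, dtype=np.float32)
y = np.sin(x) ** 2 + np.cos(x) ** 2   # elementwise kernels
total = float(y.sum())                # sin^2 + cos^2 = 1 for every element
```

The identity guarantees the reduction sums 1,000 ones regardless of which device executes the kernels, which makes it a convenient smoke test when moving code between backends.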
Who Needs This Product
Numerical and Scientific Computing Developers
Accelerate and scale the compute-intensive Python packages NumPy, SciPy, and mpi4py.
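For example, a dense solver call like the one below (standard SciPy API) is the kind of compute-intensive kernel the accelerated NumPy and SciPy builds speed up by routing to optimized LAPACK:

```python
import numpy as np
from scipy import linalg

# A diagonally dominant (hence well-conditioned) random system.
rng = np.random.default_rng(0)
A = rng.random((200, 200)) + 200.0 * np.eye(200)
b = rng.random(200)

x = linalg.solve(A, b)                        # LAPACK solver under the hood
residual = float(np.linalg.norm(A @ x - b))   # should be near machine epsilon
```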
High-Performance Computing (HPC) Developers
Unlock the power of modern hardware to speed up your Python applications.
Accelerated Python Data Science & Machine Learning
If you are a machine learning developer or data scientist, you can optimize your scikit-learn and XGBoost algorithms on Intel® architecture using the Intel® oneAPI AI Analytics Toolkit.
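A minimal sketch of that workflow, shown here with stock scikit-learn: the two commented-out lines are the Intel® Extension for Scikit-learn patch calls, which accelerate the same unchanged estimator code when the sklearnex package from the toolkit is installed.

```python
import numpy as np
# from sklearnex import patch_sklearn   # Intel Extension for Scikit-learn
# patch_sklearn()                       # call BEFORE importing estimators
from sklearn.cluster import KMeans

# Two well-separated Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)),
               rng.normal(3.0, 0.3, (50, 2))])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
```

The estimator code itself does not change; patching swaps in the optimized implementations behind the familiar scikit-learn API.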
Develop in the Cloud
Get what you need to build and optimize your oneAPI projects for free. With an Intel® DevCloud account, you get 120 days of access to the latest Intel® hardware—CPUs, GPUs, FPGAs—and Intel® oneAPI tools and frameworks. No software downloads. No configuration steps. No installations.
Intel Distribution for Python is included as part of the Intel oneAPI AI Analytics Toolkit, which provides accelerated machine learning and data analytics pipelines with optimized deep-learning frameworks and high-performance Python libraries.