Gradient boosting is a popular machine-learning technique used for regression and classification tasks in many real-world applications, such as predicting the compressive strength of geological formations and the frequency and severity of infectious diseases.
Despite its popularity, training and inference with gradient boosting algorithms can be slow and resource-intensive: large datasets, irregular memory access patterns, and inefficient memory usage all take a toll.
Enter XGBoost, a scalable, portable, and distributed machine-learning package for gradient boosted decision trees. It significantly speeds up model training and improves accuracy for better predictions.
This session shows how XGBoost optimizations for Intel® architecture, coupled with the Intel® oneAPI AI Analytics Toolkit, overcome these hurdles. The video includes a live demonstration.
Get the Software
Download XGBoost optimized for Intel architecture as part of the Intel® oneAPI AI Analytics Toolkit—eight tools and frameworks to accelerate end-to-end data science and analytics pipelines.