Overview
These specially designed kits for educators provide:
- Supplemental learning materials for deep learning and AI courses
- A comprehensive learning experience for students with instructional videos, hands-on labs, and real-world use cases
- Access to the latest developer tools and hardware from Intel
Benefits
Save Time
Expedite course design and development with well-defined learning objectives and ready-to-use lesson plans, teaching materials, video tutorials, hands-on labs, and assessments.
Get Hardware Access
Use Intel® DevCloud for hands-on labs and exercises, and enable students to prototype and experiment with AI workloads on the latest Intel® hardware.
Collaborate
Attend virtual educator workshops to connect with Intel product experts and peers, get questions answered, and share feedback.
Deploy Deep Learning Applications
Module 1: Introduction to AI and the Intel Distribution of OpenVINO Toolkit
Review AI concepts and use cases, the fundamentals of deep learning and the objectives behind its use, and the challenges developers face during the development phase. Learn how to create deep learning applications using the Intel® Distribution of OpenVINO™ toolkit, and explore the functions, features, and workflow of Intel DevCloud.
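As a small illustration of the kind of application this module builds toward, the following is a minimal sketch using the pre-2022 Inference Engine Python API; the model paths and the random input are placeholders standing in for a real IR model and preprocessed image.

```python
# Minimal inference sketch with the (pre-2022) Inference Engine Python API.
# "model.xml"/"model.bin" are placeholder paths for an IR model produced by
# the Model Optimizer; the random input stands in for a real image.
import numpy as np
from openvino.inference_engine import IECore

ie = IECore()
net = ie.read_network(model="model.xml", weights="model.bin")
exec_net = ie.load_network(network=net, device_name="CPU")

input_blob = next(iter(net.input_info))               # name of the first input
n, c, h, w = net.input_info[input_blob].input_data.shape

image = np.random.rand(n, c, h, w).astype(np.float32)  # stand-in for a real image
result = exec_net.infer(inputs={input_blob: image})
print({name: out.shape for name, out in result.items()})
```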
Module 2: Optimization and Quantization of AI Models for Improved Performance
Using a practical example, learn why it is crucial to optimize deep learning models for inference when developing deep learning applications. Explore in depth the optimization tools included in the Intel Distribution of OpenVINO toolkit, including the Model Optimizer, the Post-Training Optimization Tool, and other supporting software.
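As a rough sketch of why optimization matters, the snippet below compares the average latency of two IR variants of the same model (for example, an FP32 IR from the Model Optimizer and an INT8 IR from the Post-Training Optimization Tool). The file paths are placeholders, and both IRs are assumed to exist already.

```python
# Rough latency comparison between two IR variants of the same model
# (e.g. FP32 vs. INT8 produced by the Post-Training Optimization Tool).
# Paths are placeholders; both IRs are assumed to exist already.
import time
import numpy as np
from openvino.inference_engine import IECore

def average_latency_ms(xml_path, bin_path, iterations=100):
    ie = IECore()
    net = ie.read_network(model=xml_path, weights=bin_path)
    exec_net = ie.load_network(network=net, device_name="CPU")
    input_blob = next(iter(net.input_info))
    shape = net.input_info[input_blob].input_data.shape
    dummy = np.random.rand(*shape).astype(np.float32)
    start = time.perf_counter()
    for _ in range(iterations):
        exec_net.infer(inputs={input_blob: dummy})
    return (time.perf_counter() - start) / iterations * 1000

print("FP32 avg latency (ms):", average_latency_ms("model_fp32.xml", "model_fp32.bin"))
print("INT8 avg latency (ms):", average_latency_ms("model_int8.xml", "model_int8.bin"))
```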
Module 3: Creating Scalable and Future-Ready AI Applications with the Inference Engine
Get an introduction to the Inference Engine and learn how to use its streamlined workflow. Learn how to use all available compute resources efficiently with the heterogeneous and multi-device plug-ins, boosting performance and enabling hardware-agnostic inference with a write-once, deploy-anywhere approach.
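The write-once, deploy-anywhere idea can be sketched as follows: the application code stays the same, and only the device_name string changes between targets. MULTI and HETERO are Inference Engine plug-in names; the model paths are placeholders.

```python
# The same application code targets different devices by changing only the
# device_name string; MULTI and HETERO are Inference Engine plug-ins.
# Model paths are placeholders.
from openvino.inference_engine import IECore

ie = IECore()
net = ie.read_network(model="model.xml", weights="model.bin")

# Single device
exec_cpu = ie.load_network(network=net, device_name="CPU")

# Multi-device plug-in: inference requests are balanced across the listed devices.
exec_multi = ie.load_network(network=net, device_name="MULTI:GPU,CPU")

# Heterogeneous plug-in: layers unsupported on the GPU fall back to the CPU.
exec_hetero = ie.load_network(network=net, device_name="HETERO:GPU,CPU")
```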
Module 4: Hardware Accelerators for Deep Learning
Explore the advantages and limitations of the hardware platforms available for deep learning inference, including Intel® CPUs, integrated GPUs (iGPU), and the Intel® Movidius™ Myriad™ X VPU.
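A short sketch of how these platforms appear to an application: CPU, GPU, and MYRIAD are the standard Inference Engine device identifiers for the hardware named above, and the preference order shown is purely illustrative.

```python
# Query which inference devices the Inference Engine can see on this machine
# and pick one by a simple preference order (illustrative only).
from openvino.inference_engine import IECore

ie = IECore()
print("Available devices:", ie.available_devices)  # e.g. ['CPU', 'GPU', 'MYRIAD']

preferred = ["MYRIAD", "GPU", "CPU"]  # MYRIAD = Intel Movidius Myriad X VPU
device = next(d for d in preferred if d in ie.available_devices)
print("Selected device:", device)
```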
Module 5: Streamline AI Application Development and Deployment with Deep Learning Workbench
Learn about the various features of the Deep Learning Workbench, and use a sample application to review the capabilities of the Intel Distribution of OpenVINO toolkit.