IoT Developer Kits
In this session, product manager Daniel Tsui discusses how Intel works with key partners to create developer kits in two families: Foundation kits and Intel® Vision Accelerator Design kits.
Tune in to learn what these kits are, where to get more information, and where you can purchase them.
Foundation Kits
Kick-start your targeted application development with a superior out-of-the-box experience.
- Built on pre-validated, emissions-certified (FCC and CE) systems based on Intel® architecture
- Includes an integrated software stack with an operating system, drivers, SDKs, tools, libraries, and samples
- Provides acceleration hardware options that use the Intel® Distribution of OpenVINO™ toolkit on an Intel® Movidius™ Vision Processing Unit (VPU); see the sketch after this list
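As a rough illustration of that last point, the sketch below loads a network with the OpenVINO Inference Engine Python API and targets a Movidius VPU through the MYRIAD plugin. The model file names and the input shape are placeholders, not part of any kit; adapt them to your own network.

```python
import numpy as np
from openvino.inference_engine import IECore

# Placeholder IR files produced by the OpenVINO Model Optimizer.
MODEL_XML = "model.xml"
MODEL_BIN = "model.bin"

ie = IECore()
net = ie.read_network(model=MODEL_XML, weights=MODEL_BIN)

# "MYRIAD" is the plugin name for Intel Movidius VPUs.
exec_net = ie.load_network(network=net, device_name="MYRIAD")

# Feed a dummy frame; a real application would pass preprocessed camera input.
input_name = next(iter(net.input_info))
dummy_frame = np.zeros((1, 3, 224, 224), dtype=np.float32)  # assumed NCHW input shape
results = exec_net.infer(inputs={input_name: dummy_frame})
print({name: out.shape for name, out in results.items()})
```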
Intel® Vision Accelerator Design
Deploy power-efficient deep neural network inference for fast, accurate video analytics and computer vision applications.
- Enables high throughput and efficiency
- Works with the Intel Distribution of OpenVINO toolkit
- Runs analytics on edge servers, appliances, and network video recorders
- Makes full use of Intel architecture across CPUs, CPUs with integrated graphics, FPGAs, and VPUs (see the sketch after this list)
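To make the last two bullets concrete, here is a minimal sketch, again assuming the Inference Engine Python API and placeholder model files, that lists the accelerators visible to the toolkit and spreads inference requests across several of them with the MULTI plugin.

```python
from openvino.inference_engine import IECore

ie = IECore()

# Devices the toolkit has working plugins for on this machine, e.g. ['CPU', 'GPU', 'MYRIAD'].
print("Available devices:", ie.available_devices)

# Placeholder IR files; substitute your own network.
net = ie.read_network(model="model.xml", weights="model.bin")

# The MULTI plugin load-balances inference requests across the listed devices,
# so one application can use the CPU, integrated graphics, and a VPU together.
exec_net = ie.load_network(
    network=net,
    device_name="MULTI:CPU,GPU,MYRIAD",
    num_requests=4,
)
```

On a system with an FPGA accelerator card, the HETERO plugin (for example, "HETERO:FPGA,CPU") plays a similar role, falling back to the CPU for layers the FPGA cannot execute.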
Other Resources
- See all Developer-Ready Kits for IoT
- Download the latest version of the Intel Distribution of OpenVINO toolkit
- Find out more about the Intel Movidius VPU
Daniel Tsui
Product manager for the Foundational Developer Kit Program, Intel Corporation
Daniel works to enable developers with products that accelerate time to market for their solutions. During his 14-year tenure with the company, he has held various positions, including technical marketing engineer, senior validation engineer, and validation engineer.
Intel® Distribution of OpenVINO™ Toolkit
Deploy deep learning inference with unified programming models and broad support for trained neural networks from popular deep learning frameworks.
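As a hedged illustration of that framework interoperability, the sketch below reads a model exported to ONNX (for example, from PyTorch) and compiles it for the CPU with the same API used above. The file name is a placeholder, and reading ONNX directly assumes a recent toolkit release.

```python
from openvino.inference_engine import IECore

ie = IECore()

# Recent toolkit releases can read an ONNX export directly,
# without first converting it to IR with the Model Optimizer.
net = ie.read_network(model="model.onnx")  # placeholder file name

# The same load_network call works regardless of the source framework.
exec_net = ie.load_network(network=net, device_name="CPU")
```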
Ready, Steady, Stream: Introducing the Deep Learning Streamer in the Intel Distribution of OpenVINO Toolkit
Optimization Best Practices to Maximize Applications with the Intel Distribution of OpenVINO Toolkit
Microsoft Azure* and Open Neural Network Exchange (ONNX*) Runtime for Intel Distribution of OpenVINO Toolkit