Learn about Intel® Developer Cloud for the Edge
February 2023: Intel is migrating to a new sign-in process to support enhanced authentication and security controls. Intel® Developer Cloud for the Edge will not be available from February 10, 5:00 p.m. to February 12, 9:00 a.m. Pacific time (GMT-8:00). Any jobs or projects running during this window will be terminated, so stop them beforehand. If you have questions or concerns, post a question in the forum.
With complimentary access to Intel® Developer Cloud for the Edge, you can:
- Discover and learn how to harness Intel’s AI solutions for edge workloads to develop your own software innovations. Intel Developer Cloud for the Edge contains interactive sample applications, containerized solutions, tutorials, and more.
- Prototype your own solution in JupyterLab and test workloads on bare metal machines, or build and launch your containerized solution.
- Optimize your software solutions for specific target edge devices using our optimization workflows or the Deep Learning Workbench, a GUI-based pipeline for the Intel® Distribution of OpenVINO™ toolkit. (A minimal conversion sketch follows this list.)
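To give a concrete sense of one such workflow, the sketch below reads an ONNX model and saves it as OpenVINO IR using the toolkit's Python API. The file names are placeholders for whatever model you bring into your workspace, and this is only one starting point rather than the full optimization pipeline.

```python
# Minimal sketch: convert a framework model (here, ONNX) to OpenVINO IR for reuse.
# "model.onnx" and the output file names are placeholders for your own files.
from openvino.runtime import Core, serialize

core = Core()
model = core.read_model("model.onnx")               # load the original model
serialize(model, "model_ir.xml", "model_ir.bin")    # write OpenVINO IR (XML + weights)
```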
Two primary hardware architectures provide you with flexibility:
Interactive Prototyping & Benchmarking Environment Using JupyterLab on Bare Metal Hardware
The Intel Developer Cloud for the Edge lets you actively prototype and experiment with AI workloads on Intel® hardware. You get full access to hardware platforms hosted in Intel's cloud environment, which is designed specifically for deep learning.
- Test your model’s performance using the Intel Distribution of OpenVINO toolkit on various CPU, GPU, and VPU combinations (a minimal inference sketch follows this list).
- Use Jupyter* Notebooks, a browser-based development environment that lets you run code from within your browser and visualize results instantly.
- Use the series of Jupyter Notebook tutorials and preloaded examples to get started quickly.
- Implement deep learning applications to enable compelling, high-performance solutions.
- Take advantage of pretrained models, sample data, and executable code from the Intel Distribution of OpenVINO toolkit and other tools for deep learning.
- No hardware setup is required.
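As an illustration of what prototyping on these nodes can look like, here is a minimal sketch using the OpenVINO Python API to compile one model for a chosen device and run a single inference. The model path is a placeholder, and which devices are listed depends on the node you reserve.

```python
# Minimal sketch: compile a model for an Intel device and run one inference.
# "model_ir.xml" and the device name are placeholders; available devices vary by node.
import numpy as np
from openvino.runtime import Core

core = Core()
print("Devices on this node:", core.available_devices)   # e.g. ['CPU', 'GPU']

model = core.read_model("model_ir.xml")
compiled = core.compile_model(model, device_name="CPU")   # swap "CPU" for "GPU", etc.

dummy = np.random.rand(*compiled.input(0).shape).astype(np.float32)
result = compiled([dummy])[compiled.output(0)]
print("Output shape:", result.shape)
```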
To get started:
- Use free development and infrastructure nodes with JupyterLab.
- Create, tailor, and benchmark computer vision and edge AI applications with included tutorials.
- Try out the business-focused sample applications.
- Accelerate prototyping of your own Python*-based software innovations by using code snippets and developer tools within JupyterLab (see the latency sketch after this list).
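For example, a quick latency check inside a notebook might look like the hedged sketch below. It times repeated inferences in plain Python; the model path and iteration count are placeholders, and the toolkit's own benchmarking tools give far more detailed numbers.

```python
# Minimal sketch: rough latency check for a compiled model inside a notebook.
# Model path and iteration count are placeholders; use the toolkit's benchmarking
# tools when you need detailed, reproducible measurements.
import time
import numpy as np
from openvino.runtime import Core

core = Core()
compiled = core.compile_model(core.read_model("model_ir.xml"), "CPU")
dummy = np.random.rand(*compiled.input(0).shape).astype(np.float32)

iterations = 100
start = time.perf_counter()
for _ in range(iterations):
    compiled([dummy])
elapsed = time.perf_counter() - start
print(f"Average latency: {1000 * elapsed / iterations:.2f} ms over {iterations} runs")
```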
Container Playground Powered by the Red Hat* OpenShift* Platform
Launch containerized workloads on a wide range of Intel hardware using Kubernetes* based on the Red Hat* OpenShift* platform. The Container Playground is available at no cost to help you develop, build, and test cloud-native container applications.
Features
Marketplace
Access prebuilt containerized applications to try out on Intel hardware. Then, use them as building blocks with your own AI solutions for the edge. You can also import your own containerized workloads.
- Sample Applications—Try out common solutions designed for vertical markets such as transportation, security, retail, and healthcare.
- Reference Implementations—Launch full-stack multicontainer workloads such as an Intelligent Traffic Management solution for cities.
- Configurable AI Solutions—Configure and launch multicamera workloads such as a Smart Retail Analytics solution designed to identify shopper behavior.
- OpenVINO™ Notebooks—Learn with the Intel Distribution of OpenVINO toolkit.
Develop
Access a JupyterLab environment to run code samples featuring the Intel Distribution of OpenVINO toolkit.
Build
Import your Dockerfiles, Helm* charts, or source code hosted in a Git repository, and build your container images directly from the application source or Dockerfile.
Test
Benchmark your existing container images imported from container registries. You can also test your multicontainer solutions with Docker Compose files and Helm charts.
- Import containers from a prebuilt library of sample applications or from your own code repository, and then test them on a wide range of Intel® architectures.
- Benchmark your code and optimize it for Intel hardware.
Continue Your Learning Journey with AI at the Edge
Optimize your machine learning solutions for Intel hardware.
Advance your career and get recognized for your new, marketable skills with Intel Edge AI Certification.