Deploy Smarter, Faster, More Responsive LLMs at the Edge


Overview

This session explores new techniques for running LLMs efficiently on client PCs and small-form-factor machines at the edge using the OpenVINO™ toolkit in combination with popular tools, libraries, and frameworks for model optimization and quantization.
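As a minimal sketch of the kind of workflow the session covers (not code from the session itself), an LLM from the Hugging Face hub can be exported to OpenVINO format and run locally through the Optimum Intel library. The model ID, target device, and generation settings below are illustrative assumptions.

```python
# Sketch: load a Hugging Face LLM as an OpenVINO model and run a short generation.
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # assumed example model

# export=True converts the checkpoint to OpenVINO IR on the fly; the device
# can be "CPU", "GPU", or "NPU" depending on the edge hardware available.
model = OVModelForCausalLM.from_pretrained(model_id, export=True, device="CPU")
tokenizer = AutoTokenizer.from_pretrained(model_id)

inputs = tokenizer("What can edge AI do for manufacturing?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```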

Additionally, you’ll gain hands-on experience implementing a conversational AI voice agent using the OpenVINO toolkit and Gradio, an open source Python* package for quickly building machine learning demos and web applications.
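The sketch below shows only the Gradio wiring for such a voice agent, assuming a recent Gradio 4.x release; the placeholder voice_agent function stands in for the OpenVINO-accelerated speech-to-text and LLM steps covered in the session.

```python
# Sketch: a microphone-in, text-out Gradio interface for a voice agent.
import gradio as gr

def voice_agent(audio_path: str) -> str:
    # Placeholder: a real agent would transcribe `audio_path` with an
    # OpenVINO-optimized ASR model, pass the text to an LLM, and return
    # (or synthesize) the response.
    if audio_path is None:
        return "No audio received."
    return f"Received audio at {audio_path}; transcription and LLM reply go here."

demo = gr.Interface(
    fn=voice_agent,
    inputs=gr.Audio(sources=["microphone"], type="filepath"),
    outputs=gr.Textbox(label="Agent response"),
    title="Edge voice agent (sketch)",
)

if __name__ == "__main__":
    demo.launch()
```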

This session covers:

  • How to deploy advanced LLMs on edge devices across industries such as healthcare and manufacturing
  • How to optimize and quantize LLMs to improve performance and power efficiency while reducing model size and computational demands (see the sketch after this list)
  • How to use AI Tools to build powerful, energy-efficient AI applications for edge computing
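One common optimization path is weight-only quantization. The following is a hedged sketch using Optimum Intel (which applies the Neural Network Compression Framework under the hood); the model ID, bit width, and output directory are assumptions, not values from the session.

```python
# Sketch: compress LLM weights to INT4 and save the OpenVINO model.
from optimum.intel import OVModelForCausalLM, OVWeightQuantizationConfig

model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # assumed example model

# INT4 weight compression shrinks the model and cuts memory bandwidth,
# which typically dominates LLM latency and power draw on edge devices.
quant_config = OVWeightQuantizationConfig(bits=4)
model = OVModelForCausalLM.from_pretrained(
    model_id, export=True, quantization_config=quant_config
)
model.save_pretrained("tinyllama-int4-ov")  # writes the compressed OpenVINO IR
```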

Skill level: Expert

 

Featured Software Tools

OpenVINO toolkit

Optimum Intel library by Hugging Face*

 

Featured Code

Edge AI Reference Kits for OpenVINO Toolkit

Neural Network Compression Framework for OpenVINO Toolkit

Gradio on GitHub*

Gradio Quick Start

You May Also Like

Related Articles

Essential AI Tools to Jumpstart AI Development Projects

How to Deploy AI Applications on AI PCs

Related Videos

Edge AI Reference Kit Demos

Explore AI PCs' Potential for Building Generative AI (GenAI) Solutions

Optimize Workloads for OpenVINO Toolkit at the Hardware Level

Build Next-Gen, Portable, Power-Efficient AI on an AI PC

Prototype and Deploy LLM Applications on Intel NPUs
