Webinar: LLM-Powered Browser Plug-In
Overview
Build a large language model (LLM) application using the power of AI PC processing, tapping the native capabilities of Intel® Core™ Ultra processors to run AI locally. The session shows how to develop a Python* back end paired with a browser extension that compactly summarizes web page content. The exercise showcases the hardware and software from Intel that make it possible to run LLMs locally.
This expert-led session demystifies how to combine a Python back end with a front end such as a browser extension, enabling smooth, real-time AI processing. Intel Core Ultra processors open new possibilities by letting developers implement local AI applications effectively.
This intermediate-level session is aimed at AI and machine learning engineers, software developers, and data scientists.
The session covers these topics:
- Find out how to efficiently develop browser extensions.
- Tap Python’s capabilities and integrate a Python back end with a browser extension as a front end.
- Use Intel Core Ultra processors within AI PCs to build generative AI applications.
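The back-end-plus-extension pattern above can be sketched in a few lines of Python. This is a minimal illustration, not material from the session: it exposes a local HTTP endpoint the extension could POST page text to, and the `summarize` function is a stand-in placeholder (it simply keeps the first few sentences) for whatever locally running LLM the webinar uses.

```python
import json
import re
from http.server import BaseHTTPRequestHandler, HTTPServer


def summarize(text: str, max_sentences: int = 3) -> str:
    """Placeholder extractive summarizer: keeps the first few sentences.

    In the webinar's setup this would instead invoke an LLM running
    locally on the AI PC (the specific library and model are not
    named here, so a trivial stand-in is used).
    """
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    return " ".join(sentences[:max_sentences])


class SummarizeHandler(BaseHTTPRequestHandler):
    """Handles POST requests with JSON {"text": ...} from the extension."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        body = json.dumps({"summary": summarize(payload.get("text", ""))}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        # The extension runs on a different origin, so CORS must be allowed.
        self.send_header("Access-Control-Allow-Origin", "*")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


# To serve locally for the browser extension to call:
# HTTPServer(("127.0.0.1", 8000), SummarizeHandler).serve_forever()
```

The extension's content script would then gather the page text and `fetch` the endpoint, rendering the returned summary in its popup.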