Build End-to-End GenAI Use Cases with OPEA on Intel® Gaudi® Accelerators
Overview
Open Platform for Enterprise AI (OPEA) is a collaborative project that drives interoperability across a diverse, heterogeneous ecosystem to accelerate secure, cost-effective, business-ready generative AI (GenAI) deployments. Launched by The Linux Foundation* and Intel, OPEA includes ecosystem innovations from Hugging Face*, KX*, MariaDB* Foundation, MinIO*, Qdrant, Red Hat*, SAS*, and more.
This session focuses on the OPEA GenAI Components (GenAIComps) project, a suite of compatible, open source microservices that simplify building and customizing end-to-end (E2E) GenAI applications across text, audio, image, and video, while taking advantage of Intel® Gaudi® AI accelerators.
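As a rough illustration of this microservice model, the sketch below composes three remote GenAIComps services (embedding, retriever, and LLM) into one pipeline using the orchestrator pattern from the project's `comps` Python package. The class and parameter names follow the project's published megaservice examples, but the exact interface can vary between releases, and the hostnames, ports, and endpoint paths here are placeholder assumptions.

```python
# Sketch only: compose already-running GenAIComps microservices into one pipeline.
# Hostnames, ports, and endpoint paths below are placeholder assumptions.
from comps import MicroService, ServiceOrchestrator, ServiceType

megaservice = ServiceOrchestrator()

embedding = MicroService(
    name="embedding",
    host="embedding-svc", port=6000, endpoint="/v1/embeddings",
    use_remote_service=True, service_type=ServiceType.EMBEDDING,
)
retriever = MicroService(
    name="retriever",
    host="retriever-svc", port=7000, endpoint="/v1/retrieval",
    use_remote_service=True, service_type=ServiceType.RETRIEVER,
)
llm = MicroService(
    name="llm",
    host="llm-svc", port=9000, endpoint="/v1/chat/completions",
    use_remote_service=True, service_type=ServiceType.LLM,
)

# Register the services, then declare the data flow:
# query -> embed -> retrieve -> generate.
megaservice.add(embedding).add(retriever).add(llm)
megaservice.flow_to(embedding, retriever)
megaservice.flow_to(retriever, llm)
```

In practice, each of these services runs as its own container, and the generation backend (for example, a TGI or vLLM serving endpoint) is the piece that executes on the Gaudi accelerators.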
This session includes:
- GenAIComps architecture and how to use it to build customized E2E use cases for diverse enterprise needs
- Step-by-step guidance on developing and deploying tailored microservices within the project’s ecosystem
- The software tools needed to build and deploy these use cases
- A demonstration of how to build a Visual RAG system with GenAIComps (a conceptual sketch follows this list)
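To give a flavor of what the Visual RAG demonstration involves, here is a conceptual sketch of the query path: a multimodal embedding microservice encodes the question, a retriever pulls the most similar indexed image or video frames from a vector store, and a large vision model (LVM) microservice answers the question grounded in the top retrieved frame. The URLs, ports, and JSON fields below are illustrative assumptions, not the exact contract used in the session.

```python
# Illustrative Visual RAG query path across three microservices.
# Service URLs, ports, and payload fields are assumptions for illustration only.
import requests

EMBED_URL = "http://localhost:6000/v1/embeddings"    # multimodal embedding service
RETRIEVE_URL = "http://localhost:7000/v1/retrieval"  # vector-store retriever
LVM_URL = "http://localhost:9399/v1/lvm"             # large vision model service


def visual_rag_query(question: str) -> str:
    # 1) Embed the text query with the same multimodal model used to index the frames.
    emb = requests.post(EMBED_URL, json={"text": question}, timeout=60).json()

    # 2) Retrieve the most similar indexed frames from the vector store.
    hits = requests.post(
        RETRIEVE_URL,
        json={"text": question, "embedding": emb["embedding"]},
        timeout=60,
    ).json()

    # 3) Ask the LVM to answer using the top retrieved frame (base64-encoded image).
    top_frame = hits["retrieved_docs"][0]["b64_img_str"]
    answer = requests.post(
        LVM_URL,
        json={"image": top_frame, "prompt": question},
        timeout=120,
    ).json()
    return answer["text"]


print(visual_rag_query("Which step of the assembly process appears in the video?"))
```

Keeping each step in its own microservice is what lets individual pieces, such as the embedding model, the vector store, or the LVM backend, be swapped or scaled independently.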
Skill level: Intermediate