Exploring Cloud Native MLOps for GenAI Vector Search

Presented by: Mary Grygleski
Time: Thursday, Jan. 11, 9:15 AM - 10:15 AM

ChatGPT has taken center stage since early this year. We will first look into this exciting new subfield of Generative AI, understand what LLMs and NLP are, and examine the challenges they present. We will also highlight the importance of vector search, and the role a vector database plays in storing embeddings and performing fast indexed pattern matches and searches. While all of this is exciting, we also need to ensure that building, integrating, and continuously deploying these systems is handled efficiently. By leveraging a cloud native environment with Kubernetes, we will examine how the process can be optimized through the serverless and event-driven nature of a typical cloud native platform.
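To make the vector-search idea concrete, here is a minimal sketch of the nearest-neighbor lookup a vector database performs under the hood. The embeddings below are hypothetical toy vectors; in practice they would come from an embedding model, and production systems use approximate indexes (e.g. HNSW) rather than a brute-force scan.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot product normalized by vector magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" keyed by document id (hypothetical values for illustration).
documents = {
    "doc1": [0.9, 0.1, 0.0],
    "doc2": [0.0, 0.8, 0.6],
    "doc3": [0.7, 0.2, 0.1],
}

def search(query_vec, docs, top_k=2):
    # Brute-force scan: score every stored vector against the query,
    # then return the top_k most similar documents.
    scored = [(name, cosine_similarity(query_vec, vec))
              for name, vec in docs.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:top_k]

results = search([1.0, 0.0, 0.0], documents)
print(results)  # doc1 ranks first: its embedding points closest to the query
```

A vector database replaces the brute-force scan with an approximate index so the same query stays fast across millions of embeddings.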

MLOps—machine learning operations, or DevOps for machine learning—is the intersection of people, process, and platform for gaining business value from machine learning. It streamlines development and deployment via monitoring, validation, and governance of machine learning models. With the rapid rise in popularity of GenAI, we will explore how the operational side of things will be impacted and how MLOps differs from DevOps.

Room: Salon B
Tags: DevOps, CI/CD, Machine Learning
Level: Introductory and overview