AI at the edge is shaping up to be the new killer app for business transformation in many industries.
Many organizations are experimenting with edge AI stacks on Kubernetes to unlock new customer experiences and streamline operational processes.
But in a space as fast-moving as AI, it can be hard to separate fact from fiction and work out not just what’s actually possible, but the best way to make it happen.
In this webinar, Technology Fellow Cornelia Davis and CTO Saad Malik will walk you through:
- What edge AI means, and how it relates to terms like AIOps and MLOps
- Why the edge is the natural home for critical AI inferencing workloads
- The top challenges for edge computing — and why adding AI models and engines to the mix exacerbates these problems
- Key requirements for successful AI deployments in edge environments, from security to lifecycle management
- Which tools and projects you need to know, from Kubeflow to LocalAI
We’ll also introduce you to Palette EdgeAI, a new enhancement to Spectro Cloud Palette that simplifies the deployment and management of AI stacks in your Kubernetes-based edge computing environments.
You’ll walk away with a deeper understanding of edge AI and a clear direction for your edge computing projects.
Cornelia has spent her career in emerging tech, starting with image processing, moving to web-centric computing, and then to cloud-native software and DevOps platforms. After helping to bring Cloud Foundry to the industry, she turned her attention to Kubernetes-based platforms, pushing even further into cloud-native operational practices such as GitOps. Cornelia is the author of Cloud Native Patterns: Designing change-tolerant software.
Saad is passionate about building products in the areas of cloud, virtualization, containers, and distributed systems. In his fifteen years of experience, Saad has shipped multiple new products in enterprise, service provider, and consumer technologies. He is a hardcore Trekkie and enjoys building autonomous drone tracking software.