Running AI Inference at the Edge – The Last Great Cloud Transformation EP2
Organizations typically train AI models in large, centralized data centers run by public cloud providers, but using AI effectively in applications requires running those models closer to users. While small AI models can run on devices like smartphones and laptops, larger models require a robust edge network of GPU-powered data centers around the world to deliver fast, AI-driven insights. A connectivity cloud enables this by transforming networking, allowing organizations to build, secure, and run AI models at the edge, access open-source models, and leverage a global edge network for responsive, engaging user experiences.
Hosts Alan Shimel and Mitch Ashley are joined by Rita Kozlov (Cloudflare) and Ranny Haiby (Linux Foundation) for a lively discussion on Techstrong.TV.