
AI Infra

Make Your LLMs Serverless

May 6

14:00 - 14:40

Location: Founders Café (Updated)

LLMs require GPUs, which are scarce, and overprovisioning them is expensive and wasteful. Google Cloud Run now offers serverless GPU support, enabling cost-effective LLM deployment. A live demo will compare Gemma model performance with and without GPUs.
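To give a flavour of the kind of comparison the demo covers, here is a minimal latency probe in Python. It assumes (the abstract does not specify the serving stack) that the Gemma model is exposed through an Ollama-compatible /api/generate endpoint on Cloud Run, and SERVICE_URL is a hypothetical placeholder for your own service URL. Deploying the same container once with and once without a GPU and running this probe against each revision reproduces the with/without-GPU comparison.

```python
# Minimal latency probe for a Gemma model served on Cloud Run.
# Assumptions (not from the talk abstract): the model is exposed through an
# Ollama-compatible /api/generate endpoint, and SERVICE_URL is a hypothetical
# placeholder for your own Cloud Run service URL.
import time
import requests

SERVICE_URL = "https://YOUR-CLOUD-RUN-SERVICE-URL"  # placeholder


def time_generation(prompt: str, model: str = "gemma2:9b") -> float:
    """Send one prompt and return the wall-clock latency in seconds."""
    start = time.perf_counter()
    resp = requests.post(
        f"{SERVICE_URL}/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return time.perf_counter() - start


if __name__ == "__main__":
    # Point SERVICE_URL at a CPU-only revision, then at a GPU-backed one,
    # to compare response times for the same prompt.
    latency = time_generation("Explain serverless GPUs in one sentence.")
    print(f"Response received in {latency:.1f}s")
```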

Speakers