AI Infra

Make Your LLMs Serverless

May 6

11:10 - 11:50

LLMs require GPUs, which are scarce, and overprovisioning them is expensive and wasteful. Google Cloud Run now offers serverless GPU support, enabling cost-effective LLM deployment. A live demo will compare Gemma model performance with and without GPUs.
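A minimal sketch of the kind of comparison the demo describes: timing the same Gemma prompt against two Cloud Run services, one deployed with a serverless GPU and one CPU-only. The service URLs and the Ollama-style `/api/generate` request shape are assumptions for illustration, not details from the session listing.

```python
import time
import requests

# Hypothetical Cloud Run service URLs -- one deployed with a serverless GPU,
# one CPU-only. These are placeholders, not real endpoints.
ENDPOINTS = {
    "gemma-gpu": "https://gemma-gpu-xxxxx.a.run.app",
    "gemma-cpu": "https://gemma-cpu-xxxxx.a.run.app",
}

PROMPT = "Explain serverless GPUs in one sentence."


def time_generation(base_url: str, prompt: str) -> float:
    """Send one generation request (Ollama-style API assumed) and return latency in seconds."""
    start = time.perf_counter()
    resp = requests.post(
        f"{base_url}/api/generate",
        json={"model": "gemma", "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return time.perf_counter() - start


if __name__ == "__main__":
    # Compare end-to-end latency for one completion on each service.
    for name, url in ENDPOINTS.items():
        latency = time_generation(url, PROMPT)
        print(f"{name}: {latency:.1f}s for one completion")
```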

Speakers