LiteLLM is a library that provides a unified interface to many model providers. You can combine it with Restate to build resilient agents tailored to your use case: wrap model calls in ctx.run so their results are journaled and replayed on retries, and use Restate context actions to execute tools durably.
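The pattern above can be sketched as follows. This is a minimal, hedged example, not code from the Restate docs: the service name `Agent`, handler name `answer`, and model string are assumptions, and it requires the `restate-sdk` and `litellm` packages plus provider credentials to actually run.

```python
import restate
import litellm

# Hypothetical service and handler names for illustration.
agent = restate.Service("Agent")

@agent.handler()
async def answer(ctx: restate.Context, prompt: str) -> str:
    # The model call is non-deterministic, so it is wrapped in ctx.run:
    # Restate journals the result and replays it on retries instead of
    # invoking the model provider again.
    def call_model() -> str:
        response = litellm.completion(
            model="openai/gpt-4o",  # any LiteLLM-supported provider/model
            messages=[{"role": "user", "content": prompt}],
        )
        return response.choices[0].message.content

    return await ctx.run("model call", call_model)

app = restate.app(services=[agent])
```

Tool executions would follow the same shape: deterministic orchestration in the handler, with each side-effecting call wrapped in a journaled context action.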

Learn more about LiteLLM

Have a look at the following resources to get started with Restate and LiteLLM:
  • Examples: Agent examples that implement the agent loop with Restate alone and use LiteLLM for the model calls.
  • Guides: Check out the Restate Python examples to learn how to use Restate with LiteLLM.

Learn more

  • Blog: AI Agents should be serverless