Integrations

Integrate your models with tools like LangChain, LiteLLM, and more.
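Under the hood, each of these integrations wraps the same HTTPS call to your deployed model. A minimal stdlib-only sketch of that request follows; the model ID, API key, and endpoint path are placeholders, so check your model's dashboard for the exact invocation URL.

```python
import json
import urllib.request

# Hypothetical values -- substitute your own deployment's model ID and API key.
MODEL_ID = "abc123"
API_KEY = "YOUR_BASETEN_API_KEY"


def build_predict_request(model_id: str, api_key: str, payload: dict) -> urllib.request.Request:
    """Build the POST request that integrations like LangChain and LiteLLM wrap.

    Baseten models are invoked at a per-model HTTPS endpoint with an
    `Authorization: Api-Key ...` header; the path below is illustrative.
    """
    url = f"https://model-{model_id}.api.baseten.co/production/predict"
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Api-Key {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_predict_request(MODEL_ID, API_KEY, {"prompt": "Hello!"})
# To actually call the model:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

Any of the integrations below can be understood as a thin adapter between this call and the host framework's interface.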

Chainlit

Build your own open-source ChatGPT with Baseten and Chainlit.

LangChain

Use your Baseten models in the LangChain ecosystem.

LiteLLM

Use your Baseten models in LiteLLM projects.

Twilio

Build an AI-powered Twilio SMS chatbot with a Baseten-hosted LLM.
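Twilio delivers each inbound SMS to your server as a webhook POST, and your server replies with TwiML. A stdlib-only sketch of building that reply body is below; in the real chatbot, the reply text would come from your Baseten-hosted LLM rather than a hardcoded string.

```python
from xml.sax.saxutils import escape


def twiml_reply(text: str) -> str:
    """Wrap a reply string in the TwiML document Twilio expects back
    from an SMS webhook (a <Response> containing a <Message>)."""
    return (
        "<?xml version='1.0' encoding='UTF-8'?>"
        f"<Response><Message>{escape(text)}</Message></Response>"
    )


# Hypothetical: `text` would be generated by your Baseten-hosted LLM.
body = twiml_reply("Hi! How can I help?")
```

The Twilio helper library can generate the same XML for you; this shows the wire format your webhook ultimately returns.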

Build your own

Want to integrate Baseten with your platform or project? Reach out to support@baseten.co and we’ll help you build and market the integration.
