Inference
Integrations
Integrate your models with tools like LangChain, LiteLLM, and more.
Chainlit: Build your own open-source ChatGPT with Baseten and Chainlit.
LangChain: Use your Baseten models in the LangChain ecosystem.
LiteLLM: Use your Baseten models in LiteLLM projects.
LiveKit: Build real-time voice agents with TTS models hosted on Baseten.
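Several of these integrations (LiteLLM and LangChain in particular) work by pointing an OpenAI-compatible client at your model's endpoint. As a minimal sketch using only the Python standard library, assuming your deployment exposes an OpenAI-compatible chat completions endpoint; the base URL and model slug below are placeholders to replace with your own deployment's values:

```python
import json
import os
import urllib.request

# Hypothetical values: substitute the OpenAI-compatible endpoint and model
# slug shown for your own Baseten deployment.
BASE_URL = "https://inference.baseten.co/v1/chat/completions"


def build_request(model: str, prompt: str, api_key: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat completion request for a hosted model."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        BASE_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


if __name__ == "__main__" and "BASETEN_API_KEY" in os.environ:
    req = build_request("your-model-slug", "Hello!", os.environ["BASETEN_API_KEY"])
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["choices"][0]["message"]["content"])
```

Frameworks like LiteLLM (`litellm.completion`) and LangChain (`ChatOpenAI`) typically accept an equivalent base URL, API key, and model name, so the same endpoint and credentials plug into either ecosystem without custom request code.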
Build your own
Want to integrate Baseten with your platform or project? Reach out to support@baseten.co and we’ll help with building and marketing the integration.