Get all chain deployments
Authorizations
You must specify the scheme 'Api-Key' in the Authorization header. For example, Authorization: Api-Key <Your_Api_Key>
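As a minimal sketch of the header scheme described above, the snippet below makes an authenticated request to list chain deployments. The base URL, the `/v1/chains/{chain_id}/deployments` route, and the `BASETEN_API_KEY` environment variable name are assumptions, not confirmed by this page; substitute the route and credentials shown in your own API reference.

```python
# Sketch of calling this endpoint with the 'Api-Key' Authorization scheme.
# The base URL and route below are assumptions; the path parameter is assumed
# to be the chain's unique identifier.
import os

import requests

API_KEY = os.environ["BASETEN_API_KEY"]  # assumed env var name for your API key
CHAIN_ID = "abc123"                      # hypothetical chain ID path parameter

resp = requests.get(
    f"https://api.baseten.co/v1/chains/{CHAIN_ID}/deployments",
    headers={"Authorization": f"Api-Key {API_KEY}"},
)
resp.raise_for_status()
chain_deployments = resp.json()
```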
Path Parameters
Response
A list of chain deployments. Each chain deployment in the list has the following fields.
Unique identifier of the chain deployment
Time the chain deployment was created, in ISO 8601 format
Unique identifier of the chain
Environment the chain deployment is deployed in
Chainlets in the chain deployment
Unique identifier of the chainlet
Name of the chainlet
Autoscaling settings for the chainlet. If null, the chainlet has not finished deploying
Minimum number of replicas
Maximum number of replicas
Timeframe of traffic considered for autoscaling decisions
Waiting period before scaling down any active replica
Number of requests per replica before scaling up
Name of the instance type the chainlet is deployed on
Number of active replicas
Status of the chainlet. One of: BUILDING, DEPLOYING, DEPLOY_FAILED, LOADING_MODEL, ACTIVE, UNHEALTHY, BUILD_FAILED, BUILD_STOPPED, DEACTIVATING, INACTIVE, FAILED, UPDATING, SCALED_TO_ZERO, WAKING_UP
Status of the chain deployment. One of: BUILDING, DEPLOYING, DEPLOY_FAILED, LOADING_MODEL, ACTIVE, UNHEALTHY, BUILD_FAILED, BUILD_STOPPED, DEACTIVATING, INACTIVE, FAILED, UPDATING, SCALED_TO_ZERO, WAKING_UP
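As a sketch of how the fields above might be consumed, the helper below walks a parsed response and prints each deployment's environment and status along with its chainlets' replica counts and autoscaling settings. The JSON key names used here (`id`, `environment`, `status`, `chainlets`, `autoscaling_settings`, `min_replica`, `max_replica`, `concurrency_target`, `active_replica_count`, `name`) are assumptions inferred from the field descriptions; this page does not show the exact keys, so check a real response before relying on them.

```python
# Hypothetical walk over a parsed response from this endpoint.
# Key names are assumptions based on the field descriptions above.
def summarize_chain_deployments(chain_deployments: list[dict]) -> None:
    for deployment in chain_deployments:
        print(
            f"deployment {deployment['id']} "
            f"(environment={deployment.get('environment')}, status={deployment['status']})"
        )
        for chainlet in deployment.get("chainlets", []):
            autoscaling = chainlet.get("autoscaling_settings")  # null until fully deployed
            if autoscaling is None:
                scaling = "still deploying"
            else:
                scaling = (
                    f"{autoscaling['min_replica']}-{autoscaling['max_replica']} replicas, "
                    f"concurrency target {autoscaling['concurrency_target']}"
                )
            print(
                f"  chainlet {chainlet['name']}: status={chainlet['status']}, "
                f"active replicas={chainlet['active_replica_count']}, {scaling}"
            )
```

For example, `summarize_chain_deployments(chain_deployments)` on the response fetched earlier would print one line per deployment and one indented line per chainlet.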