Chainlet classes

APIs for creating user-defined Chainlets.

class truss_chains.ChainletBase

Base class for all chainlets.

Inheriting from this class adds validations to make sure subclasses adhere to the chainlet pattern and facilitates remote chainlet deployment.

Refer to the docs and this example chainlet for more guidance on how to create subclasses.
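
For orientation, a minimal sketch of a chainlet subclass (the class name and run_remote signature below are illustrative, not prescribed by the API):

import truss_chains as chains


class SayHello(chains.ChainletBase):
    # A chainlet exposes its logic via `run_remote`.
    async def run_remote(self, name: str) -> str:
        return f"Hello, {name}!"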

truss_chains.depends

Sets a “symbolic marker” to indicate to the framework that a chainlet is a dependency of another chainlet. The return value of depends is intended to be used as a default argument in a chainlet’s __init__-method. When deploying a chain remotely, a corresponding stub to the remote is injected in its place. In run_local mode an instance of a local chainlet is injected.

Refer to the docs and this example chainlet for more guidance on how to make one chainlet depend on another chainlet.

Despite the type annotation, this does not immediately provide a chainlet instance. A chainlet instance is only provided when deploying remotely or using run_local.

Parameters:

  • chainlet_cls (Type[ChainletBase]): The chainlet class of the dependency.
  • retries (int): The number of times to retry the remote chainlet in case of failures (e.g. due to transient network issues).
  • timeout_sec (int): Timeout for the HTTP request to this chainlet.
  • Returns: A “symbolic marker” to be used as a default argument in a chainlet’s initializer.
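
For illustration, a sketch of one chainlet depending on another (both chainlet classes are hypothetical):

import truss_chains as chains


class TextToNum(chains.ChainletBase):
    async def run_remote(self, text: str) -> int:
        return len(text)


class Entrypoint(chains.ChainletBase):
    # `depends` marks `TextToNum` as a dependency; at runtime a local instance
    # or a stub to the remote deployment is injected in its place.
    def __init__(self, text_to_num: TextToNum = chains.depends(TextToNum, retries=3)) -> None:
        self._text_to_num = text_to_num

    async def run_remote(self, text: str) -> int:
        return await self._text_to_num.run_remote(text)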

truss_chains.depends_context

Sets a “symbolic marker” for injecting a context object at runtime.

Refer to the docs and this example chainlet for more guidance on the __init__-signature of chainlets.

Despite the type annotation, this does not immediately provide a context instance. A context instance is only provided when deploying remotely or using run_local.

  • Returns: A “symbolic marker” to be used as a default argument in a chainlet’s initializer.
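
For illustration, a sketch of requesting the context in __init__ (the secret name is a placeholder):

import truss_chains as chains


class MyChainlet(chains.ChainletBase):
    def __init__(self, context: chains.DeploymentContext = chains.depends_context()) -> None:
        # A DeploymentContext instance is injected at runtime.
        self._hf_token = context.secrets["hf_access_token"]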

class truss_chains.DeploymentContext(Generic[UserConfigT])

Bases: pydantic.BaseModel

Bundles config values and resources needed to instantiate Chainlets.

The context can optionally be added as a trailing argument in a Chainlet’s __init__ method and then used to set up the chainlet (e.g. using a secret as an access token for downloading model weights).

Parameters:

  • data_dir (Path | None): The directory where the chainlet can store and access data, e.g. for downloading model weights.
  • user_config (UserConfigT): User-defined configuration for the chainlet.
  • chainlet_to_service (Mapping[str, ServiceDescriptor]): A mapping from chainlet names to service descriptors. This is used to create RPC sessions to dependency chainlets. It contains only the chainlet services that are dependencies of the current chainlet.
  • secrets (Mapping[str, str]): A mapping from secret names to secret values. It contains only the secrets that are listed in remote_config.assets.secret_keys of the current chainlet.
  • user_env (Mapping[str, str]): These values can be provided to the deploy command and customize the behavior of deployed chainlets, e.g. for differentiating between prod and dev versions of the same chain.

chainlet_to_service : Mapping[str, ServiceDescriptor]

data_dir : Path | None

get_baseten_api_key()

  • Return type: str

get_service_descriptor(chainlet_name)

Parameters:

  • chainlet_name (str): The name of the chainlet.

secrets : Mapping[str, str]

user_config : UserConfigT

user_env : Mapping[str, str]
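
A sketch of using the context to set up a chainlet; the weight-download helper and secret name are hypothetical:

import truss_chains as chains


def download_weights(target_dir, token):
    ...  # Hypothetical helper, not part of truss_chains.


class WeightsChainlet(chains.ChainletBase):
    def __init__(self, context: chains.DeploymentContext = chains.depends_context()) -> None:
        # Store downloaded data under `data_dir` and authorize with a secret.
        download_weights(target_dir=context.data_dir, token=context.secrets["hf_access_token"])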

class truss_chains.RPCOptions

Bases: pydantic.BaseModel

Options to customize RPCs to dependency chainlets.

Parameters:

  • timeout_sec (int)
  • retries (int)
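
For example, options can be bundled into a ServiceDescriptor (as used with run_local or StubBase.from_url further below); the values are arbitrary:

import truss_chains as chains

# Allow 60 seconds per RPC and retry up to 3 times (arbitrary example values).
options = chains.RPCOptions(timeout_sec=60, retries=3)

descriptor = chains.ServiceDescriptor(
    name="SomeChainlet",
    predict_url="https://...",  # Placeholder URL.
    options=options,
)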

truss_chains.mark_entrypoint

Decorator to mark a chainlet as the entrypoint of a chain.

This decorator can be applied to one chainlet in a source file; the CLI push command then simplifies because only the file, not the chainlet class in the file, needs to be specified.

Example usage:

import truss_chains as chains


@chains.mark_entrypoint
class MyChainlet(chains.ChainletBase):
    ...

Parameters:

  • cls (Type[ChainletBase]): The chainlet class.
  • Return type: Type[ChainletBase]

Remote Configuration

These data structures specify for each chainlet how it gets deployed remotely, e.g. dependencies and compute resources.

class truss_chains.RemoteConfig

Bases: pydantic.BaseModel

Bundles config values needed to deploy a chainlet remotely.

This is specified as a class variable for each chainlet class, e.g.:

import truss_chains as chains


class MyChainlet(chains.ChainletBase):
    remote_config = chains.RemoteConfig(
        docker_image=chains.DockerImage(
            pip_requirements=["torch==2.0.1", ...]
        ),
        compute=chains.Compute(cpu_count=2, gpu="A10G", ...),
        assets=chains.Assets(secret_keys=["hf_access_token"], ...),
    )

Parameters:

  • docker_image (DockerImage)
  • compute (Compute)
  • assets (Assets)
  • name (str | None)

class truss_chains.DockerImage

Bases: pydantic.BaseModel

Configures the docker image in which a remote chainlet is deployed.

Any paths are relative to the source file where DockerImage is defined and must be created with the helper function make_abs_path_here. This allows you, for example, to organize chainlets in different (potentially nested) modules and keep their requirements files right next to their Python source files.

Parameters:

  • base_image (BasetenImage | CustomImage): The base image used by the chainlet. Other dependencies and assets are included as additional layers on top of that image. You can choose a Baseten default image for a supported Python version (e.g. BasetenImage.PY311), which also includes GPU drivers if needed, or provide a custom image (e.g. CustomImage(image="python:3.11-slim")). Specification by string is deprecated.
  • pip_requirements_file (AbsPath | None): Path to a file containing pip requirements. The file content is naively concatenated with pip_requirements.
  • pip_requirements (list[str]): A list of pip requirements to install. The items are naively concatenated with the content of pip_requirements_file.
  • apt_requirements (list[str]): A list of apt requirements to install.
  • data_dir (AbsPath | None): Data from this directory is copied into the docker image and accessible to the remote chainlet at runtime.
  • external_package_dirs (list[AbsPath] | None): A list of directories containing additional Python packages outside the chain’s workspace dir, e.g. a shared library. This code is copied into the docker image and importable at runtime.
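
A sketch of a typical docker image configuration (the file name and requirements are examples):

import truss_chains as chains

docker_image = chains.DockerImage(
    base_image=chains.BasetenImage.PY311,
    # Resolved relative to the source file containing this definition.
    pip_requirements_file=chains.make_abs_path_here("requirements.txt"),
    pip_requirements=["torch==2.0.1"],
    apt_requirements=["ffmpeg"],
)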

class truss_chains.BasetenImage

Bases: Enum

Default images, curated by baseten, for different python versions. If a Chainlet uses GPUs, drivers will be included in the image.

PY310 = 'py310'

PY311 = 'py311'

PY39 = 'py39'

class truss_chains.CustomImage

Bases: pydantic.BaseModel

Configures the usage of a custom image hosted on dockerhub.

Parameters:

  • image (str): Reference to image on dockerhub.
  • python_executable_path (str | None): Absolute path to python executable (if default python is ambiguous).
  • docker_auth (DockerAuthSettings | None): See corresponding truss config.
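
For example (a sketch; the image and executable path are assumptions):

import truss_chains as chains

custom_image = chains.CustomImage(
    image="python:3.11-slim",
    # Only needed if the default python on the image is ambiguous (assumed path).
    python_executable_path="/usr/local/bin/python3.11",
)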

class truss_chains.Compute

Specifies which compute resources a chainlet has in the remote deployment.

Not all combinations can be exactly satisfied by the available hardware; in some cases a more powerful machine type is chosen (over-provisioned) to make sure the requirements are met. Refer to the baseten instance reference.

Parameters:

  • cpu_count (int): Minimum number of CPUs to allocate.
  • memory (str): Minimum memory to allocate, e.g. "2Gi" (2 gibibytes).
  • gpu (str | Accelerator | None): GPU accelerator type, e.g. "A10G", "A100"; refer to the truss config for more choices.
  • gpu_count (int): Number of GPUs to allocate.
  • predict_concurrency (int | Literal['cpu_count']): Number of concurrent requests a single replica of a deployed chainlet handles.

Concurrency concepts are explained in this guide. It is important to understand the difference between predict_concurrency and the concurrency target (used for autoscaling, i.e. adding or removing replicas). Furthermore, the predict_concurrency of a single instance is implemented in two ways:

  • Via python’s asyncio, if run_remote is an async def. This requires that run_remote yields to the event loop.
  • With a threadpool if it’s a synchronous function. This requires that the threads don’t have significant CPU load (due to the GIL).
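
A sketch of a typical compute specification (the values are examples):

import truss_chains as chains

compute = chains.Compute(
    cpu_count=2,
    memory="16Gi",
    gpu="A10G",
    gpu_count=1,
    # With an async `run_remote`, one replica can serve several requests concurrently.
    predict_concurrency=8,
)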

class truss_chains.Assets

Specifies which assets a chainlet can access in the remote deployment.

For example, model weight caching can be used like this:

import truss_chains as chains
from truss import truss_config

mistral_cache = truss_config.ModelRepo(
    repo_id="mistralai/Mistral-7B-Instruct-v0.2",
    allow_patterns=["*.json", "*.safetensors", ".model"],
)
chains.Assets(cached=[mistral_cache], ...)

See truss caching guide for more details on caching.

Parameters:

  • cached (Iterable[ModelRepo]): One or more truss_config.ModelRepo objects.
  • secret_keys (Iterable[str]): Names of secrets stored on baseten that the chainlet should have access to. You can manage secrets on baseten here.
  • external_data (Iterable[ExternalDataItem]): Data to be downloaded from public URLs and made available in the deployment (via context.data_dir). See here for more details.

Core

General framework and helper functions.

truss_chains.push

Deploys a chain remotely (with all dependent chainlets).

Parameters:

  • entrypoint (Type[ChainletBase]): The chainlet class that serves as the entrypoint to the chain.
  • chain_name (str): The name of the chain.
  • publish (bool): Whether to publish the chain as a published deployment (it is a draft deployment otherwise).
  • promote (bool): Whether to promote the chain to be the production deployment (this implies publishing as well).
  • user_env (Mapping[str, str] | None): These values can be provided to the push command and customize the behavior of deployed chainlets, e.g. for differentiating between prod and dev versions of the same chain.
  • only_generate_trusses (bool): Used for debugging purposes. If set to True, only the underlying truss models for the chainlets are generated in /tmp/.chains_generated.
  • remote (str | None): Name of a remote config in .trussrc. If not provided, you will be prompted to choose one.
  • Returns: A chain service handle to the deployed chain.
  • Return type: ChainService
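
A sketch of a programmatic deployment (the chain name is a placeholder and MyChainlet stands in for a real entrypoint chainlet):

import truss_chains as chains


class MyChainlet(chains.ChainletBase):
    ...


service = chains.push(
    entrypoint=MyChainlet,
    chain_name="my-chain",  # Placeholder name.
    publish=True,
)
print(service.status_page_url)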

truss_chains.deploy_remotely

Deprecated, use push instead.

class truss_chains.remote.ChainService

Handle for a deployed chain.

A ChainService is created and returned when using push. It bundles the individual services for each chainlet in the chain, and provides utilities to query their status, invoke the entrypoint etc.

get_info()

Queries the statuses of all chainlets in the chain.

  • Returns: A list of DeployedChainlet, i.e. (name, is_entrypoint, status, logs_url), one for each chainlet.
  • Return type: list[DeployedChainlet]

property name : str

run_remote(json)

Invokes the entrypoint with JSON data.

Parameters:

  • json (JSON dict): Input data to the entrypoint.
  • Returns: The JSON response.
  • Return type: Any
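
For example, invoking the entrypoint through the handle returned by push (a sketch; it assumes the JSON payload mirrors the entrypoint’s run_remote arguments, and the names and values are placeholders):

import truss_chains as chains


class HelloWorld(chains.ChainletBase):
    async def run_remote(self, max_value: int) -> int:
        ...


service = chains.push(entrypoint=HelloWorld, chain_name="hello-world")
response = service.run_remote({"max_value": 5})
print(response)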

property run_remote_url : str

URL to invoke the entrypoint.

property status_page_url : str

Link to status page on Baseten.

truss_chains.make_abs_path_here

Helper to specify file paths relative to the immediately calling module.

E.g. if you have a project structure like this:

root/
    chain.py
    common_requirements.txt
    sub_package/
        chainlet.py
        chainlet_requirements.txt

In root/sub_package/chainlet.py, you can then point to the requirements files like this:

shared = make_abs_path_here("../common_requirements.txt")
specific = make_abs_path_here("chainlet_requirements.txt")

This helper uses the directory of the immediately calling module as an absolute reference point for resolving the file location. Therefore, you MUST NOT wrap the instantiation of make_abs_path_here into a function (e.g. applying decorators) or use dynamic code execution.

Ok:

def foo(path: AbsPath):
    abs_path = path.abs_path


foo(make_abs_path_here("./somewhere"))

Not Ok:

def foo(path: str):
    dangerous_value = make_abs_path_here(path).abs_path


foo("./somewhere")

Parameters:

  • file_path (str): Absolute or relative path.
  • Return type: AbsPath

truss_chains.run_local

Context manager for local debug execution of a chain.

The arguments only need to be provided if the chainlets explicitly access any of the corresponding fields of DeploymentContext.

Parameters:

  • secrets (Mapping[str, str] | None): A dict of secret names and values to provide to the chainlets.
  • data_dir (Path | str | None): Path to a directory with data files.
  • chainlet_to_service (Mapping[str, ServiceDescriptor] | None): A dict of chainlet names to service descriptors.
  • user_env (Mapping[str, str] | None): See push.
  • Return type: ContextManager[None]

Example usage (as a trailing main section in a chain file):

import os
import truss_chains as chains


class HelloWorld(chains.ChainletBase):
    ...


if __name__ == "__main__":
    with chains.run_local(
        secrets={"some_token": os.environ["SOME_TOKEN"]},
        chainlet_to_service={
            "SomeChainlet": chains.ServiceDescriptor(
                name="SomeChainlet",
                predict_url="https://...",
                options=chains.RPCOptions(),
            )
        },
    ):
        hello_world_chain = HelloWorld()
        result = hello_world_chain.run_remote(max_value=5)

    print(result)

Refer to the local debugging guide for more details.

class truss_chains.ServiceDescriptor

Bases: pydantic.BaseModel

Bundles values to establish an RPC session to a dependency chainlet, specifically with StubBase.

Parameters:

  • name (str)
  • predict_url (str)
  • options (RPCOptions)

class truss_chains.StubBase

Base class for stubs that invoke remote chainlets.

It is used internally for RPCs to dependency chainlets, but it can also be used in user code to wrap a deployed truss model into the chains framework, e.g. like this:

import pydantic
import truss_chains as chains


class WhisperOutput(pydantic.BaseModel):
    ...


class DeployedWhisper(chains.StubBase):

    async def run_remote(self, audio_b64: str) -> WhisperOutput:
        resp = await self._remote.predict_async(
            json_payload={"audio": audio_b64})
        return WhisperOutput(text=resp["text"], language=resp["language"])


class MyChainlet(chains.ChainletBase):

    def __init__(self, ..., context=chains.depends_context()):
        ...
        self._whisper = DeployedWhisper.from_url(
            WHISPER_URL,
            context,
            options=chains.RPCOptions(retries=3),
        )

Parameters:

  • service_descriptor (ServiceDescriptor): Contains the URL and other configuration.
  • api_key (str): A baseten API key to authorize requests.

classmethod from_url(predict_url, context, options=None)

Factory method, convenient to use in a chainlet’s __init__-method.

Parameters:

  • predict_url (str): URL to predict endpoint of another chain / truss model.
  • context (DeploymentContext): Deployment context object, obtained in the chainlet’s __init__.
  • options (RPCOptions): RPC options, e.g. retries.

class truss_chains.RemoteErrorDetail

Bases: pydantic.BaseModel

When a remote chainlet raises an exception, this pydantic model contains information about the error and stack trace and is included in JSON form in the error response.

Parameters:

  • remote_name (str)
  • exception_cls_name (str)
  • exception_module_name (str | None)
  • exception_message (str)
  • user_stack_trace (list[StackFrame])

format()

Format the error for printing, similar to how Python formats exceptions with stack traces.

  • Return type: str
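
A sketch of rendering such an error client-side, assuming the error detail payload has already been extracted from the JSON error response and that pydantic v2 is in use:

import truss_chains as chains


def print_remote_error(error_detail_json: dict) -> None:
    # `model_validate` is the pydantic v2 API; on pydantic v1 use `parse_obj`.
    detail = chains.RemoteErrorDetail.model_validate(error_detail_json)
    print(detail.format())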