Chainlet classes

APIs for creating user-defined Chainlets.

class truss_chains.ChainletBase

Base class for all chainlets.

Inheriting from this class adds validations to make sure subclasses adhere to the chainlet pattern and facilitates remote chainlet deployment.

Refer to the docs and this example chainlet for more guidance on how to create subclasses.
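As a minimal sketch (the class name, method signature, and logic are illustrative, not part of the API):

```python
import truss_chains as chains


class SayHello(chains.ChainletBase):
    # A chainlet implements its logic in `run_remote`; the parameter
    # and return types here are illustrative.
    async def run_remote(self, name: str) -> str:
        return f"Hello, {name}!"
```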

class truss_chains.ModelBase

Base class for all standalone models.

Inheriting from this class adds validations to make sure subclasses adhere to the truss model pattern.

class truss_chains.EngineBuilderLLMChainlet

method final async run_remote(llm_input)

Parameters:

  • llm_input (EngineBuilderLLMInput) – OpenAI compatible request.
  • Returns: AsyncIterator[str]

function truss_chains.depends

Sets a “symbolic marker” to indicate to the framework that a chainlet is a dependency of another chainlet. The return value of depends is intended to be used as a default argument in a chainlet’s __init__-method. When deploying a chain remotely, a corresponding stub to the remote is injected in its place. In run_local mode an instance of a local chainlet is injected.

Refer to the docs and this example chainlet for more guidance on how to make one chainlet depend on another chainlet.

Despite the type annotation, this does not immediately provide a chainlet instance. A chainlet instance is only provided when deploying remotely or using run_local.

Parameters:

  • chainlet_cls (Type[ChainletBase]) – The chainlet class of the dependency.
  • retries (int, default: 1) – The number of times to retry the remote chainlet in case of failures (e.g. due to transient network issues). For streaming, retries are only made if the request fails before streaming any results back; failures mid-stream are not retried.
  • timeout_sec (float, default: 600.0) – Timeout for the HTTP request to this chainlet.
  • use_binary (bool, default: False) – Whether to send data in binary format. This can give a parsing speedup and message size reduction (~25%) for numpy arrays. For integration, use NumpyArrayField as a field type on pydantic models and set this option to True. For simple text data, there is no significant benefit.
  • Returns: A “symbolic marker” to be used as a default argument in a chainlet’s initializer.
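For example, a dependency is declared as a default argument and stored on the instance (the chainlet names and logic are illustrative):

```python
import truss_chains as chains


class Tokenizer(chains.ChainletBase):
    async def run_remote(self, text: str) -> list[str]:
        return text.split()


class MyChainlet(chains.ChainletBase):
    def __init__(self, tokenizer: Tokenizer = chains.depends(Tokenizer, retries=2)):
        # When deployed remotely, `tokenizer` is a stub that performs RPCs
        # to the `Tokenizer` chainlet; in `run_local` it is a local instance.
        self._tokenizer = tokenizer

    async def run_remote(self, text: str) -> int:
        tokens = await self._tokenizer.run_remote(text)
        return len(tokens)
```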

function truss_chains.depends_context

Sets a “symbolic marker” for injecting a context object at runtime.

Refer to the docs and this example chainlet for more guidance on the __init__-signature of chainlets.

Despite the type annotation, this does not immediately provide a context instance. A context instance is only provided when deploying remotely or using run_local.

  • Returns: A “symbolic marker” to be used as a default argument in a chainlet’s initializer.

class truss_chains.DeploymentContext

Bases: pydantic.BaseModel

Bundles config values and resources needed to instantiate Chainlets.

The context can optionally be added as a trailing argument in a Chainlet’s __init__ method and then used to set up the chainlet (e.g. using a secret as an access token for downloading model weights).

Parameters:

  • chainlet_to_service (Mapping[str, DeployedServiceDescriptor]) – A mapping from chainlet names to service descriptors. This is used to create RPC sessions to dependency chainlets. It contains only the chainlet services that are dependencies of the current chainlet.
  • secrets (Mapping[str, str]) – A mapping from secret names to secret values. It contains only the secrets that are listed in remote_config.assets.secret_keys of the current chainlet.
  • data_dir (Path | None, default: None) – The directory where the chainlet can store and access data, e.g. for downloading model weights.
  • environment (Environment | None, default: None) – The environment that the chainlet is deployed in. None if the chainlet is not associated with an environment.

method get_baseten_api_key()

  • Returns: str

method get_service_descriptor(chainlet_name)

Parameters:

  • chainlet_name (str) – The name of the chainlet.
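A typical use of the context is reading a secret during initialization (the secret name here is illustrative):

```python
import truss_chains as chains


class MyChainlet(chains.ChainletBase):
    def __init__(self, context: chains.DeploymentContext = chains.depends_context()):
        # `hf_access_token` must be listed in
        # `remote_config.assets.secret_keys` of this chainlet.
        hf_token = context.secrets["hf_access_token"]
        ...  # e.g. use the token to download model weights.
```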

class truss_chains.Environment

Bases: pydantic.BaseModel

The environment the chainlet is deployed in.

  • Parameters: name (str) – The name of the environment.

class truss_chains.ChainletOptions

Bases: pydantic.BaseModel

Parameters:

  • enable_b10_tracing (bool, default: False) – Enables baseten-internal trace data collection. This helps baseten engineers better analyze chain performance in case of issues. It is independent of a potentially user-configured tracing instrumentation. Turning this on could add performance overhead.
  • enable_debug_logs (bool, default: False) – Sets the log level to debug in the deployed server.
  • env_variables (Mapping[str, str], default: {}) – Static environment variables available to the deployed chainlet.
  • health_checks (HealthChecks, default: truss.base.truss_config.HealthChecks()) – Configures health checks for the chainlet. See guide.
  • metadata (JsonValue | None, default: None) – Arbitrary JSON object to describe the chainlet.

class truss_chains.RPCOptions

Bases: pydantic.BaseModel

Options to customize RPCs to dependency chainlets.

Parameters:

  • retries (int, default: 1) – The number of times to retry the remote chainlet in case of failures (e.g. due to transient network issues). For streaming, retries are only made if the request fails before streaming any results back; failures mid-stream are not retried.
  • timeout_sec (float, default: 600.0) – Timeout for the HTTP request to this chainlet.
  • use_binary (bool, default: False) – Whether to send data in binary format. This can give a parsing speedup and message size reduction (~25%) for numpy arrays. For integration, use NumpyArrayField as a field type on pydantic models and set this option to True. For simple text data, there is no significant benefit.

function truss_chains.mark_entrypoint

Decorator to mark a chainlet as the entrypoint of a chain.

This decorator can be applied to one chainlet in a source file; the CLI push command is then simplified: only the file, not the class within it, must be specified.

Optionally a display name for the Chain (not the Chainlet) can be set (effectively giving a custom default value for the name arg of the CLI push command).

Example usage:

import truss_chains as chains

@chains.mark_entrypoint
class MyChainlet(ChainletBase):
    ...

# OR with custom Chain name.
@chains.mark_entrypoint("My Chain Name")
class MyChainlet(ChainletBase):
    ...

Remote Configuration

These data structures specify for each chainlet how it gets deployed remotely, e.g. dependencies and compute resources.

class truss_chains.RemoteConfig

Bases: pydantic.BaseModel

Bundles config values needed to deploy a chainlet remotely.

This is specified as a class variable for each chainlet class, e.g.:

import truss_chains as chains


class MyChainlet(chains.ChainletBase):
    remote_config = chains.RemoteConfig(
        docker_image=chains.DockerImage(
            pip_requirements=["torch==2.0.1", ...]
        ),
        compute=chains.Compute(cpu_count=2, gpu="A10G", ...),
        assets=chains.Assets(secret_keys=["hf_access_token"], ...),
    )

Parameters:

  • docker_image (DockerImage, default: truss_chains.DockerImage())
  • compute (Compute, default: truss_chains.Compute())
  • assets (Assets, default: truss_chains.Assets())
  • name (str | None, default: None)
  • options (ChainletOptions, default: truss_chains.ChainletOptions())

class truss_chains.DockerImage

Bases: pydantic.BaseModel

Configures the docker image in which a remote chainlet is deployed.

Any paths are relative to the source file where DockerImage is defined and must be created with the helper function make_abs_path_here. This allows you, for example, to organize chainlets in different (potentially nested) modules and keep their requirement files right next to their python source files.

Parameters:

  • base_image (BasetenImage | CustomImage, default: truss_chains.BasetenImage()) – The base image used by the chainlet. Other dependencies and assets are included as additional layers on top of that image. You can choose a Baseten default image for a supported python version (e.g. BasetenImage.PY311), which also includes GPU drivers if needed, or provide a custom image (e.g. CustomImage(image="python:3.11-slim")).
  • pip_requirements_file (AbsPath | None, default: None) – Path to a file containing pip requirements. The file content is naively concatenated with pip_requirements.
  • pip_requirements (list[str], default: []) – A list of pip requirements to install. The items are naively concatenated with the content of pip_requirements_file.
  • apt_requirements (list[str], default: []) – A list of apt requirements to install.
  • data_dir (AbsPath | None, default: None) – Data from this directory is copied into the docker image and accessible to the remote chainlet at runtime.
  • external_package_dirs (list[AbsPath] | None, default: None) – A list of directories containing additional python packages outside the chain’s workspace dir, e.g. a shared library. This code is copied into the docker image and importable at runtime.
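A sketch of a typical configuration (the file path and requirements are illustrative):

```python
import truss_chains as chains

docker_image = chains.DockerImage(
    base_image=chains.BasetenImage.PY311,
    # Resolved relative to the source file containing this definition.
    pip_requirements_file=chains.make_abs_path_here("requirements.txt"),
    pip_requirements=["numpy"],
)
```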

class truss_chains.BasetenImage

Bases: Enum

Default images, curated by baseten, for different python versions. If a Chainlet uses GPUs, drivers will be included in the image.

  • PY39 – py39
  • PY310 – py310
  • PY311 – py311

class truss_chains.CustomImage

Bases: pydantic.BaseModel

Configures the usage of a custom image hosted on dockerhub.

Parameters:

  • image (str) – Reference to the image on dockerhub.
  • python_executable_path (str | None, default: None) – Absolute path to the python executable (if the default python is ambiguous).
  • docker_auth (DockerAuthSettings | None, default: None) – See the corresponding truss config.

class truss_chains.Compute

Specifies which compute resources a chainlet has in the remote deployment.

Not all combinations can be exactly satisfied by available hardware; in some cases a more powerful machine type is chosen to make sure the requirements are met (i.e. resources may be over-provisioned). Refer to the baseten instance reference.

Parameters:

  • cpu_count (int, default: 1) – Minimum number of CPUs to allocate.
  • memory (str, default: '2Gi') – Minimum memory to allocate, e.g. “2Gi” (2 gibibytes).
  • gpu (str | Accelerator | None, default: None) – GPU accelerator type, e.g. “A10G”, “A100”; refer to the truss config for more choices.
  • gpu_count (int, default: 1) – Number of GPUs to allocate.
  • predict_concurrency (int | Literal['cpu_count'], default: 1) – Number of concurrent requests a single replica of a deployed chainlet handles.

Concurrency concepts are explained in this guide. It is important to understand the difference between predict_concurrency and the concurrency target (used for autoscaling, i.e. adding or removing replicas). Furthermore, the predict_concurrency of a single instance is implemented in two ways:

  • Via python’s asyncio, if run_remote is an async def. This requires that run_remote yields to the event loop.
  • With a threadpool if it’s a synchronous function. This requires that the threads don’t have significant CPU load (due to the GIL).
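For example, to let a single replica handle several requests concurrently via asyncio (the values and logic are illustrative):

```python
import asyncio

import truss_chains as chains


class MyChainlet(chains.ChainletBase):
    remote_config = chains.RemoteConfig(
        compute=chains.Compute(cpu_count=2, memory="4Gi", predict_concurrency=8),
    )

    async def run_remote(self, prompt: str) -> str:
        # Must yield to the event loop (e.g. by awaiting I/O) so that
        # up to 8 requests can actually interleave on this replica.
        await asyncio.sleep(0.1)
        return prompt.upper()
```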

class truss_chains.Assets

Specifies which assets a chainlet can access in the remote deployment.

For example, model weight caching can be used like this:

import truss_chains as chains
from truss.base import truss_config

mistral_cache = truss_config.ModelRepo(
    repo_id="mistralai/Mistral-7B-Instruct-v0.2",
    allow_patterns=["*.json", "*.safetensors", ".model"]
)
chains.Assets(cached=[mistral_cache], ...)

Parameters:

  • cached (Iterable[ModelRepo], default: ()) – One or more truss_config.ModelRepo objects.
  • secret_keys (Iterable[str], default: ()) – Names of secrets stored on baseten that the chainlet should have access to. You can manage secrets on baseten here.
  • external_data (Iterable[ExternalDataItem], default: ()) – Data to be downloaded from public URLs and made available in the deployment (via context.data_dir).

Core

General framework and helper functions.

function truss_chains.push

Deploys a chain remotely (with all dependent chainlets).

Parameters:

  • entrypoint (Type[ChainletT]) – The chainlet class that serves as the entrypoint to the chain.
  • chain_name (str) – The name of the chain.
  • publish (bool, default: True) – Whether to publish the chain as a published deployment (it is a draft deployment otherwise).
  • promote (bool, default: True) – Whether to promote the chain to be the production deployment (this implies publishing as well).
  • only_generate_trusses (bool, default: False) – Used for debugging purposes. If set to True, only the underlying truss models for the chainlets are generated in /tmp/.chains_generated.
  • remote (str, default: 'baseten') – Name of a remote config in .trussrc. If not provided, it is prompted for interactively.
  • environment (str | None, default: None) – The name of an environment to promote the deployment into.
  • progress_bar (Type[progress.Progress] | None, default: None) – Optional rich.progress.Progress if output is desired.
  • include_git_info (bool, default: False) – Whether to attach git versioning info (sha, branch, tag) to deployments made from within a git repo. If set to True in .trussrc, it will always be attached.
  • Returns: A ChainService handle to the deployed chain.
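A sketch of a programmatic deployment (the module and chainlet names are illustrative; push is more commonly invoked via the CLI push command):

```python
import truss_chains as chains

# Assumes `MyChainlet` is an entrypoint chainlet defined elsewhere.
from my_chain import MyChainlet

service = chains.push(
    MyChainlet,
    chain_name="my-chain",
    publish=True,
    promote=False,
)
print(service.run_remote_url)
```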

class truss_chains.deployment.deployment_client.ChainService

Handle for a deployed chain.

A ChainService is created and returned when using push. It bundles the individual services for each chainlet in the chain, and provides utilities to query their status, invoke the entrypoint etc.

method get_info()

Queries the statuses of all chainlets in the chain.

  • Returns: A list of DeployedChainlet, i.e. (name, is_entrypoint, status, logs_url), for each chainlet.

property name : str

method run_remote(json)

Invokes the entrypoint with JSON data.

Parameters:

  • json (JSON dict) – Input data to the entrypoint.
  • Returns: The JSON response.

property run_remote_url : str

URL to invoke the entrypoint.

property status_page_url : str

Link to status page on Baseten.

function truss_chains.make_abs_path_here

Helper to specify file paths relative to the immediately calling module.

E.g. if you have a project structure like this:

root/
    chain.py
    common_requirements.text
    sub_package/
        chainlet.py
        chainlet_requirements.txt

You can now in root/sub_package/chainlet.py point to the requirements file like this:

shared = make_abs_path_here("../common_requirements.text")
specific = make_abs_path_here("chainlet_requirements.txt")

This helper uses the directory of the immediately calling module as an absolute reference point for resolving the file location. Therefore, you MUST NOT wrap the instantiation of make_abs_path_here into a function (e.g. applying decorators) or use dynamic code execution.

Ok:

def foo(path: AbsPath):
    abs_path = path.abs_path


foo(make_abs_path_here("./somewhere"))

Not Ok:

def foo(path: str):
    dangerous_value = make_abs_path_here(path).abs_path


foo("./somewhere")

Parameters:

  • file_path (str) – Absolute or relative path.
  • Returns: AbsPath

function truss_chains.run_local

Context manager for local debug execution of a chain.

The arguments only need to be provided if the chainlets explicitly access any of the corresponding fields of DeploymentContext.

Parameters:

  • secrets (Mapping[str, str] | None, default: None) – A dict of secret names and values to provide to the chainlets.
  • data_dir (Path | str | None, default: None) – Path to a directory with data files.
  • chainlet_to_service (Mapping[str, DeployedServiceDescriptor] | None, default: None) – A dict of chainlet names to service descriptors.

Example usage (as trailing main section in a chain file):

import os
import truss_chains as chains


class HelloWorld(chains.ChainletBase):
    ...


if __name__ == "__main__":
    with chains.run_local(
        secrets={"some_token": os.environ["SOME_TOKEN"]},
        chainlet_to_service={
            "SomeChainlet": chains.DeployedServiceDescriptor(
                name="SomeChainlet",
                display_name="SomeChainlet",
                predict_url="https://...",
                options=chains.RPCOptions(),
            )
        },
    ):
        hello_world_chain = HelloWorld()
        result = hello_world_chain.run_remote(max_value=5)

    print(result)

Refer to the local debugging guide for more details.

class truss_chains.DeployedServiceDescriptor

Bases: pydantic.BaseModel

Bundles values to establish an RPC session to a dependency chainlet, specifically with StubBase.

Parameters:

  • name (str)
  • display_name (str)
  • options (RPCOptions)
  • predict_url (str | None, default: None)
  • internal_url (InternalURL, default: None)

class truss_chains.StubBase

Bases: BasetenSession, ABC

Base class for stubs that invoke remote chainlets.

Extends BasetenSession with methods for data serialization, de-serialization and invoking other endpoints.

It is used internally for RPCs to dependency chainlets, but it can also be used in user code for wrapping a deployed truss model into the Chains framework. It flexibly supports JSON and pydantic inputs and outputs. Example usage:

from typing import Any

import pydantic
import truss_chains as chains


class WhisperInput(pydantic.BaseModel):
    ...


class WhisperOutput(pydantic.BaseModel):
    ...


class DeployedWhisper(chains.StubBase):
    # Input JSON, output JSON.
    async def run_remote(self, audio_b64: str) -> Any:
        resp = await self.predict_async(inputs={"audio": audio_b64})
        # resp == {"text": ..., "language": ...}
        return resp

    # OR input JSON, output pydantic model.
    async def run_remote(self, audio_b64: str) -> WhisperOutput:
        return await self.predict_async(
            inputs={"audio": audio_b64}, output_model=WhisperOutput)

    # OR input and output are pydantic models.
    async def run_remote(self, data: WhisperInput) -> WhisperOutput:
        return await self.predict_async(data, output_model=WhisperOutput)


class MyChainlet(chains.ChainletBase):

    def __init__(self, ..., context=chains.depends_context()):
        ...
        self._whisper = DeployedWhisper.from_url(
            WHISPER_URL,
            context,
            options=chains.RPCOptions(retries=3),
        )

    async def run_remote(self, ...):
        await self._whisper.run_remote(...)

Parameters:

  • service_descriptor (DeployedServiceDescriptor) – Contains the URL and other configuration.
  • api_key (str) – A baseten API key to authorize requests.

classmethod from_url(predict_url, context_or_api_key, options=None)

Factory method, convenient to be used in chainlet’s __init__-method.

Parameters:

  • predict_url (str) – URL to the predict endpoint of another chain / truss model.
  • context_or_api_key (DeploymentContext | str) – Deployment context object, obtained in the chainlet’s __init__, or a Baseten API key.
  • options (RPCOptions) – RPC options, e.g. retries.

Invocation Methods

  • async predict_async(inputs: PydanticModel, output_model: Type[PydanticModel]) → PydanticModel
  • async predict_async(inputs: JSON, output_model: Type[PydanticModel]) → PydanticModel
  • async predict_async(inputs: JSON) → JSON
  • async predict_async_stream(inputs: PydanticModel | JSON) -> AsyncIterator[bytes]

Deprecated synchronous methods:

  • predict_sync(inputs: PydanticModel, output_model: Type[PydanticModel]) → PydanticModel
  • predict_sync(inputs: JSON, output_model: Type[PydanticModel]) → PydanticModel
  • predict_sync(inputs: JSON) → JSON

class truss_chains.RemoteErrorDetail

Bases: pydantic.BaseModel

When a remote chainlet raises an exception, this pydantic model contains information about the error and stack trace and is included in JSON form in the error response.

Parameters:

  • exception_cls_name (str)
  • exception_module_name (str | None)
  • exception_message (str)
  • user_stack_trace (list[StackFrame])

method format()

Format the error for printing, similar to how Python formats exceptions with stack traces.

  • Returns: str

class truss_chains.GenericRemoteException

Bases: Exception

Raised when calling a remote chainlet results in an error and it is not possible to re-raise in the caller the same exception that was raised remotely.