Caching objects during the build stage is advantageous, especially for reducing cold start times. While model_cache and external_data work well in many scenarios, you may need finer control over what data gets cached and where it is stored.

The build_commands feature does just that: it lets you run custom Docker commands at build time.

To give a few examples, you can clone GitHub repositories, download models, and even create directories during the build stage!

Using run commands during the Docker build

Build commands are accessible in Truss via the config.yaml file.

build_commands:
- git clone https://github.com/comfyanonymous/ComfyUI.git
- cd ComfyUI && git checkout b1fd26fe9e55163f780bf9e5f56bf9bf5f035c93 && pip install -r requirements.txt
model_name: Build Commands Demo
python_version: py310
resources:
  accelerator: A100
  use_gpu: true

In the example above, a Git repository is cloned and a set of Python requirements is installed. All of this happens during the container build step, so the GitHub repository and the Python packages are loaded from cache during deployment rather than fetched at startup. Note that each build command starts from the same working directory, which is why cd is chained with && on lines that need it.
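To see what this buys you, consider the slow path the build cache avoids: cloning the repository at startup. A minimal sketch (the fallback helper below is a hypothetical illustration, not part of Truss; the path and URL come from the config above):

```python
import os
import subprocess

REPO_DIR = "ComfyUI"
REPO_URL = "https://github.com/comfyanonymous/ComfyUI.git"

def ensure_repo(path: str = REPO_DIR) -> bool:
    """Return True if the repo was already baked into the image at build
    time; otherwise clone it at startup (the slow path)."""
    if os.path.isdir(path):
        return True
    subprocess.run(["git", "clone", REPO_URL, path], check=True)
    return False
```

With build_commands, the directory already exists inside the image, so the fast path is taken on every cold start.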

Creating directories in your Truss

Sometimes your Truss relies on a large codebase. You can now add files or directories to this codebase directly through build commands.

build_commands:
- git clone https://github.com/comfyanonymous/ComfyUI.git
- cd ComfyUI && mkdir ipadapter
- cd ComfyUI && mkdir instantid
model_name: Build Commands Demo
python_version: py310
resources:
  accelerator: A100
  use_gpu: true
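Since a typo in a build command only surfaces once the model runs, it can help to sanity-check the resulting layout from your model code. A minimal sketch, assuming the directory layout produced by the config above (the helper itself is hypothetical, not part of Truss):

```python
import os

# Directories the build commands above should have created (assumed layout).
EXPECTED_DIRS = ["ComfyUI/ipadapter", "ComfyUI/instantid"]

def missing_dirs(root: str = ".") -> list[str]:
    """Return any expected directories that are absent under root."""
    return [d for d in EXPECTED_DIRS if not os.path.isdir(os.path.join(root, d))]
```

Calling this in your model's load() and raising on a non-empty result turns a silent misconfiguration into an immediate, readable error.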

Yet another way to cache your model weights

Best practices for caching model weights

While you can use the build_commands feature to cache model weights, it is best suited for weights under 10 GB. To cache larger model weights, the model_cache and external_data features offer more robust capabilities.

If you’re familiar with Linux or Unix systems, you may have used the wget tool to download files. Build commands let you use wget to download model weights and store them wherever you like in the Truss. Here’s an example:

build_commands:
- git clone https://github.com/comfyanonymous/ComfyUI.git
- cd ComfyUI && pip install -r requirements.txt
- cd ComfyUI/custom_nodes && git clone https://github.com/Fannovel16/comfyui_controlnet_aux --recursive && cd comfyui_controlnet_aux && pip install -r requirements.txt
- cd ComfyUI/models/controlnet && wget -O control-lora-canny-rank256.safetensors https://huggingface.co/stabilityai/control-lora/resolve/main/control-LoRAs-rank256/control-lora-canny-rank256.safetensors
- cd ComfyUI/models/controlnet && wget -O control-lora-depth-rank256.safetensors https://huggingface.co/stabilityai/control-lora/resolve/main/control-LoRAs-rank256/control-lora-depth-rank256.safetensors
- cd ComfyUI/models/checkpoints && wget -O dreamshaperXL_v21TurboDPMSDE.safetensors https://civitai.com/api/download/models/351306
- cd ComfyUI/models/loras && wget -O StudioGhibli.Redmond-StdGBRRedmAF-StudioGhibli.safetensors https://huggingface.co/artificialguybr/StudioGhibli.Redmond-V2/resolve/main/StudioGhibli.Redmond-StdGBRRedmAF-StudioGhibli.safetensors
model_name: Build Commands Demo
python_version: py310
resources:
  accelerator: A100
  use_gpu: true
system_packages:
  - wget
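Because the weights are baked into the image, serving-time code can load them straight from disk with no download step. A minimal sketch, assuming the working directory matches the paths above (the WEIGHTS_ROOT constant and weight_path helper are illustrative, not Truss APIs):

```python
import os

# Where the build commands above placed the weights (assumed layout).
WEIGHTS_ROOT = "ComfyUI/models"

def weight_path(subdir: str, filename: str) -> str:
    """Return the on-disk path of a weight file cached at build time."""
    return os.path.join(WEIGHTS_ROOT, subdir, filename)

# In load(), these files already exist inside the image:
canny_path = weight_path("controlnet", "control-lora-canny-rank256.safetensors")
checkpoint_path = weight_path("checkpoints", "dreamshaperXL_v21TurboDPMSDE.safetensors")
```

Passing the -O flag to wget, as in the config above, pins each file to a predictable name, which is what makes these hard-coded paths reliable.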

Using build_commands, you can run any shell command that you would normally run locally. The main benefit is that everything runs once at build time and is cached in the image, which helps reduce the cold start time for your model.