Interacting with models
We provide three ways to interact with a deployed model: through a worklet, by calling the model's REST endpoint directly, or by running inference with the BaseTen client. The examples below illustrate all three paths.

Type of inputs

Model inputs are expected to be JSON-serializable and to deserialize into a Python list.
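
For example, a list of dictionaries and a list of lists both qualify. A quick, purely illustrative way to check a payload locally is to confirm it survives JSON serialization:

import json

# Illustrative payloads; each must serialize to JSON without error.
list_of_dicts = [{"a": 1, "b": 2, "c": 3}]
list_of_lists = [[1, 2, 3, 4]]

json.dumps(list_of_dicts)
json.dumps(list_of_lists)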

Motivating Example: Custom Model

Assume a custom model defined by the following code: a very simple model that, given a list of dictionary inputs, returns the sum of the keys 'a', 'b', and 'c' for each item.
class InteractionModel:

    def load(self):
        pass

    def predict(self, inputs: list) -> dict:
        results = []
        for model_input in inputs:
            model_result = model_input['a'] + model_input['b'] + model_input['c']
            results.append(model_result)
        return {
            'predictions': results
        }
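
Before deploying, the class can be exercised locally to confirm the output shape; a minimal sketch:

model = InteractionModel()
model.load()
model.predict([{'a': 1, 'b': 2, 'c': 3}, {'a': 4, 'b': 5, 'c': 6}])
# -> {'predictions': [6, 15]}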

REST API

Once this model is deployed on BaseTen infrastructure, it can be called directly via the REST API as follows.
$ curl -X POST https://app.baseten.co/model_versions/<model_version_id>/predict \
    -H 'Authorization: Api-Key <api-key>' \
    -d '{"inputs": [{"a": 1, "b": 2, "c": 3}]}'

# Response
{
  "model_id": <model_id>,
  "model_version_id": <model_version_id>,
  "predictions": [
    {
      "predictions": [
        6
      ]
    }
  ]
}
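
The same request can also be made from Python rather than cURL; a minimal sketch with the requests library, assuming the same endpoint and placeholders as above:

import requests

resp = requests.post(
    'https://app.baseten.co/model_versions/<model_version_id>/predict',
    headers={'Authorization': 'Api-Key <api-key>'},
    json={'inputs': [{'a': 1, 'b': 2, 'c': 3}]},
)
print(resp.json())  # e.g. {..., 'predictions': [{'predictions': [6]}]}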

Worklet Graph

In an application generated on BaseTen, create a Model node referencing the deployed model and a Code node preceding it.
Then, in the Code node, implement the following Python.
def model_input(node_input, env, context):
    return [{'a': 1, 'b': 2, 'c': 3}]
Once saved, the entire application can be run to see the output of the worklet.
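
The Code node above hard-codes its output; a variant that forwards input supplied when the worklet is invoked (a sketch, assuming the caller passes an 'inputs' key) could look like:

def model_input(node_input, env, context):
    # Fall back to a fixed example when the caller supplies no inputs.
    return node_input.get('inputs', [{'a': 1, 'b': 2, 'c': 3}])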

BaseTen Client

With a model deployed on BaseTen infrastructure, it can be queried with the BaseTen client package in Python like so.
import baseten

model = baseten.deployed_model_version_id(<model_version_id>)
model.predict([{"a": 1, "b": 2, "c": 3}])
# ->
[{'predictions': [6]}]

Motivating Example: Iris RFC

In this example we will demonstrate interacting with the Iris Random Forest Classifier from the BaseTen Model Zoo. The Iris model expects a list of lists with dimensions N x 4, where the features correspond to sepal length, sepal width, petal length, and petal width. For this example we are focusing on structure more than on the data itself, so the input will be kept simple, e.g. [[1, 2, 3, 4]].
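
To keep the feature ordering straight, a row can be assembled from named measurements before being sent to the model; the values below are purely illustrative:

# Illustrative measurements in centimeters; the ordering must match the model's
# expected features: sepal length, sepal width, petal length, petal width.
flower = {
    'sepal_length': 5.1,
    'sepal_width': 3.5,
    'petal_length': 1.4,
    'petal_width': 0.2,
}
model_input = [[
    flower['sepal_length'],
    flower['sepal_width'],
    flower['petal_length'],
    flower['petal_width'],
]]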

REST API

Here we will call the model via the REST endpoint using cURL.
$ curl -X POST https://app.baseten.co/model_versions/<model_version_id>/predict \
    -H 'Authorization: Api-Key <api-key>' \
    -d '{"inputs": [[1, 2, 3, 4]]}'

# Response
{
  "model_id": <model_id>,
  "model_version_id": <model_version_id>,
  "predictions": [
    2
  ],
  "probabilities": [
    [
      0,
      0.37,
      0.63
    ]
  ]
}
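
The response pairs the predicted class with per-class probabilities. A small sketch of reading both fields out of the parsed response (response_text here is a hypothetical variable holding the JSON body shown above):

import json

response = json.loads(response_text)  # response_text: hypothetical string containing the JSON body above
predicted_class = response['predictions'][0]         # e.g. 2
class_probabilities = response['probabilities'][0]   # e.g. [0, 0.37, 0.63]
print(predicted_class, max(class_probabilities))     # e.g. 2 0.63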

Worklet Graph

The sample demo application for the Iris RFC demonstrates calling the model with a 10 x 4 input.
import numpy as np

NUMBER_OF_IRISES = 10
SEPAL_LENGTH_MEAN = 5.84
SEPAL_WIDTH_MEAN = 3.06
PETAL_LENGTH_MEAN = 3.76
PETAL_WIDTH_MEAN = 1.20
SEPAL_LENGTH_STD = 0.83
SEPAL_WIDTH_STD = 0.44
PETAL_LENGTH_STD = 1.77
PETAL_WIDTH_STD = 0.76

def generate_random_irises(number_of_irises):
    random_irises = np.array([
        np.random.randn(number_of_irises) * SEPAL_LENGTH_STD + SEPAL_LENGTH_MEAN,
        np.random.randn(number_of_irises) * SEPAL_WIDTH_STD + SEPAL_WIDTH_MEAN,
        np.random.randn(number_of_irises) * PETAL_LENGTH_STD + PETAL_LENGTH_MEAN,
        np.random.randn(number_of_irises) * PETAL_WIDTH_STD + PETAL_WIDTH_MEAN,
    ]).T.tolist()
    return random_irises

def process_input(node_input, env, context):
    inputs = node_input.get('inputs')
    if not inputs:
        number_of_irises = node_input.get('number_of_irises', NUMBER_OF_IRISES)
        inputs = generate_random_irises(number_of_irises)
    env['model_input'] = inputs
    return inputs
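
To confirm the generator produces an N x 4 list before wiring it into the worklet, a quick local check (illustrative only):

irises = generate_random_irises(NUMBER_OF_IRISES)
print(len(irises), len(irises[0]))  # -> 10 4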
The output of the worklet is the predicted class for each of the 10 examples.

BaseTen Client

Here we will query the Iris model with two inputs.
import baseten

model = baseten.deployed_model_version_id(<model_version_id>)
model.predict([[1, 2, 3, 4], [5, 6, 7, 8]])
# ->
[2, 2]
