Nomic Embed v1.5
SOTA text embedding model with variable dimensionality — outperforms OpenAI text-embedding-ada-002 and text-embedding-3-small models.
Example usage
Nomic Embed v1.5 is a state-of-the-art text embedding model with two special features:
- You can choose whether to optimize the embeddings for retrieval, search, clustering, or classification.
- You can trade off between cost and accuracy by choosing your own dimensionality, thanks to Matryoshka Representation Learning (see the sketch after this list).
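
To give a rough sense of how that trade-off works, a Matryoshka-style embedding can be shortened by keeping only its leading dimensions and renormalizing. The snippet below is a simplified NumPy sketch of that idea, not the model's own code; the deployed model performs the reduction for you when you pass the `dimensionality` parameter.

```python
import numpy as np

def shrink_embedding(vector: np.ndarray, dimensionality: int) -> np.ndarray:
    """Keep the leading dimensions of a Matryoshka-style embedding and renormalize.

    Simplified illustration only -- the deployed model applies the reduction
    itself when you pass the `dimensionality` parameter.
    """
    truncated = vector[:dimensionality]
    return truncated / np.linalg.norm(truncated)

# Stand-in for a full 768-dimensional embedding returned by the model.
full = np.random.default_rng(0).normal(size=768)
small = shrink_embedding(full, 256)  # cheaper 256-dimensional version
print(small.shape)  # (256,)
```
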
Nomic Embed v1.5 takes the following parameters:
- `texts`: the strings to embed.
- `task_type`: the task to optimize the embedding for. Can be `search_document` (default), `search_query`, `clustering`, or `classification`.
- `dimensionality`: the size of each output vector, any integer between `64` and `768` (default).
This code sample demonstrates embedding a set of sentences for retrieval with a dimensionality of 512.
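
Below is a minimal sketch of such a request, assuming the model is served behind an HTTP endpoint; the endpoint URL, API key, and response shape are placeholders to replace with the values for your own deployment.

```python
import requests

# Placeholder values -- substitute the URL and credentials of your own deployment.
ENDPOINT_URL = "https://<your-deployment-host>/predict"
API_KEY = "<your-api-key>"

payload = {
    # texts: the strings to embed.
    "texts": [
        "Nomic Embed v1.5 supports variable dimensionality.",
        "Matryoshka Representation Learning lets you shrink embeddings.",
        "Shorter vectors are cheaper to store and to search over.",
    ],
    # task_type: optimize the embeddings for document retrieval.
    "task_type": "search_document",
    # dimensionality: trade accuracy for cost with a smaller vector (64-768).
    "dimensionality": 512,
}

response = requests.post(
    ENDPOINT_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},  # placeholder auth scheme
    json=payload,
)
response.raise_for_status()

# Assumed response shape: a JSON list with one 512-dimensional vector per input text.
embeddings = response.json()
print(len(embeddings), len(embeddings[0]))
```

To embed user queries for searching against these documents, send the same request with `task_type` set to `search_query`.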