
Embedding: thenlper/gte-large

info

See the Hugging Face model page for more model details.

See Query a model for how to use the embedding model with Anyscale Endpoints.

About this model

Model name to use in API calls:

thenlper/gte-large
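As a sketch of how the model name plugs into an API call, the body of an OpenAI-compatible `/v1/embeddings` request could be built as below. The endpoint path and field names here are assumptions based on the OpenAI-compatible convention; see Query a model for the authoritative Anyscale Endpoints usage.

```python
import json

# Model name exactly as it appears above.
MODEL = "thenlper/gte-large"

def build_embedding_request(texts):
    """Sketch: JSON body for an OpenAI-compatible embeddings request.

    "model" and "input" follow the OpenAI-compatible schema; the actual
    endpoint URL and authentication are covered in Query a model.
    """
    return {"model": MODEL, "input": texts}

body = build_embedding_request(["An example sentence to embed."])
print(json.dumps(body))
```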

The GTE models are trained on a large-scale corpus of relevance text pairs covering a wide range of domains and scenarios. This makes them useful for a variety of downstream text embedding tasks, including information retrieval, semantic textual similarity, and text reranking.
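Once embeddings are obtained, tasks like semantic textual similarity are typically scored with cosine similarity between the vectors. A minimal sketch (the toy 3-dimensional vectors stand in for real gte-large embeddings):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for real embeddings returned by the model.
v1 = [1.0, 0.0, 1.0]
v2 = [1.0, 0.0, 1.0]
v3 = [0.0, 1.0, 0.0]

print(cosine_similarity(v1, v2))  # identical vectors score 1.0
print(cosine_similarity(v1, v3))  # orthogonal vectors score 0.0
```

Higher scores mean the two texts are closer in meaning; retrieval and reranking rank candidates by this score against a query embedding.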

Model Developers: Hugging Face

Input: text only.

Output: text embeddings only.

Maximum Input Length: 512 tokens

License: MIT