Gemini Embedding
Use Google’s Gemini models to generate high-performance text embeddings for vector databases.
EmbeddingModels represent services that generate vector representations of text data. In the MCP Toolbox, these models enable Semantic Queries, allowing Tools to automatically convert human-readable text into numerical vectors before using them in a query.
This is primarily used in two scenarios:

- Vector Ingestion: Converting a text parameter into a vector string during an INSERT operation.
- Semantic Search: Converting a natural language query into a vector to perform similarity searches.
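The statements in the configuration below read from and write to a documents table with content and embedding columns. As a point of reference, here is a minimal sketch of such a table, assuming PostgreSQL with the pgvector extension and a vector column size that matches the dimension: 768 set in the model configuration; adapt the names and types to your own schema.

```sql
-- Minimal sketch of a table the example tools could target.
-- Assumes the pgvector extension; VECTOR(768) matches dimension: 768 below.
CREATE EXTENSION IF NOT EXISTS vector;

CREATE TABLE documents (
    id        BIGSERIAL PRIMARY KEY,
    content   TEXT NOT NULL,
    embedding VECTOR(768)
);
```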
When building tools for vector ingestion, you often need the same input string twice: once as the raw text to store, and once as the text to embed. Requesting an Agent (LLM) to output the exact same string twice is inefficient and error-prone. The valueFromParam field solves this by allowing a parameter to inherit its value from another parameter in the same tool, as shown in the insert_embedding tool below.
The following configuration defines an embedding model and applies it to specific tool parameters.
Tip: Use environment variable replacement with the format ${ENV_NAME} instead of hardcoding your API keys into the configuration file.
Define an embedding model in the embeddingModels section:
```yaml
embeddingModels:
  gemini-model: # Name of the embedding model
    kind: gemini
    model: gemini-embedding-001
    apiKey: ${GOOGLE_API_KEY}
    dimension: 768
```
To use the defined embedding model, embed your query parameters with the embeddedBy field. Only string-typed parameters can be embedded:
```yaml
tools:
  # Vector ingestion tool
  insert_embedding:
    kind: postgres-sql
    source: my-pg-instance
    statement: |
      INSERT INTO documents (content, embedding)
      VALUES ($1, $2);
    parameters:
      - name: content
        type: string
        description: The raw text content to be stored in the database.
      - name: vector_string
        type: string
        # This parameter is hidden from the LLM.
        # It automatically copies the value from 'content' and embeds it.
        valueFromParam: content
        embeddedBy: gemini-model

  # Semantic search tool
  search_embedding:
    kind: postgres-sql
    source: my-pg-instance
    statement: |
      SELECT id, content, embedding <-> $1 AS distance
      FROM documents
      ORDER BY distance LIMIT 1
    parameters:
      - name: semantic_search_string
        type: string
        description: The search query that will be converted to a vector.
        embeddedBy: gemini-model # refers to the name of a defined embedding model
```
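In the search_embedding statement, <-> is pgvector's Euclidean (L2) distance operator, so ordering by distance ascending returns the closest match first. For larger tables, an approximate nearest-neighbor index on the embedding column can speed up that query; the statement below is a sketch assuming a pgvector version with HNSW support, and the index name is illustrative.

```sql
-- Optional: approximate nearest-neighbor index for documents.embedding.
-- vector_l2_ops matches the <-> (L2 distance) operator used above.
CREATE INDEX documents_embedding_idx
    ON documents USING hnsw (embedding vector_l2_ops);
```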