lm.embed
The lm.embed tool generates a dense vector embedding for a text string via OpenRouter. Embeddings can be used for semantic search, similarity comparison, or as input to downstream ML models.
Capability Required
tool.invoke:lm.embed
Input Schema
{
  "type": "object",
  "required": ["text"],
  "properties": {
    "text": {
      "type": "string",
      "description": "The text to embed."
    },
    "model": {
      "type": "string",
      "description": "Embedding model ID (optional)."
    }
  }
}
Output Schema
{
  "type": "object",
  "properties": {
    "embedding": {
      "type": "array",
      "items": { "type": "number" },
      "description": "The dense vector embedding."
    },
    "dimensions": {
      "type": "integer",
      "description": "Number of dimensions in the embedding."
    },
    "cost": {
      "type": "number",
      "description": "Approximate USD cost."
    }
  }
}
Examples
let result = agent.invoke_tool("lm.embed", json!({
    "text": "Renewable energy sources include solar and wind power.",
    "model": "openai/text-embedding-3-small"
})).await?;

let embedding: Vec<f64> = serde_json::from_value(result["embedding"].clone())?;
println!("Dimensions: {}", result["dimensions"]); // e.g. 1536
Use Cases
- Semantic search: Embed documents and queries, compute cosine similarity
- Clustering: Group similar observations or memory entries
- RAG (Retrieval-Augmented Generation): Embed documents at ingestion, retrieve at query time
- Deduplication: Find near-duplicate content by embedding distance
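The semantic-search and deduplication cases above both reduce to comparing returned vectors by cosine similarity. A minimal sketch in plain Rust (the `cosine_similarity` helper and the toy vectors are illustrative, not part of the tool API):

```rust
/// Cosine similarity between two embedding vectors of equal length.
/// Returns a value in [-1.0, 1.0]; 1.0 means identical direction.
fn cosine_similarity(a: &[f64], b: &[f64]) -> f64 {
    let dot: f64 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a: f64 = a.iter().map(|x| x * x).sum::<f64>().sqrt();
    let norm_b: f64 = b.iter().map(|x| x * x).sum::<f64>().sqrt();
    if norm_a == 0.0 || norm_b == 0.0 {
        0.0
    } else {
        dot / (norm_a * norm_b)
    }
}

fn main() {
    // Toy vectors standing in for real lm.embed outputs.
    let query = vec![0.1, 0.7, 0.2];
    let doc = vec![0.2, 0.6, 0.1];
    println!("similarity = {:.4}", cosine_similarity(&query, &doc));
}
```

For semantic search, embed the query once, compute similarity against each stored document embedding, and rank descending; for deduplication, flag pairs whose similarity exceeds a chosen threshold.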
Cost
Estimated cost: 0.1. Actual cost varies with the embedding model chosen and the length of the input text.
Network Policy
Requires spec.network.policy: full or allowlist with openrouter.ai:443.
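Assuming the agent spec is written in YAML and that the allowlist is expressed as a list of host:port entries under spec.network (the `allowlist` key name below is an assumption based on the policy value above), a matching configuration fragment might look like:

```yaml
spec:
  network:
    policy: allowlist       # or "full" to permit all egress
    allowlist:
      - openrouter.ai:443   # required for lm.embed API calls
```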