Auto-detects and uses the best available embedding provider, trying each in the following priority order:
1. Ollama (free, local, fast) - if the Ollama server is running
2. sentence-transformers (local Python) - if a Python environment is set up
3. OpenAI API - if OPENAI_API_KEY is set
4. Gemini API - if GEMINI_API_KEY is set
Usage
get_best_embeddings(texts, provider = "auto", model = NULL, verbose = TRUE)
Arguments
- texts
Character vector of texts to embed
- provider
Character string: "auto" (default), "ollama", "sentence-transformers",
"openai", or "gemini". Use "auto" for automatic detection.
- model
Character string specifying the embedding model. If NULL, uses default
model for the selected provider.
- verbose
Logical; whether to print progress messages (default: TRUE)
Value
Numeric matrix of embeddings, with one row per input text and one column per embedding dimension (the number of dimensions depends on the model)
See also
Other ai:
call_gemini_chat(),
call_llm_api(),
call_ollama(),
call_openai_chat(),
check_ollama(),
generate_topic_content(),
get_api_embeddings(),
get_content_type_prompt(),
get_content_type_user_template(),
get_recommended_ollama_model(),
list_ollama_models(),
run_rag_search()
Examples
if (FALSE) { # \dontrun{
data(SpecialEduTech)
texts <- SpecialEduTech$abstract[1:5]
# Auto-detect best available provider
embeddings <- get_best_embeddings(texts)
# Force specific provider
embeddings <- get_best_embeddings(texts, provider = "ollama")
dim(embeddings)
} # }
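The `model` and `verbose` arguments can also be set explicitly. A brief sketch, assuming Ollama is running locally; the model name `nomic-embed-text` is illustrative and not necessarily the provider default:

```r
if (FALSE) { # \dontrun{
data(SpecialEduTech)
texts <- SpecialEduTech$abstract[1:5]

# Request a specific Ollama model and suppress progress messages
embeddings <- get_best_embeddings(
  texts,
  provider = "ollama",
  model = "nomic-embed-text",  # illustrative model name
  verbose = FALSE
)

# One row per input text; column count depends on the model
nrow(embeddings) == length(texts)
} # }
```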