Generates text embeddings using Ollama (local), OpenAI, or Gemini embedding APIs.
Usage
get_api_embeddings(
texts,
provider = c("ollama", "openai", "gemini"),
model = NULL,
api_key = NULL,
batch_size = 100
)
Arguments
- texts
Character vector of texts to embed
- provider
Character string: "ollama" (local API, free), "openai", or "gemini"
- model
Character string specifying the embedding model. Defaults:
ollama: "nomic-embed-text"
openai: "text-embedding-3-small"
gemini: "text-embedding-004"
- api_key
Character string giving the API key for the selected provider (not required for Ollama)
- batch_size
Integer; number of texts to embed per API call (default: 100). See the batching sketch after this list.
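As an illustration of how batch_size determines the number of API calls (a sketch of the batching pattern only, not the package's internal code), the input texts can be thought of as being split into chunks of at most batch_size before each request:
# Illustrative batching sketch: split 250 texts into chunks of at most 100
texts      <- paste("document", 1:250)
batch_size <- 100
batches    <- split(texts, ceiling(seq_along(texts) / batch_size))
lengths(batches)  # 100 100 50 -> three API calls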
Value
A numeric matrix of embeddings, with one row per input text and one column per embedding dimension
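Since each row of the returned matrix is the embedding of one input text, rows can be compared directly; a minimal sketch of cosine similarity between two texts (using a small dummy matrix in place of real API output):
# Dummy 2 x 5 embedding matrix standing in for get_api_embeddings() output
embeddings <- matrix(rnorm(10), nrow = 2)
# Cosine similarity between the first and second text
sum(embeddings[1, ] * embeddings[2, ]) /
  (sqrt(sum(embeddings[1, ]^2)) * sqrt(sum(embeddings[2, ]^2)))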
See also
Other ai:
call_gemini_chat(),
call_llm_api(),
call_ollama(),
call_openai_chat(),
check_ollama(),
generate_topic_content(),
get_best_embeddings(),
get_content_type_prompt(),
get_content_type_user_template(),
get_recommended_ollama_model(),
list_ollama_models(),
run_rag_search()
Examples
if (FALSE) { # \dontrun{
data(SpecialEduTech)
texts <- SpecialEduTech$abstract[1:5]
# Using local Ollama API (free, no API key required)
embeddings <- get_api_embeddings(texts, provider = "ollama")
# Using OpenAI API (requires an API key)
embeddings <- get_api_embeddings(texts, provider = "openai")
dim(embeddings)  # rows = texts, columns = embedding dimensions
} # }
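For the hosted providers, the model and API key can also be passed explicitly; a hedged sketch using the documented Gemini default model (the key below is a placeholder):
if (FALSE) { # \dontrun{
# Using Gemini with an explicit model and API key (placeholder key)
embeddings <- get_api_embeddings(
  texts,
  provider = "gemini",
  model = "text-embedding-004",
  api_key = "your-gemini-api-key"
)
} # }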