Returns a recommended Ollama model based on which of the preferred models are available locally.
Usage
get_recommended_ollama_model(
  preferred_models = c("phi3:mini", "llama3.1:8b", "mistral:7b", "tinyllama"),
  verbose = FALSE
)
Arguments
- preferred_models
Character vector of preferred models in priority order.
- verbose
Logical; if TRUE, status messages are printed.
Value
Character string naming the recommended model, or NULL if none of the preferred models are available.
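Details
The recommendation can be thought of as walking preferred_models in order and returning the first entry that is installed locally. The sketch below illustrates that idea only; it assumes list_ollama_models() returns a character vector of installed model names, and the package internals may differ.

pick_first_available <- function(preferred_models) {
  installed <- list_ollama_models()  # assumed: character vector of installed model names
  hits <- preferred_models[preferred_models %in% installed]
  if (length(hits) > 0) hits[1] else NULL
}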
See also
Other ai:
call_gemini_chat(),
call_llm_api(),
call_ollama(),
call_openai_chat(),
check_ollama(),
generate_topic_content(),
get_api_embeddings(),
get_best_embeddings(),
get_content_type_prompt(),
get_content_type_user_template(),
list_ollama_models(),
run_rag_search()
Examples
if (FALSE) { # \dontrun{
model <- get_recommended_ollama_model()
print(model)
} # }
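A fuller usage pattern, passing a custom priority order via the documented preferred_models argument and handling the NULL case (not run, since it requires a local Ollama installation):
if (FALSE) { # \dontrun{
model <- get_recommended_ollama_model(
  preferred_models = c("llama3.1:8b", "phi3:mini"),
  verbose = TRUE
)
if (is.null(model)) {
  message("No preferred Ollama model is installed.")
} else {
  print(model)
}
} # }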