Ollama

LiteLLM supports all models served by Ollama.

Ollama Models

Ollama's supported models are listed at: https://github.com/jmorganca/ollama

| Model Name | Function Call | Required OS Variables |
|---|---|---|
| Llama2 7B | `completion(model='llama2', messages, api_base="http://localhost:11434", custom_llm_provider="ollama", stream=True)` | No API Key required |
| Llama2 13B | `completion(model='llama2:13b', messages, api_base="http://localhost:11434", custom_llm_provider="ollama", stream=True)` | No API Key required |
| Llama2 70B | `completion(model='llama2:70b', messages, api_base="http://localhost:11434", custom_llm_provider="ollama", stream=True)` | No API Key required |
| Llama2 Uncensored | `completion(model='llama2-uncensored', messages, api_base="http://localhost:11434", custom_llm_provider="ollama", stream=True)` | No API Key required |
| Orca Mini | `completion(model='orca-mini', messages, api_base="http://localhost:11434", custom_llm_provider="ollama", stream=True)` | No API Key required |
| Vicuna | `completion(model='vicuna', messages, api_base="http://localhost:11434", custom_llm_provider="ollama", stream=True)` | No API Key required |
| Nous-Hermes | `completion(model='nous-hermes', messages, api_base="http://localhost:11434", custom_llm_provider="ollama", stream=True)` | No API Key required |
| Nous-Hermes 13B | `completion(model='nous-hermes:13b', messages, api_base="http://localhost:11434", custom_llm_provider="ollama", stream=True)` | No API Key required |
| Wizard Vicuna Uncensored | `completion(model='wizard-vicuna', messages, api_base="http://localhost:11434", custom_llm_provider="ollama", stream=True)` | No API Key required |
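The calls in the table all share the same shape: a model name, an OpenAI-style `messages` list, the local Ollama server's `api_base`, and `stream=True`. A minimal sketch of how such a call is assembled is below; the `litellm` install and a running Ollama server (default port 11434, with the model already pulled via `ollama pull llama2`) are assumptions, so the actual `completion(...)` invocation is shown commented out.

```python
# Minimal sketch of an Ollama call via LiteLLM's completion(), matching the
# table above. Assumes: `pip install litellm`, Ollama serving on localhost:11434,
# and `ollama pull llama2` already run.

# OpenAI-style chat messages.
messages = [{"role": "user", "content": "Why is the sky blue?"}]

# Keyword arguments exactly as in the table's function calls.
kwargs = {
    "model": "llama2",                       # or "llama2:13b", "orca-mini", ...
    "messages": messages,
    "api_base": "http://localhost:11434",    # local Ollama server
    "custom_llm_provider": "ollama",         # route the request to Ollama
    "stream": True,                          # yield chunks as they arrive
}

# With a live server, the call would be:
# from litellm import completion
# response = completion(**kwargs)
# for chunk in response:   # stream=True returns an iterator of chunks
#     print(chunk)
```

Because `stream=True`, the response is consumed incrementally in a loop rather than read as a single object; drop `stream=True` to get one complete response instead.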