# Ollama

## Installation
You will need to download and install Ollama separately: https://ollama.com/download
Use the `ollama` CLI to pull the models you would like to use. For example:
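The model names below match the ones used in the configuration example that follows; pull whichever models you actually plan to register:

```shell
# Pull the chat models and the embedding model referenced below
ollama pull gemma3
ollama pull mistral-nemo
ollama pull nomic-embed-text
```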
## Configuration
```python
from genkit.ai import Genkit
from genkit.plugins.ollama import Ollama, ModelDefinition, EmbeddingModelDefinition

ai = Genkit(
    plugins=[
        Ollama(
            models=[
                ModelDefinition(name='gemma3'),
                ModelDefinition(name='mistral-nemo'),
            ],
            embedders=[
                EmbeddingModelDefinition(
                    name='nomic-embed-text',
                    dimensions=512,
                )
            ],
        )
    ],
)
```
Then use Ollama models and embedders by specifying the `ollama/` prefix:
```python
generate_response = await ai.generate(
    prompt='...',
    model='ollama/gemma3',
)

embedding_response = await ai.embed(
    embedder='ollama/nomic-embed-text',
    documents=[Document.from_text('...')],
)
```
## API Reference
Ollama Plugin for Genkit.
Ollama
Bases: genkit.ai.Plugin
Ollama plugin for Genkit.
Source code in plugins/ollama/src/genkit/plugins/ollama/plugin_api.py
__init__(models=None, embedders=None, server_address=None, request_headers=None)
Initialize the Ollama plugin.
Source code in plugins/ollama/src/genkit/plugins/ollama/plugin_api.py
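If your Ollama server is not running at the default address, the `server_address` parameter from the constructor signature above lets you point the plugin elsewhere. A minimal sketch, using Ollama's default address (`http://localhost:11434`) purely for illustration:

```python
from genkit.ai import Genkit
from genkit.plugins.ollama import Ollama, ModelDefinition

# Register the plugin against an explicit server address; the value
# shown is Ollama's default and is used here only as an example.
ai = Genkit(
    plugins=[
        Ollama(
            models=[ModelDefinition(name='gemma3')],
            server_address='http://localhost:11434',
        )
    ],
)
```

The `request_headers` parameter can likewise be passed here if your server sits behind a proxy that requires extra headers.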
initialize(ai)
Initialize the Ollama plugin.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `ai` | `genkit.ai.GenkitRegistry` | The AI registry to initialize the plugin with. | *required* |
Source code in plugins/ollama/src/genkit/plugins/ollama/plugin_api.py
ollama_name(name)
Get the name of the Ollama model.
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `name` | `str` | The name of the Ollama model. | *required* |

Returns:

| Type | Description |
|---|---|
| `str` | The name of the Ollama model. |