Description
Ollama makes it easy to get up and running with large language models locally.

Examples
Java dependency:
<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>ollama</artifactId>
    <version>1.20.0</version>
    <scope>test</scope>
</dependency>
Usage:
var ollama = new OllamaContainer("ollama/ollama:0.1.26");
ollama.start();
// Pull the all-minilm embedding model inside the running container
ollama.execInContainer("ollama", "pull", "all-minilm");
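Once started, the container exposes the standard Ollama HTTP API on a mapped port. A minimal sketch of how that endpoint might be queried, assuming the module's getEndpoint() helper and Ollama's /api/tags route for listing local models:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// getEndpoint() returns http://<host>:<mapped-port> for the container's Ollama API
String endpoint = ollama.getEndpoint();

// List the models available inside the container (should include all-minilm)
HttpRequest request = HttpRequest.newBuilder(URI.create(endpoint + "/api/tags")).GET().build();
HttpResponse<String> response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString());
System.out.println(response.body());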
Go dependency:
go get github.com/testcontainers/testcontainers-go/modules/ollama
Usage:
ollamaContainer, err := ollama.Run(ctx, "ollama/ollama:0.1.26")
if err != nil {
    log.Fatalf("failed to start container: %s", err)
}
// Pull the all-minilm embedding model inside the running container
_, _, err = ollamaContainer.Exec(ctx, []string{"ollama", "pull", "all-minilm"})
if err != nil {
    log.Fatalf("failed to pull model: %s", err)
}
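The Go module can also hand back the container's HTTP endpoint. A minimal sketch, assuming the module's ConnectionString helper returns an http URL and using the standard net/http package:

// ConnectionString returns http://<host>:<mapped-port> for the container's Ollama API
endpoint, err := ollamaContainer.ConnectionString(ctx)
if err != nil {
    log.Fatalf("failed to get connection string: %s", err)
}

// List the models available inside the container (should include all-minilm)
resp, err := http.Get(endpoint + "/api/tags")
if err != nil {
    log.Fatalf("failed to query Ollama API: %s", err)
}
defer resp.Body.Close()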
JavaScript dependency:
npm install @testcontainers/ollama --save-dev
Usage:
import { OllamaContainer } from "@testcontainers/ollama";

const container = await new OllamaContainer().start();
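As in the other languages, the model can be pulled into the running container and the API queried over HTTP. A minimal sketch, assuming the started container's exec and getEndpoint helpers and Node 18+ for the built-in fetch:

// Pull the all-minilm embedding model inside the running container
await container.exec(["ollama", "pull", "all-minilm"]);

// getEndpoint() returns http://<host>:<mapped-port> for the container's Ollama API
const response = await fetch(`${container.getEndpoint()}/api/tags`);
console.log(await response.json());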
Python dependency:
pip install testcontainers[ollama]
Usage:
from testcontainers.ollama import OllamaContainer

with OllamaContainer() as ollama:
    endpoint = ollama.get_endpoint()
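The same model pull works from Python through the container's generic exec helper, and the endpoint can then be queried with a plain HTTP call. A minimal sketch, assuming the requests library is available:

import requests
from testcontainers.ollama import OllamaContainer

with OllamaContainer() as ollama:
    # Pull the all-minilm embedding model inside the running container
    ollama.exec(["ollama", "pull", "all-minilm"])

    # List the models available inside the container (should include all-minilm)
    models = requests.get(f"{ollama.get_endpoint()}/api/tags").json()
    print(models)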