Run a model

ollama run gemma3

Launch integrations

ollama launch
Configure and launch external applications to use Ollama models. This provides an interactive way to set up and start integrations with supported apps.

Supported integrations

  • OpenCode - Open-source coding assistant
  • Claude Code - Anthropic’s agentic coding tool
  • Codex - OpenAI’s coding assistant
  • Droid - Factory’s AI coding agent

Examples

Launch an integration interactively:
ollama launch
Launch a specific integration:
ollama launch claude
Launch with a specific model:
ollama launch claude --model qwen3-coder
Configure without launching:
ollama launch droid --config

Multiline input

For multiline input, you can wrap text with """:
>>> """Hello,
... world!
... """
I'm a basic program that prints the famous "Hello, world!" message to the console.

Multimodal models

ollama run gemma3 "What's in this image? /Users/jmorgan/Desktop/smile.png"

Generate embeddings

ollama run embeddinggemma "Hello world"
Text can also be piped in via stdin:
echo "Hello world" | ollama run nomic-embed-text
The output is a JSON array of floating-point values.

Download a model

ollama pull gemma3

Remove a model

ollama rm gemma3

List models

ollama ls

Sign in to Ollama

ollama signin

Sign out of Ollama

ollama signout

Create a customized model

First, create a Modelfile:
FROM gemma3
SYSTEM """You are a happy cat."""
Then run ollama create, giving the new model a name:
ollama create happy-cat -f Modelfile
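The customized model can then be run like any other model (happy-cat is the name chosen above):
ollama run happy-cat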

List running models

ollama ps

Stop a running model

ollama stop gemma3

Start Ollama

ollama serve
To view a list of environment variables that can be set, run ollama serve --help.
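For example, to make the server listen on all network interfaces rather than only localhost, set OLLAMA_HOST before starting it (shown here with the default port, 11434):
OLLAMA_HOST=0.0.0.0:11434 ollama serve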