
@iam-veeramalla
Last active March 2, 2026 20:30
Claude Code integration with Ollama to use local models

Run Claude with the power of Local LLMs using Ollama

Install Ollama

curl -fsSL https://ollama.com/install.sh | sh
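If the script finishes but the command is not found, a quick sanity check (a sketch; it only tests whether the binary landed on your PATH):

```shell
# Confirm the ollama binary is on PATH before moving on
if command -v ollama >/dev/null 2>&1; then
  echo "ollama binary: ok"
else
  echo "ollama binary: missing (re-run the install script)"
fi
```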

Pull the Model

ollama pull glm-4.7-flash # or gpt-oss:20b (for better performance)
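Once the pull completes, `ollama list` should show the model tag. A minimal check, assuming the model names from the gist above:

```shell
# Verify the pulled model shows up in the local model list
MODEL="glm-4.7-flash"   # or gpt-oss:20b
if ollama list 2>/dev/null | grep -q "$MODEL"; then
  echo "$MODEL is available locally"
else
  echo "$MODEL not found; run: ollama pull $MODEL"
fi
```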

Install Claude

curl -fsSL https://claude.ai/install.sh | bash
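The installer puts a `claude` binary on your PATH; a quick check (a sketch — if it is missing, your shell may simply need the installer's bin directory added to PATH):

```shell
# Confirm the Claude Code CLI is reachable from the shell
if command -v claude >/dev/null 2>&1; then
  claude --version
else
  echo "claude not found on PATH"
fi
```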

Run Claude with Ollama

ollama launch claude --model glm-4.7-flash # or ollama launch claude --model gpt-oss:20b
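If Ollama reports `unknown command "launch"` (as in the comments below), the installed release likely predates the `launch` subcommand. A hedged check-and-upgrade sketch — re-running the official install script replaces the binary in place:

```shell
# Probe for the launch subcommand; if absent, upgrade Ollama in place
if ollama launch --help >/dev/null 2>&1; then
  echo "launch subcommand available"
else
  echo "launch subcommand missing; upgrade: curl -fsSL https://ollama.com/install.sh | sh"
fi
```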

@NathanLewis

I tried it, but Claude can't see the local filesystem, not even the files in the directory I run it from.

@eshwarvijay

eshwar: ollama launch claude --model qwen3-coder-next:latest
Error: unknown command "launch" for "ollama"

@JenilSavalia

eshwar: ollama launch claude --model qwen3-coder-next:latest Error: unknown command "launch" for "ollama"

same error
