
@iam-veeramalla
Last active March 2, 2026 20:30
Claude Code integration with Ollama to use local models

Run Claude with the power of Local LLMs using Ollama

Install Ollama

curl -fsSL https://ollama.com/install.sh | sh
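Before moving on, it can save time to confirm the installer actually put the binary on your PATH. A minimal sanity check (not part of the gist, just the standard `command -v` / `ollama --version` checks):

```shell
# Guarded sanity check: confirm the ollama binary is reachable and
# report its version; a no-op warning otherwise.
if command -v ollama >/dev/null 2>&1; then
  ollama --version
else
  echo "ollama not found on PATH; re-open your shell or re-run the installer"
fi
```

If the binary is missing right after installing, opening a fresh terminal session usually picks up the updated PATH.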

Pull the Model

ollama pull glm-4.7-flash # or gpt-oss:20b (for better performance)
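After pulling, you can verify the model is available locally with `ollama list` before wiring it into Claude. A hedged sketch (the model name `glm-4.7-flash` comes from the step above; the check is a no-op on machines without Ollama):

```shell
# Guarded check: list downloaded models and confirm the one we just pulled
# is present; otherwise suggest re-running the pull.
if command -v ollama >/dev/null 2>&1; then
  if ollama list | grep -qi "glm-4.7-flash"; then
    echo "model is ready"
  else
    echo "model missing; run: ollama pull glm-4.7-flash"
  fi
fi
```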

Install Claude

curl -fsSL https://claude.ai/install.sh | bash

Run Claude with Ollama

ollama launch claude --model glm-4.7-flash # or ollama launch claude --model gpt-oss:20b
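Several commenters in the thread below hit `Error: unknown command "launch" for "ollama"`. A plausible cause (an assumption, not confirmed by the gist) is an Ollama build that predates the `launch` subcommand. A guarded troubleshooting sketch:

```shell
# Hedged troubleshooting: confirm the installed version, and if "launch"
# is not listed in the CLI help, suggest upgrading via the official installer.
if command -v ollama >/dev/null 2>&1; then
  ollama --version
  if ! ollama --help 2>&1 | grep -q "launch"; then
    echo "no 'launch' subcommand in this build; upgrade with:"
    echo "  curl -fsSL https://ollama.com/install.sh | sh"
  fi
else
  echo "ollama is not on PATH"
fi
```

Re-running the official install script replaces the binary in place, so upgrading this way does not disturb models you have already pulled.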

@chinthakindi-saikumar

Here are the clear steps @vinoth-6.

1. Install Ollama
Open CMD/terminal and run the command below:
curl -fsSL https://ollama.com/install.sh | sh

2. Pull the Model
Install a model based on your system configuration using the commands below:
ollama pull glm-4.7-flash # or gpt-oss:20b (for better performance), or ollama pull gemma:2b
Optional: ollama run gemma:2b to work with it locally.

3. Install Claude
Install Claude using the command for your platform:
macOS, Linux, WSL: curl -fsSL https://claude.ai/install.sh | bash
Windows CMD: curl -fsSL https://claude.ai/install.cmd -o install.cmd && install.cmd && del install.cmd

4. Run Claude with Ollama
Launch Claude using the command below:
ollama launch claude --model glm-4.7-flash # or ollama launch claude --model gpt-oss:20b

@Nestleai

I've followed this guide and everything works except Claude: on the first test command ("write a Python function to reverse a string") it has been hanging for 10+ minutes. Bear in mind I'm using the qwen2.5-coder:7b model. Could this mean the model isn't compatible with my hardware?

@NathanLewis

I tried it, but Claude can't see the local filesystem, not even the files in the directory I run it from.

@eshwarvijay

eshwar: ollama launch claude --model qwen3-coder-next:latest
Error: unknown command "launch" for "ollama"

@JenilSavalia

> eshwar: ollama launch claude --model qwen3-coder-next:latest
> Error: unknown command "launch" for "ollama"

Same error here.
