By Stefano Amorelli (stefanoamorelli)

Running Claude Code with local LLMs

Important

This is experimental: it works on my machine but may need adjustments for your environment. You'll need a GPU with at least 8 GB of VRAM and roughly 5 GB of free disk space. Local models are slower and less capable than Anthropic's hosted models.

Use Claude Code's tooling with a local model instead of Anthropic's API.
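As a sketch, one common way to do this is to serve a model locally and point Claude Code at a local Anthropic-compatible endpoint via its `ANTHROPIC_BASE_URL` environment variable. The model name, port, and proxy choice below (LiteLLM in front of Ollama) are assumptions for illustration; adjust them for your setup.

```shell
# Assumes Ollama and a proxy speaking the Anthropic Messages API
# (e.g. LiteLLM) are installed; model name and port are examples.

# 1. Pull and serve a local model with Ollama
ollama pull qwen2.5-coder:7b
ollama serve &

# 2. Start a proxy that translates Anthropic-style requests
#    to the local Ollama endpoint (flags may differ per version)
litellm --model ollama/qwen2.5-coder:7b --port 4000 &

# 3. Point Claude Code at the local proxy instead of Anthropic's API
export ANTHROPIC_BASE_URL="http://localhost:4000"
export ANTHROPIC_AUTH_TOKEN="dummy"  # the local proxy ignores the key
claude
```

Because Claude Code reads `ANTHROPIC_BASE_URL` from the environment, no change to Claude Code itself is needed; the proxy does all the translation.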

How it works

Normal Claude Code: