How to configure kimi-cli to use Kimi 2.5 via OpenRouter

(1) Install kimi-cli:
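The install command itself isn't captured in this copy of the gist. kimi-cli is distributed as a uv tool, so something like the following should work — the exact Python version pin is an assumption, check the kimi-cli README for the current one:

```shell
# Install kimi-cli as a uv tool (assumes uv is already installed;
# the --python pin follows the kimi-cli README and may change)
uv tool install --python 3.13 kimi-cli
```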

(2) Edit or replace ~/.kimi/config.toml with the config.toml file provided below.

  • type needs to be openai_legacy, not openai
  • base_url must include the /api/v1 path
  • model must be moonshotai/kimi-k2.5 (note the additional k)

(3) Get an API key from OpenRouter and set it as the api_key value in the config.
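To confirm the key works before wiring it into kimi-cli, you can call OpenRouter's OpenAI-compatible chat completions endpoint directly. The OPENROUTER_API_KEY variable name is just a convention for this sketch:

```shell
# Sanity-check the OpenRouter key with a one-off request
# (assumes the key is exported as OPENROUTER_API_KEY)
curl -s https://openrouter.ai/api/v1/chat/completions \
  -H "Authorization: Bearer $OPENROUTER_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "moonshotai/kimi-k2.5", "messages": [{"role": "user", "content": "ping"}]}'
```

A JSON response with a choices array means the key and model slug are good; a 401 means the key is wrong.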

(4) Test that it works

  • Run command kimi to enter the shell.
  • If the shell exits immediately instead of starting, something is wrong with the config settings.
  • Inside the kimi-cli shell, do not run /login.
  • Instead, type any prompt.
  • If you see the error Authorization failed, please check your login status, something is wrong with the config settings.
  • Otherwise it should answer your question, and it's working!
# ~/.kimi/config.toml
default_model = "kimi_2_5_via_openrouter"
default_thinking = false

[providers.openrouter]
type = "openai_legacy"
base_url = "https://openrouter.ai/api/v1"
api_key = "sk-or-v1-[redacted]"

[models.kimi_2_5_via_openrouter]
provider = "openrouter"
model = "moonshotai/kimi-k2.5"
max_context_size = 160000

[loop_control]
max_steps_per_turn = 100
max_retries_per_step = 3
max_ralph_iterations = 0
reserved_context_size = 50000

[services]

[mcp.client]
tool_call_timeout_ms = 60000