Global Just Recipes for Quick AI in the Terminal

The justfile below is my global ~/.justfile. I also have abbr -a j 'just' in my Fish configuration.
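For reference, the Fish side is just one abbreviation (a minimal sketch, assuming the default config path of ~/.config/fish/config.fish); since just also searches parent directories for a justfile, a .justfile in your home directory acts as a fallback anywhere under it that doesn't have its own.

```fish
# ~/.config/fish/config.fish
# Expand `j` to `just` on the command line.
abbr -a j 'just'
```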

The original version streamed the output. This version sacrifices streaming for pretty output formatting with charmbracelet/glow. It also pipes the output Markdown through prettier/prettier to clean up any wonky formatting; Claude in particular likes to skip the blank line after headings.

This means that charmbracelet/glow and prettier/prettier are dependencies, alongside any AI backends like github/copilot-cli or anthropics/claude-code.
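If you don't already have the formatting tools, something like this pulls them in (a sketch assuming Homebrew; use whatever package manager you prefer):

```fish
# Install the Markdown formatter and the terminal renderer.
brew install prettier glow
```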

The Gemini and Codex CLI backends are not yet implemented.

Recipes are provided for Claude Haiku (default for the ai alias) and GPT-5 Mini via Copilot (consumes 0 premium requests).
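Both can also be invoked directly, bypassing the ai alias; the prompts here are just examples:

```fish
# Claude Haiku (the `ai` default) and GPT-5 Mini via Copilot, called explicitly.
j ask-claude-haiku summarise the key flags of tar
j ask-copilot-gpt-5-mini write a .gitignore for a Python project
```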

flowchart LR
 subgraph s1["Claude Code"]
        ask-claude["ask-claude"]
        ask-claude-haiku["ask-claude-haiku"]
        n1["ask-claude-sonnet"]
        n2["ask-claude-opus"]
  end
 subgraph s2["Copilot CLI"]
        ask-copilot["ask-copilot"]
        ask-copilot-gpt-5-mini["ask-copilot-gpt-5-mini"]
  end
 subgraph s3["Gemini CLI"]
        ask-gemini["ask-gemini"]
        n3["ask-gemini-pro"]
        n4["ask-gemini-flash"]
  end
 subgraph s4["Codex CLI"]
        ask-codex["ask-codex"]
        n5["ask-codex-standard"]
        n6["ask-codex-max"]
  end
    just["just"] --> ask["ask"]
    ask --> ask-claude & ask-copilot & ask-gemini & ask-codex
    ask-claude --> ask-claude-haiku & n1 & n2
    ask-copilot --> ask-copilot-gpt-5-mini
    ask-gemini --> n3 & n4
    ask-codex --> n5 & n6

    style n1 color:#616161,stroke-width:1px,stroke-dasharray: 1
    style n2 color:#616161,stroke-width:1px,stroke-dasharray: 1
    style ask-gemini stroke-width:1px,stroke-dasharray: 1,color:#616161
    style n3 stroke-width:1px,stroke-dasharray: 1,color:#616161
    style n4 stroke-width:1px,stroke-dasharray: 1,color:#616161
    style ask-codex stroke-width:1px,stroke-dasharray: 1,color:#616161
    style n5 stroke-width:1px,stroke-dasharray: 1,color:#616161
    style n6 stroke-width:1px,stroke-dasharray: 1,color:#616161

For example:

$ j ai show me how to extract the audio from myvideo.avi into myaudio.mp3 using ffmpeg

I alias ai to ask-claude-haiku; it's a great default for most simple tasks, fast and smart enough. You can easily change this to ask-copilot-gpt-5-mini if you want a free option.
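Switching the default is a one-line change at the bottom of the justfile (a sketch of the edited line, using the alias already defined there):

```just
alias ai := ask-copilot-gpt-5-mini
```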


Happy Commandlining!

🫶 Eleanor (@intellectronica) and Dave (@daveio)

set shell := ["fish", "-c"]

# Dispatcher: run the chosen backend recipe, then pretty-print the Markdown it returns.
ask backend model +prompt:
    @cd {{invocation_directory()}}; just ask-{{backend}} {{model}} "{{prompt}}" | prettier --parser=markdown | glow

# Backend recipes. Gemini and Codex are placeholders for now.
ask-copilot model +prompt:
    @cd {{invocation_directory()}}; copilot --model {{model}} --silent --allow-all-paths --allow-all-tools --stream on -p "{{prompt}}\\n\\nOutput Markdown." 2>/dev/null

ask-claude model +prompt:
    @cd {{invocation_directory()}}; claude --model {{model}} --permission-mode bypassPermissions --dangerously-skip-permissions -p "{{prompt}}\\n\\nOutput Markdown." 2>/dev/null

ask-gemini model +prompt:
    @cd {{invocation_directory()}}; echo NOT YET IMPLEMENTED

ask-codex model +prompt:
    @cd {{invocation_directory()}}; echo NOT YET IMPLEMENTED

# Convenience recipes with the backend and model baked in.
ask-copilot-gpt-5-mini +prompt:
    @cd {{invocation_directory()}} && just ask copilot gpt-5-mini "{{prompt}}"

ask-claude-haiku +prompt:
    @cd {{invocation_directory()}} && just ask claude haiku "{{prompt}}"

# Default entry point: `just ai <prompt>` (or `j ai <prompt>` with the Fish abbreviation).
alias ai := ask-claude-haiku
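To make the indirection explicit, here is roughly how the earlier example expands through the recipes above (illustrative, not literal just output):

```fish
# j ai <prompt>
#   -> just ask-claude-haiku <prompt>            # via the `ai` alias
#   -> just ask claude haiku "<prompt>"          # convenience recipe fills in backend and model
#   -> just ask-claude haiku "<prompt>" | prettier --parser=markdown | glow
#   -> claude --model haiku ... -p "<prompt>\n\nOutput Markdown." 2>/dev/null
```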