Mux supports multiple AI providers. Configure provider API keys in Settings → Providers — see Providers for setup.

First-class Models

Mux ships with curated models kept up to date with the frontier. Use any custom model with /model <provider:model_id>.
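For example, to point at a model that isn't in the first-class list below (the ID here is a hypothetical placeholder; substitute the model ID your provider actually exposes):
/model openai:my-fine-tuned-model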
| Model | ID | Aliases | Default |
| --- | --- | --- | --- |
| Opus 4.6 | anthropic:claude-opus-4-6 | opus | |
| Sonnet 4.5 | anthropic:claude-sonnet-4-5 | sonnet | |
| Haiku 4.5 | anthropic:claude-haiku-4-5 | haiku | |
| GPT-5.2 | openai:gpt-5.2 | gpt | |
| GPT-5.2 Pro | openai:gpt-5.2-pro | gpt-pro | |
| GPT-5.2 Codex | openai:gpt-5.2-codex | codex | |
| GPT-5.3 Codex | openai:gpt-5.3-codex | codex-5.3 | |
| GPT-5.1 Codex | openai:gpt-5.1-codex | codex-5.1 | |
| GPT-5.1 Codex Mini | openai:gpt-5.1-codex-mini | codex-mini | |
| GPT-5.1 Codex Max | openai:gpt-5.1-codex-max | codex-max | |
| Gemini 3 Pro Preview | google:gemini-3-pro-preview | gemini, gemini-3, gemini-3-pro | |
| Gemini 3 Flash Preview | google:gemini-3-flash-preview | gemini-3-flash | |
| Grok 4.1 Fast | xai:grok-4-1-fast | grok, grok-4, grok-4.1, grok-4-1 | |
| Grok Code Fast 1 | xai:grok-code-fast-1 | grok-code | |

Model Selection

Keyboard shortcuts:
  • Cycle models
    • macOS: Cmd+/
    • Windows/Linux: Ctrl+/
To choose a specific model, click the model pill in the chat footer. Alternatively, use the Command Palette (Cmd+Shift+P / Ctrl+Shift+P):
  1. Type “model”
  2. Select “Change Model”
  3. Choose from available models
Models are specified in the format: provider:model-name
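For example, the first-class IDs above follow this pattern:
anthropic:claude-opus-4-6
openai:gpt-5.2-codex
google:gemini-3-pro-preview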

One-shot Overrides

Override the model or thinking level for a single message using slash commands. The override applies only to that message — workspace settings stay unchanged.

Syntax

| Command | Effect |
| --- | --- |
| /sonnet explain this code | Use Sonnet for one message |
| /opus+high deep review | Use Opus with high thinking |
| /haiku+0 quick answer | Use Haiku at its lowest thinking level |
| /+2 analyze this | Keep current model, set thinking level 2 |

Thinking levels

Append +level to any model alias. Levels can be named (off, low, medium/med, high, max) or numeric (0–9). Numeric levels are model-relative — they map to the model’s allowed thinking range:
  • 0 = model’s lowest allowed level (e.g., off for Sonnet, medium for GPT-5.2 Pro)
  • Higher numbers select progressively higher levels, clamped to the model’s maximum
This means /sonnet+0 disables thinking, while /gpt-pro+0 sets thinking to medium (GPT-5.2 Pro’s minimum). Use /+level with no model alias to override thinking on the current model, e.g. /+0 quick answer
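As a rough illustration of this mapping, here is a sketch in TypeScript. It is not Mux's actual implementation, and the per-model level lists are assumptions chosen only to make the example concrete:

// Sketch only: resolve a numeric thinking level against a model's allowed range.
type ThinkingLevel = "off" | "low" | "medium" | "high" | "max";

// Assumed per-model ranges, ordered from lowest to highest (illustrative, not Mux data).
const allowedLevels: Record<string, ThinkingLevel[]> = {
  "anthropic:claude-sonnet-4-5": ["off", "low", "medium", "high", "max"],
  "openai:gpt-5.2-pro": ["medium", "high", "max"], // minimum is medium
};

// 0 maps to the model's lowest allowed level; higher numbers step up and
// clamp at the model's maximum.
function resolveThinking(modelId: string, numeric: number): ThinkingLevel {
  const levels = allowedLevels[modelId];
  const index = Math.min(Math.max(numeric, 0), levels.length - 1);
  return levels[index];
}

resolveThinking("anthropic:claude-sonnet-4-5", 0); // "off", like /sonnet+0
resolveThinking("openai:gpt-5.2-pro", 0);          // "medium", like /gpt-pro+0
resolveThinking("openai:gpt-5.2-pro", 9);          // "max", clamped to the model's maximum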

CLI

The mux run CLI accepts the same thinking levels via --thinking:
mux run -t 0 "Quick fix"          # Lowest thinking for the model
mux run -t high "Deep analysis"   # Named level

Next Steps

Configure Providers

Set up API keys for Anthropic, OpenAI, Google, and other providers.