Crush (Charm Crush CLI)

Introduction

Crush (Charm Crush CLI) is a terminal-first, open-source AI coding assistant built with Charm (Charmbracelet) tooling. It brings large language model power into the CLI so developers can query, generate, run, and refactor code without leaving the terminal. Crush emphasizes session persistence, multi-model support, and deep code awareness (via LSP integrations) while keeping a polished TUI (terminal user interface) experience. The project is written in Go and integrates tightly with the Charm ecosystem.


Key Features

Terminal-First AI Assistant: Interact with models directly from the CLI—ask for code, refactors, explanations, or run project tasks inside persistent sessions.
Multi-Model & Provider Support: Swap between OpenAI, Anthropic, and other compatible providers mid-session to compare outputs or manage costs.
Session Persistence & Histories: Multiple named sessions keep context, history, and conversation state so the agent remembers the repo and goals.
LSP (Language Server) Integration: Access real code context and semantics to produce more accurate completions, fixes, and navigation commands.
Extensible Protocols (MCPs): Model Context Protocols (http, stdio, sse) allow integrations with local tools, servers, and external services.
Cross-Platform CLI: Works on macOS, Linux, Windows (PowerShell / WSL), and BSD; built in Go for portability and speed.
Developer Workflows: Tight integration with common tools (git, docker, npm) enables code edits, commits, builds and tests from the same environment.
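
The MCP bullet above glosses over what those protocols actually carry. MCP is built on JSON-RPC 2.0, and over the stdio transport a client and a tool server exchange newline-delimited JSON messages. The sketch below illustrates only that wire shape (the `tools/list` method comes from the MCP specification); it is not Crush's own client code:

```python
import json

def make_request(req_id, method, params=None):
    """Build a JSON-RPC 2.0 request line as an MCP client would send it."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)

def parse_response(line):
    """Decode a JSON-RPC 2.0 response line, raising on protocol errors."""
    msg = json.loads(line)
    if msg.get("jsonrpc") != "2.0":
        raise ValueError("not a JSON-RPC 2.0 message")
    if "error" in msg:
        raise RuntimeError(msg["error"].get("message", "unknown error"))
    return msg["id"], msg.get("result")

# Example: "tools/list", the MCP call a client uses to discover a
# server's available tools.
request_line = make_request(1, "tools/list")

# A server with no tools registered might answer like this:
response_line = '{"jsonrpc": "2.0", "id": 1, "result": {"tools": []}}'
req_id, result = parse_response(response_line)
```

The same message shape travels over the http and sse transports; only the framing differs.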

What It Does

Crush acts as a powerful companion inside developer workflows:

  • Code generation & fixes: Generate functions, tests, or refactor code on demand and apply patches directly to files.
  • Context-aware suggestions: Use project context (via LSP) so the assistant can reason about types, imports, and file structure.
  • Session automation: Store conversational context, run commands, and replay common sequences across sessions.
  • Tool orchestration: Pipe model outputs into local tools or external services through MCPs for richer workflows.

How It Works

  1. Install & configure: Install Crush (binary or from source) and configure model providers (API keys for OpenAI/Anthropic/etc.).
  2. Open a session: Start a named session inside your project; Crush loads context and optionally attaches an LSP.
  3. Ask & apply: Ask Crush to write, explain, or fix code. Review the suggested patch and apply edits directly to the repo.
  4. Integrate tools: Use MCPs to connect to local processes or external services for tests, builds, or data lookups.
  5. Persist & iterate: Sessions save history so you can continue work later or spawn parallel sessions for experimentation.
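
Steps 1 and 4 both come down to configuration. A hypothetical project-level config sketch along the lines of Crush's `crush.json`, wiring up an LSP and two MCP servers, might look like the following; the key names are modeled on the project's documented schema, so verify field names against the current README before relying on them:

```json
{
  "lsp": {
    "go": { "command": "gopls" }
  },
  "mcp": {
    "filesystem": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "."]
    },
    "docs": {
      "type": "http",
      "url": "http://localhost:8080/mcp"
    }
  }
}
```

Provider credentials are typically supplied separately via environment variables (for example `OPENAI_API_KEY` or `ANTHROPIC_API_KEY`) rather than committed to the config file.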

Use Case & Target Audience

Use Case

  • Developers who want quick code generation, explanations, and one-step fixes without switching to a web UI.
  • Teams leveraging LLMs in CI/debug flows to triage test failures, generate patches, or create reproducible dev steps.
  • Open source maintainers who need an interactive assistant for repo onboarding and repetitive tasks.
  • Engineers who prefer working in the terminal and want aesthetic, keyboard-driven UIs.

Target Audience

  • Backend and CLI-first developers comfortable with terminal workflows.
  • DevOps and SRE teams who want scripted, reproducible assistant interactions.
  • Developers exploring multi-model strategies and local tool integrations.
  • Open source contributors and maintainers looking for context-aware helpers.

Pros and Cons

Pros

  • Designed to live in the terminal — reduces context switching for developers.
  • Deep code awareness through LSP makes outputs more precise and actionable.
  • Flexible: multi-model support and extensible protocols enable experimentation and resilience.
  • Open source — community contributions and transparency on implementation.

Cons

  • Some users report high resource / token usage depending on model choice—costs can add up for heavy usage.
  • Terminal UI and workflows may have a learning curve for users unfamiliar with TUI patterns.
  • As with any tool that edits code directly, suggested patches should be reviewed by a developer before applying, for both safety and security.

Final Thoughts

Crush (Charm Crush CLI) is an ambitious effort to put LLM capabilities where many developers already spend most of their time — the terminal. It excels at embedding AI into real dev workflows with context awareness, model flexibility, and extensibility. Try it on a small repo or in an experimental session to evaluate costs, behavior, and how it fits into your team's review processes.