# Quack
A native macOS AI chat client that connects to multiple LLM providers from a single interface, with full support for MCP (Model Context Protocol) tool use.

## Features
- Multi-provider support -- Chat with models from OpenAI, Anthropic, Google Gemini, Vertex AI, Ollama, Apple Intelligence (on-device), OpenRouter, Groq, Together, Mistral, and any OpenAI-compatible endpoint.
- MCP integration -- Connect external MCP servers for tool use with a three-tier permission model (Always Allow, Ask, Deny) and per-session server selection.
- Assistants -- Create reusable presets that bundle a provider, model, system prompt, parameters, and MCP servers together.
- Chat management -- Persistent conversation history, session pinning, archiving, search, and per-session model/parameter overrides.
- Streaming responses -- Live token streaming with reasoning/thinking model support and collapsible reasoning display.
- Markdown rendering -- Renders LLM output as rich Markdown, including code blocks, tables, lists, and more.
- Secure credentials -- API keys stored in the macOS Keychain.
- Auto-updates -- Built-in update mechanism via Sparkle.
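
The three-tier permission model for MCP tool use can be sketched as a small state machine. This is an illustrative sketch only, not Quack's actual API; the names `ToolPermission` and `shouldInvoke` are hypothetical.

```swift
// Hypothetical sketch of the three-tier tool-permission model
// (Always Allow, Ask, Deny) described above.
enum ToolPermission {
    case alwaysAllow  // run the tool without prompting
    case ask          // prompt the user before each call
    case deny         // never run the tool
}

/// Decides whether a tool call may proceed, given the stored permission.
/// For `.ask`, the confirmation prompt is only evaluated when needed.
func shouldInvoke(_ permission: ToolPermission,
                  userApproved: @autoclosure () -> Bool) -> Bool {
    switch permission {
    case .alwaysAllow: return true
    case .deny:        return false
    case .ask:         return userApproved()
    }
}
```

Because `userApproved` is an autoclosure, the prompt would only be shown for tools in the `.ask` tier; `.alwaysAllow` and `.deny` short-circuit without user interaction.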

## Requirements
- macOS 26.0 or later
- Xcode with Swift 6.0 support