The free AI already on your Mac.
macOS Tahoe ships with a 3B-parameter LLM. apfel gives you CLI access with one brew install. No model downloads, no API keys, no configuration. It just works.
$ brew install Arthur-Ficial/tap/apfel
1.9k+ stars on github.com/Arthur-Ficial/apfel
Requires: Apple Silicon · macOS Tahoe · Apple Intelligence enabled
Everything runs on your device. No network calls. No data leaves your Mac.
From brew install to AI output in 12 seconds. No configuration required.
Other local AI tools are great but require model downloads and configuration. apfel needs nothing - the model is already on your Mac. One brew install. You're done.
A CLI, an OpenAI-compatible server, and an interactive chat. All from a single brew install. No extra downloads.
Pipe-friendly and composable. Works with jq, xargs, and your shell scripts. stdin, stdout, JSON output, file attachments, proper exit codes.
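A sketch of what that composability looks like. The exact shape of apfel's JSON output is an assumption here, so the pipeline below stages a sample response the way `apfel --json "..."` might emit it, then extracts the text field with jq:

```shell
# Assumed response shape: an object with a "text" field.
# A real run would be:
#   apfel --json "What is 1+1?" | jq -r '.text'
sample='{"text":"2","model":"apple-foundationmodel"}'
echo "$sample" | jq -r '.text'
```

Because output goes to stdout and failures map to exit codes, the same pattern drops into xargs loops and CI scripts unchanged.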
Drop-in replacement at localhost:11434. Point any OpenAI SDK at it and go. Streaming, tool calling, CORS, response formats.
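Before reaching for an SDK, the endpoint can be exercised with plain curl. A minimal sketch, assuming the standard OpenAI chat-completions path and the model name apfel advertises; the script validates the request body locally, and the curl call itself is shown commented since it needs the server running:

```shell
# Request body per the OpenAI chat-completions spec.
body='{"model":"apple-foundationmodel","messages":[{"role":"user","content":"What is 1+1?"}]}'

# Sanity-check the JSON before sending.
echo "$body" | python3 -m json.tool >/dev/null && echo "request body OK"

# With apfel's server running on localhost:11434:
# curl -s http://localhost:11434/v1/chat/completions \
#      -H "Content-Type: application/json" -d "$body"
```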
Multi-turn conversations with automatic context management. Five trimming strategies. System prompt support. All on your Mac.
Give Apple's on-device model tools. Any tools.
apfel speaks the Model Context Protocol. Point it at any MCP server and the on-device model gets tools - math, APIs, databases, anything you can write a server for.
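A minimal sketch of the wiring (the `--mcp` flag appears in apfel's docs; the server path and prompt here are made-up placeholders):

```shell
# Connect an MCP server, then ask a question its tools can answer.
# apfel --mcp "/path/to/calculator-mcp-server" "What is 17 * 23?"
```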
Not a single HTTP call while using apfel. Every claim below links to Apple's own documentation.
The FoundationModels framework runs entirely on your Apple Silicon Neural Engine. No server, no cloud, no internet connection needed.
Apple does not collect prompts, responses, or any usage data from on-device inference. Nothing is stored, nothing is sent.
Apple trains their foundation models on licensed and public data only. Your interactions are never part of the training pipeline.
No analytics, no telemetry, no automated update checks, no crash reporting, no calling home. The only network call apfel ever makes is when you explicitly run apfel --update. The entire codebase is open source.
Apple built an LLM into every Mac. It's just sitting there. apfel gives it a front door - with zero configuration.
Starting with macOS 26 (Tahoe), every Apple Silicon Mac includes a language model as part of Apple Intelligence. Apple exposes it through the FoundationModels framework - a Swift API that gives apps access to SystemLanguageModel. All inference runs on the Neural Engine and GPU. No network calls, no cloud, no API keys. The model is just there.
Out of the box, the on-device model powers Siri, Writing Tools, and system features. There is no terminal command, no HTTP endpoint, no way to pipe text through it. The FoundationModels framework exists, but you need to write a Swift app to use it. That is what apfel does.
apfel is a Swift 6.3 binary that wraps LanguageModelSession and exposes it three ways: as a UNIX command-line tool with stdin/stdout, as an OpenAI-compatible HTTP server (built on Hummingbird), and as an interactive chat with context management.
It handles the things Apple's raw API does not: proper exit codes, JSON output, file attachments, five context trimming strategies for the small 4096-token window, real token counting via the SDK, and conversion of OpenAI tool schemas to Apple's native Transcript.ToolDefinition format.
Shell scripts in the demo/ folder. Install apfel first, then grab the ones you want.
Natural language to shell command. Say what you want, get the command.
Pipe chains from plain English. awk, sed, sort, uniq - generated for you.
Narrates your Mac's system activity like a nature documentary.
Explain any command, error message, or code snippet in plain English.
What's this directory? Instant project orientation for any codebase.
Summarize recent git commits in a few sentences.
Change one URL. Keep your code.
apfel speaks the OpenAI API. Any client library, any framework, any tool that talks to OpenAI can talk to your Mac's AI instead. Just change the base URL.
```python
from openai import OpenAI

# Just change the base_url. That's it.
client = OpenAI(
    base_url="http://localhost:11434/v1",
    api_key="unused",  # no auth needed
)

resp = client.chat.completions.create(
    model="apple-foundationmodel",
    messages=[{"role": "user", "content": "What is 1+1?"}],
)
print(resp.choices[0].message.content)
```
From zero to 1,909 stars and counting.
893 stars on April 3, 552 on April 4. Starred by engineers from Apple, Google, VMware, NVIDIA, and Grafana.
Data as of April 5, 2026
No model downloads. No sign-ups. No API keys. Just this.
$ brew install Arthur-Ficial/tap/apfel
$ apfel "Hello, Mac"
$ git clone https://github.com/Arthur-Ficial/apfel.git
$ cd apfel && make install
Questions scraped from Hacker News and Reddit.
Server security: token auth (--token), origin checking, debug-only log endpoints. Details in server-security.md.
Updates: apfel --update (v0.7.7+). Detects install method, checks for a newer version, prompts before upgrading. Manual: brew upgrade apfel or git pull && make install.
Tool calling: pass --mcp <server> to connect tools. See the tool calling guide.
Network use: the only call apfel makes is apfel --update, which checks Homebrew for a newer version. Open source and auditable.
Tools built on top of Apple's on-device AI.
Native macOS SwiftUI debug GUI. Chat with Apple Intelligence, inspect requests and responses, logs, speech-to-text, text-to-speech - all on-device.
AI-powered clipboard actions from the menu bar. Fix grammar, translate, explain code, summarize - one click on any selected text.
Under Heavy Development