Apfel Unlocks Free Apple Intelligence on Your Mac
Apple ships a free LLM on every Mac. Apfel is the tool that finally lets you use it without Siri or cloud APIs. Local AI just got real.
The Free AI Already Sitting on Your Mac
Every recent Apple Silicon Mac ships with a built-in large language model. Apple calls it "Apple Intelligence," but it has been reachable mainly through Siri, a voice assistant most developers never open. A new open-source tool called apfel just changed the game by unlocking this AI for direct use.
What is Apfel?
Apfel is a CLI tool, OpenAI-compatible server, and chat interface that talks directly to Apple's on-device LLM. No API keys. No cloud. No per-token billing. The AI is already installed on your Mac; apfel just gives you access to it.
Key features:
- 100% on-device processing
- Zero cost (no subscriptions)
- OpenAI API compatible
- Works via CLI, server, or chat interface
- Install with: brew install Arthur-Ficial/tap/apfel
Why This Matters for Developers
For months, developers have been paying OpenAI, Anthropic, and Google for API access to models like GPT-4 and Claude. Meanwhile, a capable LLM was sitting unused on their MacBooks. Apfel exposes this resource for:
- Local development - Test prompts without API costs
- Privacy-sensitive work - Data never leaves your machine
- Offline coding - AI assistance without internet
- Cost reduction - Zero inference costs
How It Works
Apple's Neural Engine powers the on-device model. Apfel acts as a bridge, translating between the model and standard interfaces developers already use. The tool supports:
- CLI mode - Pipe text directly to the model
- Server mode - OpenAI-compatible HTTP endpoint
- Chat mode - Interactive terminal interface
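Because server mode speaks the standard OpenAI chat-completions protocol, any OpenAI-style client can point at it. Below is a minimal sketch using only Python's standard library; note that the port, route, and model name here are assumptions for illustration, not taken from apfel's documentation, so check apfel's own help output for the real values.

```python
import json
import urllib.error
import urllib.request

# Assumption: apfel's server mode listens locally and exposes the
# standard OpenAI route /v1/chat/completions. Adjust host/port to
# whatever apfel actually reports when you start it.
BASE_URL = "http://localhost:8080/v1/chat/completions"


def build_request(prompt: str) -> urllib.request.Request:
    """Build a standard OpenAI-style chat-completion POST request."""
    payload = {
        "model": "apple-on-device",  # placeholder model id (assumption)
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        BASE_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_request("Summarize the benefits of on-device inference.")
try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        body = json.load(resp)
        # Standard OpenAI response shape: choices[0].message.content
        print(body["choices"][0]["message"]["content"])
except (urllib.error.URLError, OSError):
    print("apfel server not reachable; start it in server mode first")
```

The same request body works with any OpenAI SDK by overriding the client's base URL, which is exactly what makes the OpenAI-compatible mode useful: existing tooling needs no code changes beyond pointing at localhost.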
The Bigger Picture
This represents a shift in how we think about AI infrastructure. Cloud APIs won't disappear, but local inference is becoming viable for an increasing number of tasks. With tools like apfel, Ollama, and llama.cpp, developers now have genuine alternatives to cloud-only AI.
The hardware is already in our hands. The software is catching up. The economics are compelling. Local AI isn't the future; it's already here, sitting on your desk, waiting to be unlocked.
Originally discussed on Hacker News. Posted on ketchalegend blog.