Goodbye Claude, Hello Pi!
We all know the struggle. We hit limit after limit when using tools like Claude. It’s a fantastic tool, and it has already done many great things for me. But whenever I start working with Claude, I almost always reach the limit within the first hour or two. AI boosts my productivity by removing most of the excuses we make for ourselves, but the frequent limit bottleneck takes a chunk of that productivity right back.
After hitting my limit for the third time, I started looking into working with local LLMs. When I found out that you can make Claude Code work with local LLMs, I was beyond excited. But since I had no experience with that at the time, I quickly discovered it wasn’t the golden goose I’d hoped for.
A few more limits later, I found Pi. And I’ve never looked back at Claude Code.
A quick, yet important disclaimer
I’m not one of those tech influencers who wants to oversell something. Claude is still one of the best (if not the best) AI tools on the market right now. While using Pi, I work with a Copilot license and use Claude’s models through Pi. My goal with this article is to broaden your perspective: other tools are available, each with its own set of benefits over Claude. That’s my end goal, to make you aware of other tools and why you should give them a try.
What is Pi?
Pi is a coding agent that lives in the terminal. It’s a barebones tool, which means features you know and perhaps use in Claude are not available in Pi. And that is on purpose.
Meet Extensions
At first, I was skeptical due to the lack of features like subagents. Yet after some quick research, I understood why they’re omitted from the project. The goal is to stay barebones and keep it lean. If you want something, you can create an extension yourself. Chances are, though, that someone else already needed what you want, so you can also check the marketplace. There are already many extensions available (including ones that add subagents and more).
There are even extensions that tools like Claude or Copilot don’t have! They might not be useful for everyone, but they might just be the thing you’ve been missing all this time.
Extensions, skills, prompt templates, and themes are published to npm and installed with pi install npm:<package>. See each package’s docs for details.
How I made Pi my own
Because it’s so highly customizable, I decided to truly make it my own. With the aid of Pi itself, I started prompting my ideas and watched my custom extensions come to life. Thanks to AI, I was able to quickly develop multiple extensions, each with its own purpose!
No plan mode, no problem
I use Plan mode in Claude/Copilot all the time, yet I was rarely happy with the results it gave me. After completing Matt Pocock’s course, Claude Code for Real Engineers, I immediately wanted to implement that flow in Pi. So instead of re-implementing a plan mode, I asked Pi to create a PRD mode based on Matt’s /write-a-prd skill. A few minutes later, my custom PRD mode was implemented, and it fits perfectly into my current workflow.
What is a “todo”?
The concept of internal todos for the AI is not implemented by default in Pi. Not a problem; I always wanted more control over those todos anyway. I’ve tried Backlog.md, a Markdown-native task manager and Kanban visualizer for any Git repository, on a project and loved it, especially when collaborating with AI. After the PRD mode, I asked Pi to implement Backlog.md natively, and again, in a very short time, I was able to use it from within Pi. Now I can see all the tasks in my browser and follow live what my AI is doing.
You’ve reached your limit
The most hated sentence of every developer who uses Claude for coding in 2026. With Pi, you can connect different providers. So not only can you use Anthropic’s API, you can also use OpenAI or OpenRouter (which gives you access to a lot of different LLMs).
What makes this particularly interesting is that no matter which LLM you use from any provider, your tool stays the same. It’s even possible to use local LLMs. Although the results may be nowhere near as good as with the frontier models, you can still do some lightweight work with the same tool.
You have no vendor lock-in, and every customization you build is reusable with different LLMs. In the current AI landscape, I love this option, and for that reason alone, using Pi is a no-brainer for me.
Make it more me
I’m a geek, and when I can customize my tools, I make them as geeky as I am. So instead of showing a default “Working…” indicator, I now get a randomized geeky indicator instead. Does this solve a problem I had? No. Did it increase my productivity? Also no. Then why did I do it? Because I could.
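For flavor, here’s a minimal sketch of how such a randomized indicator could work. The phrases and the function are my own invention for illustration, not Pi’s actual extension API:

```typescript
// Hypothetical pool of geeky status phrases (my own, not from Pi).
const PHRASES: string[] = [
  "Reticulating splines…",
  "Reversing the polarity…",
  "Warming up the flux capacitor…",
  "Consulting the Oracle…",
];

// Pick one phrase at random each time the agent starts working.
function randomIndicator(phrases: string[] = PHRASES): string {
  return phrases[Math.floor(Math.random() * phrases.length)];
}

console.log(randomIndicator());
```

An extension would then hand this string to wherever Pi renders its working indicator; the exact wiring depends on Pi’s extension API.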