45 Rule Files Were Making My AI Worse
I've been building with AI agents for months. Custom rules, workspace knowledge, skill files — honestly, probably too much stuff. I had 45 rule files telling my AI how to behave. Naming conventions. Code patterns. Anti-patterns. Deployment checklists. Security gates.
Forty-five files.
And the AI was getting worse.
It wasn't broken, but the output was getting sloppy in ways I couldn't pin down. It would confidently reference a file path that didn't exist anymore. It would apply a pattern from one rule while ignoring a contradicting rule three files later. Code that technically followed the instructions but missed the point entirely.
I kept adding more rules to fix the problems the other rules were causing. Classic.
The measurement
Then I actually measured it. Each session was loading roughly 170KB of rule context — about 25,000 tokens — before I even said anything.
That's like handing someone a 50-page employee handbook and then asking them to fix a bug. They skim. They miss stuff. And whatever they don't see, they guess.
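That measurement is easy to reproduce. Here's a minimal sketch, assuming the rules live as Markdown files under one directory, and using a rough 4-characters-per-token heuristic (real tokenizers vary by model; the numbers above imply closer to 7 characters per token for these particular files):

```python
from pathlib import Path

# Rough heuristic: ~4 characters per token for English prose.
# Real tokenizers vary by model -- the 170KB / 25,000-token figure
# above works out closer to 7 characters per token.
CHARS_PER_TOKEN = 4

def audit_rules(rules_dir: Path, chars_per_token: int = CHARS_PER_TOKEN) -> tuple[int, int]:
    """Return (total_bytes, estimated_tokens) across all rule files."""
    total_bytes = sum(p.stat().st_size for p in rules_dir.rglob("*.md"))
    return total_bytes, total_bytes // chars_per_token
```

Running it before and after a cleanup turns "the output feels sloppy" into a concrete before/after number.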
There's a well-known psychology concept called Miller's Law — working memory can only hold about seven things, plus or minus two, before it starts dropping stuff. Context overload isn't just an AI problem. It's a human one.
I think that's what hallucination actually is a lot of the time. Not the AI being dumb — just the AI drowning in your instructions and filling in the gaps.
What I actually cut
So I made myself start deleting.
Not randomly. I went through every rule file and asked: is this contradicting another rule? Is there fragmentation — two files saying similar things slightly differently? Is this actually reducing hallucination risk, or just making me feel like I covered my bases?
If it was redundant, it got cut. If two rules overlapped, I merged them. If a file was longer than 5KB, it got split into smaller pieces that only load when relevant.
45 files became 30. 170KB became 72KB — roughly 14,000 tokens saved per session, by the same ratio as the original measurement.
The output quality improved immediately. Not because the rules were bad — because fewer rules meant the AI could actually follow the ones that mattered.
The lesson that keeps repeating
At some point, more structure just becomes more noise. The governance I built to prevent mistakes was creating new ones. The fix wasn't better rules. It was fewer of them.
The rule cleanup saved me about three days of accumulated drift. Not bad for deleting files.