The AI CoE Trap
Why Your AI Center of Excellence Is Slowing You Down. Eleven weeks for a meeting. Five months for code. One business unit that stopped waiting.
The head of operations at a major insurance carrier waited eleven weeks for a meeting.
Not strategy. Not politics. Just a scoping session with the AI Center of Excellence.
She had a claims workflow ripe for AI: three intake steps that could free adjusters for real judgment work. Process maps ready. Data available. Vendor shortlist vetted. She needed the CoE to confirm architecture and governance fit.
Eleven weeks.
The CoE lead apologized: demand outpaced capacity. New prioritization framework coming. Intake process: business case submission, three‑week portfolio committee review, then eight to twelve weeks for squad assignment.
Eleven weeks already gone, three more in committee, eight to twelve for a squad: five to six months total.
She didn’t wait. Bought a SaaS tool. Tested it. It worked. The CoE never knew.
This isn’t about bad teams. It’s about bad structure.
Centralization’s Promise—and Its Poison
“Gather talent, enforce standards, manage risk.”
That logic built shared services. It’s breaking AI.
Enterprises default to the CoE because AI feels specialized. Scarce skills. Real risks. Centralize to scale.
But centralization does two things: concentrates expertise and excuses the rest of the organization from building it.
Business units learn queues, not models. Requests, not experiments. They become customers, not practitioners. A 30‑person CoE can’t embed AI everywhere. Only distributed fluency can—and the CoE blocks it.
Queues Are the First Crack
Demand feels like success. It’s actually rationing.
High backlog? Momentum! Wrong. Finite CoE capacity forces every unit through one door. Queues form. Waits grow.
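Little's Law makes the rationing concrete. A sketch with hypothetical numbers, not figures from any real program:

```python
# Little's Law: average wait = items in queue / completion rate.
# Illustrative numbers: a central team shipping 2 use cases a month
# against a 40-item backlog makes every new request wait ~20 months.
backlog = 40      # queued use cases (illustrative)
throughput = 2    # use cases the CoE completes per month (illustrative)

avg_wait_months = backlog / throughput
print(f"Average wait: {avg_wait_months:.0f} months")  # 20 months
```

Every intake form added to "manage demand" raises the numerator. Only distributed capacity raises the denominator.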
Outcomes:
Helplessness: teams accept the line.
Shadow AI: teams buy tools quietly. Deploy. Learn. Stay invisible.
Governance misses shadow work because it can’t see it. And queued teams stop submitting ideas—they’ve learned it’s not worth the fight. Pipeline shrinks. CoEs call it “idea exhaustion.”
The Capability Trap
Two years in: elite center, naive edges.
CoE data scientists excel. ML engineers harden models. Business partners? A few lunch‑and‑learns and newsletters.
Generative AI hits. Distributed orgs pivot fast—their people knew enough to experiment. Centralized ones lag: CoE retrains, reorients, restarts. Workforce waits.
Centralize the function, centralize the learning. Enterprise stays AI‑naive, now dependent on its specialists.
The Center Protects Itself
Metrics reward delivery. Not obsolescence.
CoE KPIs: use cases shipped, savings claimed, compliance targets hit. Nothing measures distributed skill.
Teams optimize what’s measured. Governance thickens. Standards demand central nods. Dependency deepens—indistinguishable from “success.”
Structure breeds this. Stable teams claim knowledge monopolies. No arrogance required.
DevOps Called This
Proof: software delivery data.
DORA’s decade of research: top performers spread capability through communities of practice, shared templates, replication at the edges. CoEs? Proof-of-concept hell, then stall.
RPA programs repeated the pattern. Winners embedded capability early. Losers centralized delivery.
Why CoEs Persist
Legibility over scale.
Finance: one budget line vs. distributed mess.
Execs: clear ownership story.
Risk: central gatekeepers feel safe.
Result: AI as a controlled project, not organizational muscle.
Fix It Now
Redefine the CoE’s mission.
Measure transfer, not output.
Track, as sketched below:
Embedded practitioners trained.
Independent deployments using CoE standards.
Self‑certified use cases.
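A minimal sketch of what those numbers could look like as a quarterly snapshot. Every name here is illustrative, not a standard:

```python
from dataclasses import dataclass

@dataclass
class TransferSnapshot:
    """One quarter of capability-transfer metrics (illustrative fields)."""
    practitioners_trained: int     # embedded practitioners certified this quarter
    independent_deployments: int   # shipped by business units, no CoE squad
    coe_led_deployments: int       # shipped by the central team itself
    self_certified_use_cases: int  # cleared governance via self-service rules

    @property
    def independence_ratio(self) -> float:
        """Share of deployments the center did NOT have to build.
        Rising = transfer is working. Flat = dependency."""
        total = self.independent_deployments + self.coe_led_deployments
        return self.independent_deployments / total if total else 0.0

# Example: 4 of 10 deployments this quarter shipped without the CoE.
q = TransferSnapshot(practitioners_trained=12, independent_deployments=4,
                     coe_led_deployments=6, self_certified_use_cases=9)
print(f"Independence ratio: {q.independence_ratio:.0%}")  # 40%
```

The ratio is the point: if it is not rising quarter over quarter, the center is shipping, not transferring.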
Embed from day one. Recruit AI practitioners into business units, working beside the domain experts. They bridge ops and AI.
Standards over approvals. Self‑certify against clear rules. Audit after.
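One way "self-certify against clear rules" can work mechanically, sketched as policy-as-code. The rules, field names, and vendor list are assumptions for illustration:

```python
# Hypothetical self-certification gate: the business unit supplies the
# facts, the rules decide, and the record goes to audit, not a committee.
APPROVED_VENDORS = {"vendor-a", "vendor-b"}  # maintained centrally, checked locally

RULES = {
    "no_pii_in_training_data": lambda uc: uc["pii_in_training_data"] is False,
    "human_review_of_outputs": lambda uc: uc["human_review_of_outputs"] is True,
    "vendor_on_approved_list": lambda uc: uc["vendor"] in APPROVED_VENDORS,
    "rollback_plan_documented": lambda uc: bool(uc["rollback_plan"]),
}

def self_certify(use_case: dict) -> tuple[bool, list[str]]:
    """Return (passed, names_of_failed_rules). Audit happens after, not before."""
    failed = [name for name, rule in RULES.items() if not rule(use_case)]
    return (not failed, failed)

passed, failed = self_certify({
    "pii_in_training_data": False,
    "human_review_of_outputs": True,
    "vendor": "vendor-a",
    "rollback_plan": "Disable the connector; revert to manual intake.",
})
print("certified" if passed else f"blocked: {failed}")
```

The gate runs locally. The center reads the audit trail, not the queue.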
Build infrastructure. Model libraries. Eval frameworks. Templates. Make replication easy.
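One concrete shape that infrastructure can take: a shared eval harness the center publishes once and any unit runs against its own model. The names and threshold are illustrative:

```python
from typing import Callable

# Hypothetical shared eval template: the CoE ships the harness and the
# bar; business units plug in their own model and golden test cases.
def run_eval(model: Callable[[str], str],
             cases: list[tuple[str, str]],
             pass_threshold: float = 0.9) -> bool:
    """Score a model against golden cases. True means ship-ready by the
    shared standard, with no central review required."""
    hits = sum(1 for prompt, expected in cases if expected in model(prompt))
    score = hits / len(cases)
    print(f"{hits}/{len(cases)} passed ({score:.0%}); threshold {pass_threshold:.0%}")
    return score >= pass_threshold

# A business unit wires in its own (here, stubbed) model and cases.
stub_model = lambda p: "route to adjuster" if "injury" in p else "auto-approve"
golden = [("claim mentions injury", "adjuster"),
          ("routine glass claim", "auto-approve")]
run_eval(stub_model, golden)
```

The center maintains the harness and the bar. The units own the models.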
The Test Question
If your CoE disappeared tomorrow, what AI work continues?
Little? You built dependency.
Winners measure success as declining need for the center. Their units experiment. Practitioners connect. Governance enables speed.
That’s transformation. Not queues.