AI Consulting for Professional Services Firms: What Actually Works for 5–50 Person Teams

Most small firms do not have an AI problem. They have a process problem with AI bolted on top. Here is what the engagement should actually look like.

2026-04-17  ·  11 min read  ·  by Godfrey Tundube

Last month I looked closely at thirty small professional services firms. Mostly law, a few accounting, a couple of recruiters. Five to fifty people each.

Every one of them said they had an AI problem.

Not one of them actually did.

What they had was a process problem with AI bolted on top. An intake problem. A handoff problem. A status-update problem. A document-collection problem. AI had become the thing they were told to buy to fix the other thing. But the other thing was not software. It was the operating system.

If you run a professional services firm and you are evaluating AI consulting, this piece is for you. It is written from the point of view of someone who has sat inside enough of these firms to know where the money actually bleeds out, and what kind of help is worth paying for.

Why AI Consulting for Professional Services Firms Usually Fails

The problem is not that AI does not work in professional services. It works fine. The problem is how the consulting engagement is shaped.

Most AI consulting for professional services firms follows the same pattern. A consultant or team arrives. They run discovery interviews. They produce a maturity assessment. They give you a deck with three pillars, a quadrant with your firm name in the corner, and a roadmap.

Then they tell you to "engage a specialist vendor for the implementation phase."

Which means you now need a second consultant to do the actual work.

Meanwhile the bottleneck that prompted the whole engagement, the thing the managing partner felt every week, is still sitting there untouched.

This happens because strategy and execution got separated in consulting a long time ago. The person who diagnoses the problem is not the person who builds the fix. Something always gets lost in the handoff. You end up with strategists who do not know how systems actually get built, and builders who do not know why they are building what they are building.

Nothing ships because nobody owns the full loop.

The Real Problem Is Not AI. It Is Process.

Here is what I see in almost every firm I look at:

The managing partner is the routing layer. Every new enquiry goes through one person because no system captures the logic they carry in their head. When they are busy, intake slows. When they are on holiday, it stops entirely.

Matter updates live in inboxes. Clients chase status instead of receiving it. Fee-earners spend an hour a day replying to "any news?" emails.

Document collection runs on memory and follow-up emails. Asset schedules, ID verifications, engagement letters. Clients send the wrong version, three threads end up with three slightly different document sets, and nobody has a single view of who is actually cleared to proceed.

Conflict checks run from memory. Not because the firm does not care. Because the system never made it easy to do properly.

Onboarding asks for the same details three times. Different people, different emails, different formats.

None of these are AI problems. They are operating system problems. AI layered on top of any of them would make things worse, not better, because AI amplifies whatever process it touches. Good process becomes great. Broken process becomes expensive and fast.

The 10 Operational Leaks Framework

When I audit a firm, I look for these ten leaks. If you recognise your firm in three or more of them, you do not need an AI strategy. You need an operational audit.

  1. Partner-as-routing-layer. Every enquiry flows through one person's head.
  2. Intake logic in memory. The questions asked on the intro call vary by who picks up.
  3. Handoff fog. When work moves from one person to another, nobody is sure who owns it.
  4. Status-update tax. Fee-earners spend 20 to 40 minutes a day replying to "any news?"
  5. Document chase loops. Onboarding documents take 2 to 4 weeks to collect.
  6. Conflict checks from memory. No deterministic workflow.
  7. Triple intake. The client gives the same details to three different people.
  8. Email-as-database. The source of truth for a matter is a named inbox thread.
  9. Approval drift. Simple approvals wait 48+ hours because nobody designed the queue.
  10. Silent escalation failure. When something goes wrong, nobody is paged.

If you want the expanded version, the 10 Operational Leaks guide walks through each one with diagnostic questions and cost estimates.

Every leak on that list is a process problem first and a technology problem second. The reason firms try to fix these with AI is that AI is the word their peer group uses. The word their vendors pitch. The word their board hears in the media.

The more honest word is architecture.

What Good AI Consulting Actually Looks Like at This Size

If you run a five-to-fifty person firm, the consulting model used by the Big 4 is not built for you. It was built for Microsoft rolling out a capability across forty thousand people. Six senior people in suits, an eighteen-month roadmap, a maturity curve.

You do not have eighteen months. You have a managing partner who is tired of being the routing layer, and some clients who are starting to feel the friction, and a competitor who just got a decent-looking client portal live last quarter.

At your size, the right engagement looks different. One operator, closed loop, no handoffs.

Here is the shape of it.

Stage 1. Diagnose the bottleneck

Where does work actually get stuck? Not where does work happen. Where does it get stuck. The answer is almost always a handoff, the moment where one person finishes and the next person is supposed to pick up, and there is friction in the middle. Map those moments. Three to five of them is usually enough.

Stage 2. Design the flow

For each bottleneck, define the inputs, the owner, the decision logic, the outputs, and the escalation path. This is the piece most firms have never done. The process became what one person does on a Tuesday afternoon. That works until you hit fifteen people.
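To make "flow spec" concrete, here is a minimal sketch of what one looks like when written down. The field names and the intake example are illustrative, not lessClick's actual template; the point is that every handoff gets an explicit owner and escalation path instead of living in someone's head.

```python
from dataclasses import dataclass

@dataclass
class FlowSpec:
    """A minimal handoff specification: who owns what, and what happens on failure."""
    name: str
    inputs: list[str]       # what must exist before work starts
    owner: str              # the single accountable person or role
    decision_logic: str     # the rule applied, written down, not carried in memory
    outputs: list[str]      # what the next stage receives
    escalation_path: str    # who is notified when the flow stalls

# Illustrative example: the intake handoff most firms have never written down
intake = FlowSpec(
    name="new-enquiry-intake",
    inputs=["enquiry form", "conflict check result"],
    owner="intake coordinator",
    decision_logic="route by matter type; anything ambiguous goes to the duty partner",
    outputs=["matter record", "engagement letter draft"],
    escalation_path="duty partner if unactioned for 24 hours",
)
```

Five fields, one page per handoff. Most firms discover that writing three of these exposes more than any maturity assessment did.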

Stage 3. Ship the fix

Built. Wired into your existing tools. Documented. Running. Not a roadmap to a build. The build. Usually in four to six weeks.

The critical difference from a typical AI consultancy is that the person designing the flow is the person shipping the fix. The strategy adjusts to what is actually possible. The build adjusts to what actually matters. You end up with something you can use on a Wednesday, not something you have to re-commission in six months.

How to Tell If a Firm Is Ready for AI

Not every firm is ready. Some need to fix their process first. Some need to hire a person before they automate anything. Some need to clean up their data.

Here is a simple readiness test.

You are ready for AI if:

  • You have a clear picture of where work gets stuck weekly
  • You can name the three handoffs that waste the most time
  • You have one decision-maker who can green-light a build without committee
  • Your tools (case management, CRM, finance) are stable enough to integrate with

You are not ready for AI if:

  • You cannot describe the flow of a new client from enquiry to engagement letter
  • Intake logic lives exclusively in one person's head
  • Three different partners would answer the same process question differently
  • You are already on a "digital transformation" programme with another vendor

If the second list describes your firm, the first spend should not be on AI. It should be on operational architecture. Clean the flow. Then layer the intelligence on top.

What To Ask a Prospective AI Consultant

Most of the AI consulting market is oriented around selling you a product, a subscription, or a roadmap. A few simple questions cut through it.

  1. "Will the person diagnosing the problem also be the person building the fix?" If no, you are about to pay twice.
  2. "What does your deliverable look like in four weeks?" If the answer is a deck or a roadmap, you are in theatre.
  3. "Can you show me something you have shipped for a firm my size?" Not logos. A screenshot of a working system.
  4. "If I engage you and the fix is not AI, what happens?" If the answer is awkward silence, they are a tool vendor with consultant framing.
  5. "How do you scope a first engagement?" If it is a six-month discovery, walk. If it is a two-week diagnostic followed by a scoped build, keep talking.

The last one is most revealing. Firms that can diagnose a professional services operation in two weeks have seen enough of them to know what to look for. Firms that need six months have not.

The Positioning Shift Worth Making Internally

If you are the managing partner reading this, there is one positioning shift that makes the AI question much easier to answer.

Stop asking "how do we use AI?"

Start asking "where is the handoff failing?"

The first question leads to tool shopping. The second leads to architectural thinking. And once you have done the architectural thinking, the AI question answers itself. You will know which handoffs need a rule, which need a form, which need a queue, and which actually benefit from intelligence.

Half the things firms try to solve with AI right now do not need AI at all. They need a properly designed form and a sensible rule. The remaining half become much cleaner to build once the process underneath them is sound.
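A "sensible rule" is often a few lines of deterministic logic, not a model. Here is a sketch of the approval-drift fix from the leaks list; the 48-hour threshold and field names are illustrative assumptions, not a universal standard.

```python
from datetime import datetime, timedelta

# Illustrative SLA -- each firm sets its own threshold
APPROVAL_SLA = timedelta(hours=48)

def needs_escalation(submitted_at: datetime, approved: bool, now: datetime) -> bool:
    """Deterministic queue rule: escalate any approval that has waited past its SLA.

    No model, no inference -- just the rule the firm already believes in,
    finally written down and run automatically.
    """
    return (not approved) and (now - submitted_at > APPROVAL_SLA)
```

Once a rule like this is running, you can see which approvals genuinely need judgment and which were only ever waiting for someone to notice them.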

AI is not the differentiator anymore. Architecture is.

How lessClick Works With Firms Like Yours

We are not an AI consultancy. We are an operational architecture practice that happens to ship AI and automation as the final layer. The work moves in three stages.

Diagnose. A two-week engagement. We map your current flow, identify where work gets stuck, and tell you where the highest-leverage fix lives. You get a written recommendation with a scoped cost and timeline, and a specific call on whether the fix is process, tooling, or decision rights.

Design. For the chosen fix, we produce the flow spec. Inputs, owners, decision logic, outputs, escalation paths. This is the piece most firms have never commissioned, and it is usually the highest-leverage deliverable in the engagement.

Ship. We build it, wire it into your existing stack, document it, and hand it off running. Four to six weeks for a typical scope. Thirty days of post-launch stabilisation included.

One operator. Closed loop. No handoffs.

Start With the Diagnostic

If you have read this far and some of it describes your firm, the next step is the two-week diagnostic. £5K. You get a written operational audit, a map of the leaks, and a scoped recommendation of the single highest-leverage fix.

About 60% of diagnostics convert into build engagements. The ones that do not still get a useful artefact: a clear picture of where their operating system is breaking, and a specific plan for fixing it. That alone pays back the engagement.

AI consulting for professional services firms, done properly, starts with honesty about what the problem actually is. For most five-to-fifty person firms, the problem is not AI. The problem is that the process was never designed. Once you have designed it, the rest is just wiring.