
AI maturity model

Where your business sits today, and how it gets to AI-productive.

A five-stage model for privately held companies. Use it to figure out where the business is right now, what moves it to the next stage, and what an outside engagement adds.

The five stages

From Ad Hoc to Leading.

Map your business to the stage that best matches its weakest dimension, not its strongest. A company with one strong pilot but no data infrastructure is Stage 2, not Stage 3.

  1. Stage 1

    Ad Hoc

    No intentional AI strategy. Employees may use ChatGPT or Copilot individually, but there's no visibility into it, no process, and no measurement. Excel still drives most decisions and data lives in silos.

    Key signals

    • Owner first heard about AI from a trade publication or peer
    • No dedicated AI or automation budget line
    • IT (if it exists) is in reactive maintenance mode
    • Pain points are well-known but workarounds are the norm

    What moves to Stage 2: Owner awareness plus willingness to dedicate even a small amount of time to exploring one use case.

  2. Stage 2

    Aware

    Someone in the company has started experimenting with AI tools. There may be one or two informal pilots — ChatGPT for emails, a basic chatbot, an AI module inside an existing system. No formal process, no ROI tracking, no rollout plan.

    Key signals

    • Owner has personally tried at least one AI tool
    • One or two employees using AI independently
    • No budget formally allocated, but the owner is open to discussion
    • Data cleanup or integration is recognized as a prerequisite but has not started

    What moves to Stage 3: A defined use case with a clear ROI target, and an internal owner accountable for making progress.

  3. Stage 3

    Developing

    One or two active AI initiatives with real ownership. A vendor is engaged or being evaluated, a budget is allocated (even if small), and someone internally is accountable for the outcome. Early wins may be live or in final piloting.

    Key signals

    • At least one AI tool generating measurable output — time saved, cost reduced, volume handled
    • Owner directly involved in at least one initiative
    • Basic data infrastructure exists, even if not yet clean
    • 90-day milestone thinking is in place

    What moves to Stage 4: Translating pilot success into a replication model for other departments or use cases.

  4. Stage 4

    Scaling

    Multiple AI use cases live across two or more departments. Results are measured and reported. Integration with core systems (ERP, TMS, CRM) is at least partially in place. AI is becoming part of how the business operates.

    Key signals

    • Three or more AI tools actively used across the business
    • ROI is tracked and reported to ownership
    • AI is part of onboarding or standard operating procedure in at least one function
    • Owner is looking at what to automate next, not what to try first

    What moves to Stage 5: Optimization and continuous-improvement loops; at Stage 4, AI decisions still tend to be reactive rather than proactive.

  5. Stage 5

    Leading

    AI is embedded in daily operations and drives competitive advantage. Continuous-improvement loops are in place. The company can demonstrate specific revenue or margin advantage attributable to AI, and is cited in its industry as a technology adopter.

    Key signals

    • AI contributes measurably to revenue growth or cost efficiency at the company level
    • Internal team can evaluate, onboard, and operate new AI tools without heavy external support
    • Feedback loops are in place — output quality is measured and acted on
    • The company is starting to build or customize proprietary tooling

    Ongoing risk: Staying current as models and tools evolve. A competitive advantage built on a specific tool can erode quickly.

Stages 1 and 2 — the DIY zone

Where most companies stall.

Most Stage 1 and 2 companies try a DIY path first: assign AI to internal IT, partner with a single SaaS or Copilot vendor, see what happens. That path is reasonable in principle. The failure modes are predictable.

  • Vendor selection thrashes. With no benchmark for what "good" looks like in your industry, the comparison runs three to six months and usually closes on whoever pitched last.
  • Integration burden lands on lean IT. Your IT team is already keeping the business running. Asking them to also evaluate, integrate, and operate a new AI workflow on top of that is how internal projects stall mid-build.
  • Opportunity cost compounds. A six-month rollout instead of a six-week one is a half-year of revenue or efficiency unrealized, and a half-year of competitors who shipped first.

An outside engagement adds the analytical depth shown on the founding engagement page: vendor research that compares against operators of your size, an integration architecture that respects what IT can absorb, and a commercial structure that locks the timeline.

How to use this model

For self-assessment, for roadmaps, for conversation.

Self-assessment. Pick the stage that matches your weakest dimension. Most companies score higher than they actually are because one strong pilot masks weaknesses in data, ownership, or rollout planning.
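The weakest-dimension rule amounts to taking a minimum, not an average. A minimal sketch, where the dimension names and scores are illustrative assumptions rather than part of the model itself:

```python
# Stage names from the five-stage model.
STAGES = {1: "Ad Hoc", 2: "Aware", 3: "Developing", 4: "Scaling", 5: "Leading"}

def assess(scores: dict[str, int]) -> str:
    """Overall stage is the MINIMUM across dimensions (each scored 1-5),
    so one strong pilot cannot mask weak data or ownership."""
    stage = min(scores.values())
    return f"Stage {stage} — {STAGES[stage]}"

# Hypothetical dimensions: a strong pilot (3) with weak data and
# ownership (2) still places the company at Stage 2, not Stage 3.
print(assess({"pilots": 3, "data": 2, "ownership": 2, "measurement": 2}))
# → Stage 2 — Aware
```

Averaging the scores instead would report this company as Stage 2.25 and round it toward Stage 3, which is exactly the overscoring the model warns against.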

Roadmaps. Each phase of work should advance one stage. Two-stage jumps in a single phase are usually unrealistic for a privately held company.

Conversation. Use stage names rather than numbers. "You're at Stage 2 — Aware" lands. "You're a 2 out of 5" reads like a grade.

Stage 1 to 2 is owner-dependent. If the owner isn't bought in, no amount of internal enthusiasm moves the needle. Address buy-in first.

Stage 3 to 4 is data-dependent. Most companies stall here because their data isn't clean or integrated enough to scale the pilot. Flag this early.

Where Vesper Digital fits

An outside engagement compresses the stage transitions.

The Assessment maps your business to a stage and produces a 12-month roadmap to the next one. Fixed scope, two-week baseline.

The Implementation is a fixed-fee build for the use cases the roadmap prioritizes. Eight-week baseline.

Operating Support extends month-to-month after Implementation if you want it. Most companies absorb operations in-house after the 30-day post-launch window.

Ready to talk?

A first conversation is 30 minutes. We'll tell you whether your profile lines up, and what an engagement would look like if it does.