
How to Scope an AI Project Before You Quote It

Colin Gillingham · 4 min read

ai-implementation · ai-consulting · ai-strategy · fractional-ai · enterprise-ai

The number gets written down before anyone understands what they're building, and then it gets treated as a commitment when reality shows up. That's a guess dressed up as an estimate.

Most AI project overruns are framing failures, not engineering failures. The work was scoped before the right questions got asked.

The question that changes everything

Before scope, you need to understand the decision the AI is replacing, not just the task.

"We want to automate X" is a task. But underneath every task worth automating sits a judgment call. Who makes that call today? How often? What happens when it's wrong? What data does the system have access to at the moment it decides?

If you can't answer those questions before the contract, you're estimating fiction.

I've seen a "two-week AI integration" turn into a five-month rebuild because nobody asked what the system should do when it wasn't sure. That's a product question, not a model question. It should have come up in the first 20 minutes.

Where judgment lives determines everything

Scoping isn't about estimating engineering time. It's about deciding, explicitly, where human judgment sits in the system.

Judgment can sit before the AI as input or context, inside it as prompt structure, or after it as a review layer. Where it lives determines complexity, build time, testing approach, and what breaks in production. If you haven't named those positions explicitly, you haven't scoped the project.

Five questions I ask before any number is written:

  • What decision is the AI making, and what data does it have when it decides?
  • What does wrong look like, and what's the acceptable error rate?
  • Who owns the output when the AI is wrong?
  • Is the data needed to run this system available today?
  • What does a successful pilot look like in 60 days, and who judges it?

Those five questions either sharpen the scope or reveal you're not ready to scope yet. Both are useful answers.

The data problem lives here

Most AI projects discover their data problem halfway through. The data that was supposed to exist doesn't. What exists isn't in the right format. Labels aren't clean. Historical records stop at 2022.

A proper scoping process surfaces this before the contract is signed.

I run a lightweight data audit on every engagement before committing to a timeline. Not a full data engineering review, just 45 minutes of the right questions: does the data required to run this system actually exist in usable form? That question, asked early, prevents months of scope creep.

If the data isn't ready, the scope needs to reflect the data readiness work. That's a different project with different timelines and different skills required.

Scope the discovery, not the delivery

Sometimes you do the work and the scope still shifts. AI systems behave differently in production. Edge cases surface. The business decides it wanted something else after seeing the first demo. Scoping should be iterative because the work itself is iterative.

The structure that works: two phases. First, a fixed discovery phase of four to six weeks to answer the five questions above and validate the data. Then, a delivery phase scoped after discovery closes. You quote the discovery. You don't quote delivery until you know what you're building.

Companies that want a number before discovery closes are asking you to absorb risk for decisions they haven't made yet. Sometimes that's acceptable, with the right contingency baked in. But both parties should understand that's what's happening.

Writing the number down before discovery doesn't make the scope real. It just creates an expectation you'll spend the next six months managing.
