
The Questions to Ask Before Any AI Vendor Demo

Colin Gillingham · 5 min
ai-consulting · ai-implementation · ai-strategy · fractional-ai · ai-leadership

Every AI vendor demo looks incredible. That's the point.

The demo is not designed to inform you. It's designed to make you feel like the decision is already made. By the time they hit the "live product" screen, you're mentally calculating budget, not asking questions.

That gap — between what a demo shows and what a product actually does — is where most AI implementations go wrong. I've watched companies spend six months integrating a tool that worked perfectly in a 30-minute slide deck and failed in every other context.

The questions below aren't gotchas. They're calibration. You're trying to figure out whether the capability you just watched is real, repeatable, and suited to your actual problem.

Ask about the data first

Before anything else: "What data is this demo running on?"

Most demos run on clean, pre-processed, vendor-curated datasets. Your data is messier, inconsistently labeled, and stored across four systems no one documented. Ask them to walk through what data prep you'd need to do before going live. If they can't answer specifically, that's information.

Follow with: "Can I see this on data that looks like mine?" If they hesitate, you now know something important about the product's real-world performance envelope.
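
If you want to walk in prepared, a few minutes spent profiling an export of your own records tells you what "data that looks like mine" actually means. Here's a minimal sketch, assuming a pandas-readable CSV export; the file name and the label column are hypothetical placeholders:

```python
# A rough pre-demo data profile. "crm_export.csv" and "status" are
# hypothetical: substitute an export from one of your own systems.
import pandas as pd

df = pd.read_csv("crm_export.csv")

print(f"{len(df)} rows, {df.shape[1]} columns")
print("Share of missing values per column:")
print(df.isna().mean().sort_values(ascending=False).head(10))

# Inconsistent labels often hide as case and whitespace variants of one value.
label_col = "status"  # hypothetical label column
normalized = df[label_col].astype(str).str.strip().str.lower()
print(f"{df[label_col].nunique()} raw labels collapse to "
      f"{normalized.nunique()} after normalization")

print(f"Duplicate rows: {df.duplicated().sum()}")
```

Whatever that profile shows you, that's the sample to put in front of the vendor, not their curated dataset.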

Ask what happens when it fails

Every AI system fails, and the question is how.

"Walk me through a failure case" should be a standard ask in every vendor conversation. A vendor who can answer this specifically — here's what breaks it, here's what the output looks like when it breaks, here's how you'd catch it — is a vendor who knows their product deeply.

Vague answers signal that edge cases haven't been mapped, or that they have and the vendor doesn't want to discuss them.

This connects directly to how to pick your first AI use case — the best candidates are ones where failure has a known, recoverable cost.
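
One way to make "walk me through a failure case" concrete is to bring your own. A rough sketch of the idea, where vendor_predict is a hypothetical stand-in for whatever API the vendor exposes and the edge cases are illustrative:

```python
# Bring-your-own failure harness. vendor_predict is a placeholder for the
# vendor's actual API; the edge cases below are illustrative, not exhaustive.
EDGE_CASES = [
    {"input": "", "note": "empty input"},
    {"input": "a" * 50_000, "note": "oversized input"},
    {"input": "Ünïcode & <html> artifacts", "note": "messy encoding"},
]

def vendor_predict(text: str) -> dict:
    """Placeholder for the vendor's API call."""
    raise NotImplementedError

for case in EDGE_CASES:
    try:
        result = vendor_predict(case["input"])
        # The interesting question: does the output carry a confidence score
        # or an error flag, or does it fail silently with a plausible answer?
        print(case["note"], "->", result.get("confidence", "no confidence field"))
    except Exception as exc:
        print(case["note"], "-> raised", type(exc).__name__)
```

The failures that raise loudly are the cheap ones. The failures that return a plausible-looking answer are the ones you need the vendor to explain.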

Ask about the human in the loop

"What decisions does your product make autonomously, and what does it surface for human review?"

AI systems exist on a spectrum from fully automated to fully advisory. Most vendors will position their product at whichever point on that spectrum sounds most impressive, so getting the actual answer requires pressing.

The follow-up matters more: "How do I change where that line sits?" If the autonomy threshold is hardcoded, you're accepting their risk tolerance, not yours. If it's configurable, find out how much control you actually have and how much engineering it takes to exercise it.
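
To make "where the line sits" concrete: in a well-designed product, the autonomy threshold is a configuration value you own, not a constant buried in the vendor's code. A minimal sketch of the shape that takes, with a hypothetical threshold and field names:

```python
# The autonomy line as configuration. The threshold and field names are
# hypothetical; the point is that the risk tolerance lives in your config.
REVIEW_THRESHOLD = 0.92  # your risk tolerance, adjustable per decision type

def route(prediction: dict) -> str:
    """Apply high-confidence outputs automatically; queue the rest for review."""
    if prediction["confidence"] >= REVIEW_THRESHOLD:
        return "auto-apply"
    return "human-review"

print(route({"confidence": 0.97}))  # auto-apply
print(route({"confidence": 0.80}))  # human-review
```

If exposing a knob like that requires a professional-services engagement, the product is configurable in name only.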

Ask about the integration honestly

"What does a typical integration timeline look like for a company our size?"

Listen for what they skip. Vendors talk about API connections. They don't talk about the two months of data pipeline work, the security review, the change management required to get your team using the thing. Ask specifically: "What are the top reasons implementations stall at your customers?"

If you've already done an AI audit, you'll know which of these friction points apply to you. If you haven't, run the audit before the demo meeting.

Ask for a reference who struggled

Any vendor will give you a reference customer who loves them. Ask for one who had a difficult implementation.

"Can you connect me with a customer who went through a rocky rollout and got to the other side?" This isn't cynicism — it tells you how the vendor behaves when things aren't going well. That's when the relationship matters most.

If they say all their implementations go smoothly, you're not getting an honest answer.

What to do with the answers

The questions are only useful if you're actually listening, not just running through a checklist.

What you're evaluating is whether the vendor engages with the hard parts of your actual situation or tries to get you back to the demo flow. A vendor who says "that's a good edge case, we've seen it before, here's how we handle it" is a different partner than one who says "I'll have our team follow up on that."

How to evaluate an AI consultant covers a similar signal: the real tell isn't competence in the pitch — it's candor when you push.

The questions are how you find out whether the product is anything like the demo.

Colin Gillingham
