AI makes it easier than ever to spin up prototypes, test landing pages, and understand feedback at scale…
That’s great, but here’s the trap: if you skip real customer conversations, you’re basically throwing darts without a dartboard.
Dr. Else van der Berg put it well: the idea that "we don’t need customer interviews anymore" is a dangerous one.
Our CEO Shannon Vettes sharpened it further: “Don’t outsource something that helps you understand your customer deeply. This is the gold in the mine.”
This isn’t theory.
It’s a reminder that AI is powerful, but it can’t replace the value of talking to your users.
If you want to build products people actually love (and not just shiny prototypes), you still need to hear their voices.
Let’s give AI its due. It’s not hype; it really does transform customer research.
That’s why we’ve invested heavily in AI inside Usersnap. Teams today don’t have the time or patience to manually sift through 5,000 survey answers. With AI, you can surface the big patterns in seconds.
Example: An e-commerce team used AI clustering in Usersnap and quickly uncovered that 18% of customer reviews mentioned “slow delivery.” That insight was buried under hundreds of different phrasings; AI pulled it together instantly.
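To make that clustering step concrete, here’s a minimal sketch of grouping open-text feedback by theme, using TF-IDF vectors and k-means on a handful of made-up reviews. It illustrates the general technique only; the sample data, cluster count, and libraries are assumptions, not Usersnap’s actual implementation.

```python
# Minimal sketch: group free-text reviews by theme so a recurring complaint
# like "slow delivery" surfaces as one cluster instead of hundreds of phrasings.
# Sample data and cluster count are made up; this is not Usersnap's implementation.
from collections import Counter

from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

reviews = [
    "Delivery took almost two weeks, way too slow",
    "Package arrived late again",
    "Love the product but shipping is so slow",
    "Great quality and a fast checkout",
    "Checkout was smooth and quick",
    "Slow delivery ruined the gift",
]

# Turn free-text reviews into TF-IDF vectors
vectors = TfidfVectorizer(stop_words="english").fit_transform(reviews)

# Cluster the vectors into a small number of themes
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

# Report each theme's share of all reviews, with an example phrasing
for cluster, count in Counter(labels).most_common():
    share = 100 * count / len(reviews)
    sample = next(r for r, l in zip(reviews, labels) if l == cluster)
    print(f'Cluster {cluster}: {share:.0f}% of reviews, e.g. "{sample}"')
```

The output is only as good as the text that goes in, which is exactly the point of the next section.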
But here’s the catch: garbage in, garbage out.
AI is only as strong as the input you feed it. And if the inputs are shallow, biased, or incomplete, the outputs will steer you in the wrong direction.
Shannon’s rule is simple: the moment you outsource customer understanding, you lose the gold.
Why?
Because interviews, surveys, and contextual observation reveal things AI can’t:
Example: A SaaS product team nearly killed a feature after early prototype testing flopped. But in follow-up interviews, they discovered the problem wasn’t demand; it was onboarding. Customers wanted the feature but couldn’t figure out how to use it. Without talking to users, they would have axed a valuable idea.
Skip these conversations, and your AI insights risk being precise but wrong.
AI works with the data you give it. If you only test with early adopters or a biased sample, the insights will reinforce their view, not your market’s reality.
Dashboards look impressive, but numbers alone don’t explain why people behave the way they do. 1,000 clicks might mean delight or confusion.
People behave differently in lab tests than in their daily lives. A prototype in a quiet office doesn’t capture the stress of a call center employee juggling five tools.
As Else said: the right question needs the right method, at the right stage.
AI is fast, but it doesn’t give you the full picture.
So what’s the alternative? Combine AI’s speed with human depth.
Dr. Else van der Berg breaks research into three buckets. Here’s how they look when you add an AI layer:
👉 Example: A fintech team combined methods:
Result: they not only fixed design flaws but also simplified product language, something they’d never have caught with prototypes alone.
AI doesn’t replace these methods. It amplifies them.
This is exactly why we built Usersnap the way we did:
Workflow Example: A product manager runs a micro-survey in Usersnap after launching a new feature.
AI clusters the open-text responses and highlights the top three recurring issues. The PM reviews them on Monday morning, connects the dots with interview notes, and takes action in the next sprint.
The result: speed + depth. You never lose the human context, and you don’t waste time drowning in raw data.
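As a sketch of that Monday-morning step, here’s what turning already-clustered survey responses into a top-three digest could look like. The theme labels and responses below are hypothetical stand-ins for whatever the AI clustering produces; the PM still reads the representative quotes and connects them to interview notes.

```python
# Hypothetical digest step: given open-text survey responses that an AI
# clustering pass has already tagged with a theme, list the top three
# recurring issues with one representative quote each for the PM to review.
from collections import Counter, defaultdict

# (theme, response) pairs -- made-up sample data standing in for clustered output
tagged_responses = [
    ("confusing setup", "I couldn't find where to enable the new feature"),
    ("confusing setup", "The setup guide skipped the permissions step"),
    ("slow export", "Exporting a report takes minutes now"),
    ("confusing setup", "Had to ask support how to turn it on"),
    ("missing integration", "Please add a Slack integration"),
    ("slow export", "CSV export timed out twice"),
]

counts = Counter(theme for theme, _ in tagged_responses)
quotes = defaultdict(list)
for theme, text in tagged_responses:
    quotes[theme].append(text)

print("Top recurring issues this week:")
for theme, count in counts.most_common(3):
    print(f'- {theme} ({count} mentions), e.g. "{quotes[theme][0]}"')
```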
Don’t fall into the trap of thinking AI makes research obsolete.
Use it to scale, speed up, and sharpen, but never skip the part where you actually talk to your customers.
That’s where the real gold is.
👉 Try Usersnap free and see how AI + human insight work together to drive better product decisions.