Prolific Builders

I help climate tech product managers and founders go from Idea to Decarbonization.

Feb 24 • 2 min read

Bad interviews show up as failed launches


Stop collecting feedback you can't use

You talked to customers, launched the product, and it didn't work.

By the time you found out, you'd already spent months and real money to get there.

This is especially painful for hardware builders.

The problem was the customer feedback you built on was full of Risky Data.

When running customer discovery, you're not looking for someone to say "Yes, I'll buy this".

You're looking for stories, workarounds, and frustration.

Here's how to run high signal customer interviews:

  • Spot 3 types of Risky Data that sound like signal but aren't
  • Don't trigger customer biases
  • Ask Bias-Blocking Questions that get you Reliable Data

Claude Skills that turn interviews into Product

Sign up for my free workshop

  • How to build 3 Claude Skills that discern pain from politeness
  • Never analyze a customer interview in a chat window again
  • Leave with your Interview Coach analyzing your own transcripts

Are you collecting Risky Data without knowing it?

You're in the interview. The customer is nodding. You feel like you have signal.

But there are three types of feedback that you should be skeptical of:

  1. Compliments: "This is great, I love it." Your customer is being polite. It tells you nothing about whether they'll change their behavior or open their wallet.
  2. Fluff: "I would use something like this." These are predictions about future behavior. People are terrible at predicting their own future behavior.
  3. Ideas: "We'd take it if you could add X." This feels like a buying signal. It's not. Customers own their problems. You own the solution.

If your notes are full of these, you don't have customer insight.

You're putting False Positives on the roadmap.
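The workshop above promises Claude Skills that do this automatically. As a toy illustration of the same idea, here's a minimal keyword pass that flags notes matching the three Risky Data types. The phrase lists are hypothetical examples, not a real detection model:

```python
# Hypothetical sketch: flag Risky Data in interview notes by keyword.
# Categories mirror the three types above; phrases are illustrative only.

RISKY_PATTERNS = {
    "compliment": ["love it", "this is great", "really cool"],
    "fluff": ["i would use", "i would definitely", "i'd probably"],
    "idea": ["if you could add", "you should add", "what if it"],
}

def flag_risky_data(note: str) -> list[str]:
    """Return the Risky Data categories a note matches, if any."""
    lowered = note.lower()
    return [
        category
        for category, phrases in RISKY_PATTERNS.items()
        if any(phrase in lowered for phrase in phrases)
    ]

notes = [
    "This is great, I love it.",
    "I would use something like this.",
    "We'd take it if you could add offline mode.",
    "Last month the sensor failed and we lost a week of data.",
]

for note in notes:
    flags = flag_risky_data(note)
    print(f"{'RISKY' if flags else 'keep '}: {note} {flags}")
```

Note which line survives: only the story about past behavior (the sensor failure) passes the filter. That's the kind of Reliable Data the rest of this piece is about.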

Here's why it keeps happening:

Ask a biased question, get a biased answer.

You're triggering False Positives by asking the wrong questions.

Watch out for these kinds of biases:

  • Say vs. Do Bias is triggered by: "Would you buy a product that does X?" — People say yes. Then they don't buy.
  • Social Desirability Bias is triggered by: "Do you think this is a good idea?" — Nobody wants to crush your dream to your face.
  • Optimism Bias is triggered by: "What would your dream product do?" — They describe a fantasy, not a problem they're actually dealing with today.

These questions feel natural in the moment.

That's exactly what makes them dangerous.

The Fix: Ask Bias-Blocking Questions

Reliable Data comes in the form of facts about past customer behavior, pain points and desires.

Ask Bias-Blocking Questions that anchor to the past, where real behavior lives:

  • "Tell me about the last time you dealt with [problem]."
  • "How do you handle that today? Any issues come up?"
  • "Why do you need to do that in the first place?"
  • "What happens if it doesn't get resolved?"

The answers to these questions provide you with Reliable Data you can use to build product.

The Takeaway

Bad interview questions show up as failed launches.

It happens because the questions you're asking trigger biases that make customers say what you want to hear.

The fix is Bias-Blocking Questions that anchor to past behavior, not future intentions.

Do that, and you'll know what to build before you build it—not after you launch it.

From Insight to Action

Never build alone. Use my free resources and workshops to build a real climate solution.


