Week One Labs
5/11/2026

User Interview Questions for MVP Validation: The 2026 Playbook

The exact user interview questions, structure, and rules that produce real signal for MVP validation. Includes a free interview question generator and recruiting playbook.


Most user interviews founders run produce noise. Polite users say polite things, founders hear what they want to hear, and 30 days later the team is building a feature nobody asked for. The questions are wrong, the framing is wrong, and the founder is doing 80% of the talking.

User interviews done well are the highest-leverage activity available before product-market fit. Done badly, they actively make decisions worse by manufacturing false confidence. The difference comes down to a small set of habits and a question structure that grounds every answer in past behavior instead of future intent.

I built a free user interview question generator that produces a custom script in 30 seconds based on your stage, audience, and primary focus. This post is the playbook behind it.

The single rule that fixes most bad interviews

Every question must be grounded in past behavior, not future intent. That is the rule. Internalize it and 80% of common interview mistakes go away.

Bad: "Would you use a product that does X?" Good: "Tell me about the last time you tried to solve this problem. Walk me through what happened."

Bad: "How much would you pay for Y?" Good: "What are you spending today, in money or time, to handle this?"

Bad: "How important would this feature be to you?" Good: "When was the last time you needed this and what did you do?"

Future intent is theater. People answer based on what they wish were true, what makes them sound smart, or what they think you want to hear. Past behavior is data. They cannot lie about what they already did because the evidence exists in their calendar, their bank account, and their Slack history.

This is the core insight of Rob Fitzpatrick's The Mom Test. If you only read one book on user research, read that one. It is short, sharp, and will make your interviews materially better within a week.

The script structure that produces signal

Every interview has the same skeleton regardless of stage. Roughly 30 to 45 minutes, structured in five sections.

The first 5 minutes are warm-up and context. Get them talking about themselves, their role, and the world they live in. Skip the small talk after a minute or two. The goal is to learn what their week actually looks like and where this problem sits.

The next 15 minutes are problem deep-dive. This is where you spend the most time. Ground every question in a specific past instance. "When was the last time this happened? Walk me through what you did. What was the most frustrating part? How long did it take? Who else was affected?" The pattern you are listening for is emotional reaction. If they get visibly frustrated recounting the story, the pain is real. If they describe it dispassionately, it is probably not a top-five problem for them.

The next 10 minutes are current workarounds. What are they doing today to solve the problem? How much time per week does it cost? What tools have they cobbled together? What have they paid for? The workaround is the real competition. If they have built a 17-tab spreadsheet to manage the problem, that spreadsheet is what you need to beat. If they have done nothing, the problem is probably not painful enough to drive a purchase.

The next 10 minutes are solution reaction (only in solution-validation interviews, not in early discovery). Describe the proposed solution briefly. Watch for the reaction in the first 5 seconds. Genuine curiosity (leaning in, asking how it works, wanting a demo) is different from politeness (smiling, nodding, saying "interesting"). Then ask what concerns or blockers come to mind. Their concerns are gold. The polite enthusiasm is not.

The final 5 minutes are pricing, willingness to pay, and wrap-up. "What budget category would this come out of? Who would write the check? At what price would this feel like a no-brainer? At what price would you start questioning the value? Who else should I talk to about this problem?" That last question is critical: it both validates that they take the conversation seriously and snowballs your sample.

Stage-specific question packages

Different stages need different question emphasis.

In problem discovery (pre-solution), spend almost the entire interview in problem space and workarounds. Skip pricing entirely. Skip showing your solution. The point is to learn whether the problem is real, frequent, and painful enough that someone is already paying to solve it.

In solution validation, allocate roughly 60 percent of the interview to problem and workarounds, 30 percent to solution reaction, and 10 percent to pricing. The solution reaction questions should follow the problem framing tightly: "When I describe a tool that does X, what is your first reaction? What concerns come to mind? Who else would need to approve?"

In pre-launch pricing, lean heavily into pricing and willingness-to-pay questions, anchored by their current spend. The Van Westendorp four-question method works well here: at what price would this be a bargain, at what price would you start thinking it is getting expensive, at what price would it be too expensive to consider, at what price would you doubt the quality?
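
As a rough illustration of how the four Van Westendorp answers get analyzed, here is a minimal Python sketch with hypothetical numbers. It estimates the Optimal Price Point, the price at which the share of respondents who find it suspiciously cheap equals the share who find it too expensive. All response data and dollar figures are invented for the example.

```python
# Hypothetical answers from five interviewees, in dollars, in the order the
# questions appear above: (bargain, getting_expensive, too_expensive, too_cheap).
responses = [
    (20, 50, 90, 10),
    (25, 60, 100, 15),
    (15, 40, 80, 10),
    (30, 70, 120, 20),
    (20, 55, 95, 12),
]

n = len(responses)
candidate_prices = sorted({p for r in responses for p in r})

def share_too_cheap(price):
    """Share of respondents who would doubt the quality at this price or below."""
    return sum(r[3] >= price for r in responses) / n

def share_too_expensive(price):
    """Share of respondents who find this price too expensive to consider."""
    return sum(r[2] <= price for r in responses) / n

# Optimal Price Point: where the "too cheap" and "too expensive" curves cross,
# so equal shares of the market are lost at either end of the range.
opp = min(candidate_prices,
          key=lambda p: abs(share_too_cheap(p) - share_too_expensive(p)))
print(f"Approximate optimal price point: ${opp}")
```

With five respondents the crossing is a plateau rather than a clean point; in practice you would plot all four cumulative curves, but the approximation is enough to sanity-check the range your interviews give you.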

In churn and retention, the structure flips. Most time is spent on the moment of decision to stop using the product, what was missing or broken, and what would have had to change. Avoid asking about future re-adoption intent. Focus on the past experience.

Recruiting users is half the work

The interviews only matter if you are talking to your actual ICP. Five recruiting channels that work in 2026, in order of typical conversion:

Existing network with warm intros has the highest conversion. Mine your past colleagues, customers, and connections. Ask for 30 minutes specifically, and explicitly note you are not selling anything.

LinkedIn outreach with a personal note targeting people who fit your ICP. Skip the generic templates. Reference something specific about their role or company. Conversion is 5 to 15 percent for a good note, almost zero for a bad one.

Communities and Slack groups where your target customers hang out. The rule: contribute first for at least two weeks before asking for interviews. Communities punish drive-by recruiting.

Twitter or LinkedIn posts asking for 30-minute conversations. Best for consumer products and developers. Tag relevant people, offer something in return (a summary, early access, a charitable donation).

Paid recruiting services like UserInterviews.com or Respondent.io. Typically $40 to $150 per session. Worth the cost when you need very specific personas and your network does not have them.

Always offer a small thank-you for non-customer interviews. A $25 Amazon gift card, a charity donation, or a copy of a relevant book is sufficient. For very senior or specialized respondents, $75 to $150 is appropriate.

How many interviews is enough

The honest answer: stop when you can predict what the next person will say. For early problem discovery, that is usually around 10 to 15 conversations in a tight segment. For solution validation, another 10 to 15 with similar profiles.

If the 12th person tells you something the first 11 did not, keep going. If everyone is repeating the same pain in the same words, you have signal and the marginal interview is no longer adding insight. Move to building.

For very specific niches (regulated industries, narrow personas), saturation can come at 6 to 8 interviews. For broad markets, you may need 25 to 30 to feel confident in the pattern.
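
The "stop when you can predict the next answer" heuristic can be made slightly more concrete by tagging each interview's insights and checking whether recent conversations still surface anything new. A minimal sketch, with hypothetical tag names:

```python
# Hypothetical insight tags logged after each interview, in the order conducted.
interviews = [
    {"manual-exports", "no-audit-trail"},
    {"manual-exports", "slow-approvals"},
    {"slow-approvals", "no-audit-trail"},
    {"manual-exports"},
    {"no-audit-trail", "slow-approvals"},
    {"manual-exports"},
]

def reached_saturation(tag_sets, window=3):
    """True when the last `window` interviews produced no tag that the
    earlier interviews had not already surfaced."""
    if len(tag_sets) <= window:
        return False  # not enough history to judge
    seen_before = set().union(*tag_sets[:-window])
    recent = set().union(*tag_sets[-window:])
    return not (recent - seen_before)

print(reached_saturation(interviews))  # True: the last three added nothing new
```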

The recording and notes process

Always record with explicit permission. Use Zoom, Riverside, or Otter. Auto-transcription makes the analysis dramatically faster.

Take light notes during the interview to capture follow-up threads and emotional moments. Do not try to write a verbatim transcript while listening; you will miss the conversation. Write up real notes within two hours of finishing, while it is fresh.

Tag insights with three labels: problem signal, workaround signal, willingness-to-pay signal. After every 5 interviews, scan across notes and look for repeated phrases. Repeated language is gold for marketing copy, sales decks, and pitch material later. The exact words customers use to describe the problem are almost always better than the words you come up with.
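
The scan for repeated phrases can be partially automated. Here is a small Python sketch that counts recurring word pairs across note snippets; the snippets below are invented for the example, and phrases that surface in several interviews are candidates for your copy:

```python
from collections import Counter
import re

# Hypothetical snippets pulled from interview notes.
notes = [
    "I waste half a day every week chasing status updates",
    "chasing status updates across three tools is the worst part",
    "honestly the spreadsheet works until someone forgets to update it",
    "I end up chasing status updates in Slack every Friday",
]

def bigrams(text):
    """Yield consecutive lowercase word pairs from a snippet."""
    words = re.findall(r"[a-z']+", text.lower())
    return zip(words, words[1:])

counts = Counter(pair for note in notes for pair in bigrams(note))

# Word pairs that recur across interviews are the language worth stealing.
for (a, b), c in counts.most_common(5):
    if c > 1:
        print(f'"{a} {b}" appears {c} times')
```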

The traps to avoid

Three predictable mistakes that ruin most founder interviews.

The pitch trap: spending the first 10 minutes describing your product and asking what they think. They will be polite. You will hear nothing. Save the product description for the last 10 minutes, after you have built a picture of their problem space.

The feature-list trap: asking "would you want feature A, B, C, or D?" They will say yes to all of them because saying yes is easy and free. Ask instead what their biggest frustration is and let them rank for themselves.

The leading-question trap: phrasing questions to confirm your hypothesis. "Don't you think it would be amazing if..." gets you the answer you wanted, not the answer that is true. Strip every adjective out of your questions. Just ask "tell me about the last time."

Where to start this week

Open the user interview question generator. Pick your stage and audience, get a script in 30 seconds. Use it for your next 10 conversations.

If you are about to start interviewing for an MVP, pair it with the product-market fit score to track your validation progress, and the MVP feature prioritizer to translate what you hear into actual scope.

The cheapest leverage before product-market fit is talking to 20 real users in the right way. Most founders skip it or do it badly. The ones who do it well ship features that get used and avoid the year of building things nobody wanted.
