How to Conduct User Research for SaaS Product Design

I've built software for 25 years, and here's what I've learned: most user research is theater. Teams run focus groups, send surveys, and compile reports that no one reads. Then they build exactly what they planned to build anyway.

At Dazlab.digital, we've shipped niche SaaS products across industries — from interior design tools to HR tech. The products that succeed? They're the ones where we actually talked to users. Not in conference rooms with two-way mirrors, but in their actual work environments, watching them struggle with the problems we're trying to solve.

Let me show you how we do user research that actually shapes products, not just validates predetermined ideas.

Why Most SaaS User Research Methods Fail

Here's the dirty secret about user research in SaaS: most of it happens too late. Teams spend months building features based on assumptions, then do "research" to confirm they were right. When users say the feature doesn't solve their problem, teams tweak the UI instead of questioning the core assumption.

I've watched this pattern destroy products. A real estate software company I worked with spent six months building an "AI-powered property matching" feature. They did user research after launch. Turns out agents didn't want AI recommendations — they wanted a faster way to input property details. Six months and hundreds of thousands of dollars wasted because no one asked the right questions early.

The other failure mode? Research paralysis. Teams get so caught up in methodology — should we do interviews or surveys? How many users constitute a valid sample? — that they never actually talk to anyone. Meanwhile, competitors who just picked up the phone and called five customers are already shipping solutions.

Start With the Work, Not the Worker

When we built TidyTime for interior designers, I didn't start by asking them what features they wanted. I sat in their offices and watched them work. One designer had three monitors: one for her project management software, one for invoicing, and one for client emails. She was copying project details between all three systems, spending two hours a day just keeping everything synchronized.

That observation shaped our entire product strategy. We didn't build another project management tool with better features. We built integrations that eliminated the copying and pasting. The insight came from watching the work, not asking about it.

This approach — contextual observation — beats surveys every time. People can't articulate problems they've gotten used to. That designer never would have said "I need better integration between my tools." She'd accepted the manual work as part of her job. But watching her do it revealed the real opportunity.

Here's how we structure these observations: spend a full day with someone doing their actual job. Don't interrupt. Don't suggest improvements. Just watch and take notes. Pay special attention to workarounds — the Excel sheets they keep because the software doesn't do what they need, the sticky notes covering their monitor with reminders the system should handle.

The 10-Minute Rule for Getting Real Feedback

Traditional user interviews fail because they're too formal. Put someone in a conference room with a recorder running, and they'll tell you what they think you want to hear. They'll be polite about your terrible ideas and enthusiastic about features they'll never use.

We use what I call the 10-minute rule. When we need feedback on a specific feature or workflow, we don't schedule hour-long sessions. We call users and say, "Got 10 minutes to look at something?" The informality changes everything. People give you their actual reaction, not their prepared thoughts.

During these calls, we share our screen and show them exactly one thing. Not a full demo. Not a slide deck. One workflow, one screen, one specific problem we're solving. Then we ask: "Would this actually help you?" The constraint forces focus. In 10 minutes, there's no time for politeness or feature requests that don't matter.

I learned this building HR tech. We had a complex applicant tracking system with dozens of features. When we did traditional hour-long demos, recruiters said everything looked great. When we switched to 10-minute calls focused on single workflows, we got the truth: most of our features were overcomplicated for their actual needs.

The best user feedback comes when people don't realize they're giving feedback. They think they're just having a conversation about their work.

Build to Learn, Not to Launch

Here's where most teams get user research for SaaS wrong: they think it ends when development starts. At Dazlab.digital, our most valuable research happens after we ship something.

We practice what I call "learning launches" — shipping minimal versions not to get users, but to get reactions. When we were exploring real estate association software, we didn't spend months building a full platform. We built a simple member directory tool and gave it to three associations for free. The usage patterns taught us more than any interview could have.

One association barely touched the directory but used our basic event registration feature constantly. Another ignored events but had members updating their profiles daily. Same software, completely different needs. If we'd just done interviews, both would have said they needed "member management." Watching actual usage revealed the nuances.

This approach requires killing your ego. Most of what you build in these learning launches will be wrong. We've thrown away entire codebases after learning launches revealed we'd misunderstood the problem. But throwing away two months of work beats launching a product no one needs after a year of development.

The key is instrumenting everything. Every click, every feature ignored, every workflow abandoned halfway through — it's all data. We use simple analytics tools, nothing fancy. The goal isn't to track vanity metrics but to understand where users get stuck.
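Instrumentation at this stage doesn't need a heavy analytics stack. Here's a minimal sketch of the idea — the event names and in-memory list are illustrative, not our actual tooling; in production you'd append to a log file or an analytics endpoint:

```python
import time

# Illustrative in-memory event store; a real setup would write to a
# log file or send events to an analytics service instead.
EVENTS = []

def track(user_id, event, **props):
    """Record one usage event. Answering "where do users get stuck?"
    only needs a user, an event name, and a timestamp."""
    EVENTS.append({"ts": time.time(), "user": user_id, "event": event, **props})

# Instrument the moments that matter, not vanity metrics:
track("u42", "invoice_started")
track("u42", "invoice_abandoned", step="line_items")
```

The point is the shape of the data, not the tool: once every start, finish, and abandonment is an event, "where do users get stuck?" becomes a query instead of a guess.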

Skip the Surveys, Pick Up the Phone

Surveys are the empty calories of user research. They make you feel like you're learning something, but the insights are usually worthless. People don't fill them out honestly. The people who do fill them out aren't representative. And multiple-choice questions can't capture the complexity of real workflows.

When we need quantitative data, we look at usage patterns, not survey responses. How many times did someone start creating an invoice but not finish? That tells us more than asking "How satisfied are you with our invoicing feature?" on a scale of 1-10.
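That started-but-not-finished metric is trivial to compute once you have usage events. A sketch with a made-up event log (the event names are hypothetical):

```python
# Hypothetical event log: (user, event). "Started but never finished"
# tells you more than a 1-10 satisfaction score ever will.
events = [
    ("u1", "invoice_started"), ("u1", "invoice_finished"),
    ("u2", "invoice_started"),                      # abandoned
    ("u3", "invoice_started"), ("u3", "invoice_finished"),
    ("u4", "invoice_started"),                      # abandoned
]

def abandonment_rate(events, start, finish):
    """Fraction of users who started a workflow but never finished it."""
    started = {u for u, e in events if e == start}
    finished = {u for u, e in events if e == finish}
    return len(started - finished) / len(started) if started else 0.0

rate = abandonment_rate(events, "invoice_started", "invoice_finished")
print(f"{rate:.0%} of users who started an invoice never finished")  # 50%
```

A number like that points you at a specific broken workflow; a satisfaction score just tells you people are vaguely unhappy.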

For qualitative insights, nothing beats picking up the phone. Not scheduling a call. Not sending a calendar invite. Actually calling someone right now and asking one specific question about one specific problem. The spontaneity gets you better answers.

I call this guerrilla research. When we're designing a new feature, I'll call five users that morning and ask them about it. By lunch, we know if we're on the right track. No methodology. No research plan. Just conversations with people who actually use the product.

This only works if you've built relationships with your users. They need to trust that you're calling because you genuinely want their input, not because you're trying to upsell them. We maintain these relationships by actually acting on feedback. When someone suggests an improvement and we ship it two weeks later, they become a research partner for life.

The Brutal Truth About Feature Requests

Every SaaS product gets buried in feature requests. Users want integrations with every tool ever made. They want customization options for edge cases that affect three people. They want you to rebuild Salesforce inside your simple invoicing tool.

Most of these requests are noise. But buried in the noise are signals about real problems. The skill is distinguishing between the two. When someone asks for a feature, we don't ask "What would this feature do?" We ask "What are you trying to accomplish?"

An interior designer once asked us for "advanced reporting capabilities" in TidyTime. Instead of building a reporting module, we asked what reports she needed. Turns out she was manually creating a weekly summary for her accountant showing billable hours by project. We built a one-click export that created exactly that summary. Took two days instead of two months.

The pattern repeats: users ask for features when what they really want is to solve a specific problem. Your job in user research isn't to collect feature requests. It's to understand the problems behind them.

We maintain a document called "Problems, Not Features" where we translate every request into the underlying problem. "I need Zapier integration" becomes "I need to get invoice data into QuickBooks automatically." This translation reveals patterns. When five users ask for different features that solve the same underlying problem, we know where to focus.
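You can keep that translation as plain data and let a tally surface the patterns. A sketch with made-up requests (the entries are illustrative, not our actual document):

```python
from collections import Counter

# Each feature request paired with the underlying problem we heard
# when we asked "what are you trying to accomplish?"
requests = [
    ("Zapier integration",  "get invoice data into QuickBooks automatically"),
    ("CSV export",          "get invoice data into QuickBooks automatically"),
    ("API access",          "get invoice data into QuickBooks automatically"),
    ("advanced reporting",  "weekly billable-hours summary for accountant"),
]

# Different feature asks converging on one problem: that's the signal.
problem_counts = Counter(problem for _, problem in requests)
for problem, count in problem_counts.most_common():
    print(f"{count}x  {problem}")
```

Three users asking for three different features — Zapier, CSV export, API access — collapse into one problem worth solving once.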

When to Ignore Your Users Completely

This might sound contradictory after everything I've said, but sometimes you need to ignore user feedback entirely. Users are experts at their problems but terrible at imagining solutions. If we'd listened to interior designers, we would have built a faster version of their existing tools. Instead, we reimagined the entire workflow.

The key is knowing when to listen and when to lead. Listen religiously when users describe their problems, their workflows, their frustrations. Ignore them when they prescribe solutions. The famous quote about faster horses applies here — users would have asked us for better project management features when what they really needed was to eliminate project management busywork entirely.

We've found that breakthrough features come from deeply understanding user problems, then solving them in ways users never imagined. When we introduced real-time billing updates in our agency software, users initially resisted. They were used to batch invoicing at month end. Six months later, those same users said they couldn't imagine working the old way.

Innovation requires conviction. But conviction without user understanding is just arrogance. The balance? Understand problems obsessively, then trust your instincts on solutions.

Making User Research Actually Happen

Here's the thing about user research: everyone agrees it's important, but it's the first thing cut when deadlines loom. We've solved this by making research part of our development rhythm, not a separate phase.

Every two-week sprint includes user touchpoints. Not formal research sessions — just conversations. Our developers talk directly to users about the features they're building. Our designers watch people use their prototypes. This constant contact keeps us grounded in reality.

We also practice what I call "research sprints" — one day every month where the entire team does nothing but talk to users. No building. No meetings. Just conversations about how people actually use our software. These days generate more insights than months of formal research programs.

The logistics are simple: maintain a list of engaged users who are willing to talk. Respect their time with focused questions. Share what you learned with them — close the feedback loop. And most importantly, act on what you learn. Nothing kills a research culture faster than gathering insights and ignoring them.

The Path Forward

Good user research for SaaS isn't about following methodology. It's about staying curious about the people using your software. It's about watching them work, understanding their constraints, and building solutions to real problems — not imagined ones.

At Dazlab.digital, we've built successful vertical SaaS products by staying close to our users. Not through surveys or focus groups, but through constant conversation and observation. This approach has shaped everything from our HR tech to our content management systems.

The best time to start? Right now. Pick up the phone. Call one user. Ask them about the last time your software frustrated them. Then fix that thing. It's not complicated. It just requires doing it.

Want to see these principles in action? We're always working on new vertical SaaS products at Dazlab.digital, applying these exact research methods to solve niche problems. If you're struggling with user research for your own SaaS product — or need help building something that actually solves user problems — let's talk. We've been doing this for 25 years, and we're still learning something new from users every day.

Frequently Asked Questions

How many users should I interview for SaaS product research?

Forget sample sizes. Five focused conversations with actual users beat 100 survey responses. We've validated major features by talking to just three people who deeply understand the problem. Quality of insights matters more than quantity — one interior designer showing us their three-monitor workflow taught us more than months of formal research.

When should I start user research for a new SaaS feature?

Before you write a single line of code. Watch users struggle with their current tools first. We spend days observing workflows before even sketching solutions. The biggest waste is building features based on assumptions, then trying to validate them later. Start with the problem, not your solution.

What's the best way to get honest feedback from SaaS users?

Keep it informal and focused. Our 10-minute rule works because users don't have time to overthink their responses. Call them directly, show one specific thing, ask one specific question. Formal interview settings produce polite lies. Casual conversations reveal actual problems.

Should I build exactly what users ask for?

No. Users are experts at their problems but terrible at designing solutions. When an interior designer asks for "advanced reporting," dig deeper. What report do they actually need? We translate feature requests into problem statements, then solve the core issue — often in ways users never imagined.

How do I balance user feedback with product vision?

Listen obsessively to problems, but trust your instincts on solutions. We ignored initial resistance to real-time billing updates because we understood the deeper workflow benefits. Six months later, those same resistant users couldn't imagine working without it. Understand constraints deeply, then innovate boldly.
