What to ask your users after they sign up
Most founders skip the post-signup survey. Not because they don't care what users think -- they usually care quite a lot -- but because it feels like the wrong moment. The person just arrived. Asking them questions before they've touched anything seems pushy.
The instinct makes sense. Following it, though, means missing a feedback window that only exists for a few days.
Right after signup, a user has a clear picture in their head of what your product is going to do for them. That picture was built from your marketing, your website, what someone told them, whatever they read before they hit the button. It exists before experience has had a chance to revise it.
Within a few days, it's gone. They've used the product. They've formed opinions. What they hoped for has merged with what they actually got, and those two things become very hard to separate. You stop getting expectation and start getting assessment -- which is useful, but it's not the same thing, and there's no way to go back and collect the earlier version.
The questions that work best at this stage are about context rather than the product itself. They haven't really used it yet, so asking about the onboarding or specific features doesn't make sense. Asking what brought them here, what they're trying to solve, and what they've already tried -- those questions produce answers worth having.
"What are you hoping to achieve with [product name]?" is a reliable starting point, and the answers are often more revealing than expected. Users who signed up expecting something you don't actually do. A use case you assumed was niche turning up repeatedly. Homepage copy that's setting expectations the product can't meet -- which often explains a churn pattern you'd been blaming on something else.
"What were you using before, and what made you look for something different?" tells you who your actual competition is. Not just the obvious alternatives but the spreadsheets and manual workarounds people were maintaining because nothing quite fit. You won't find this in your analytics.
"What would make this an obvious win for you in the first month?" tends to surface a gap between what users are hoping for and what the product has been optimised around. Sometimes that gap is small and unimportant. Sometimes it explains why activation rates haven't moved in six months.
These questions don't all need to land at the same moment. A form in the onboarding flow captures the very first impression. A follow-up a few days in, once someone has had a chance to actually use the thing, catches what happened when expectation met reality. Together they give you a much clearer picture of who showed up and what the early experience actually looked like from their side.
Post-signup responses also compound in a way other feedback doesn't. Early on, you read them one by one and occasionally find something that genuinely shifts how you think about the product. As the numbers grow, patterns start appearing -- the same job titles, certain use cases clustering, the same phrases people reach for when they describe what they were trying to solve. A few hundred responses in, the picture of who your users actually are often looks noticeably different from who you assumed you were building for.
Reading manually works while the numbers are small. At some point you need something that can tell you what keeps coming up without you categorising everything by hand. That's what Qria does with the open text -- reads through all of it and pulls out the recurring themes, the phrases that keep appearing, the places where expectations and reality are consistently out of sync. You still read individual responses; some of the most useful things come from the outliers. But you're not sifting through several hundred answers looking for the pattern.
Most SaaS products have decent data on what users do once they're inside. The thing that's often missing is a clear picture of why they showed up in the first place. That turns out to matter quite a bit.