User surveys – Best practices

Quant research method

About User Surveys

Surveys collect users' thoughts, experiences, and opinions; what they do not capture is behavioral data. Surveys are a quantitative method, but remember that adding open-ended questions can add a qualitative flavour to your data.
The tricky bit about surveys is that it's easy to create one that is useless and yields inaccurate data. Be careful and make sure the data you collect is accurate. Never use surveys as the only research method on your project – you will be missing out massively.
First of all, the data needs to be actionable – the exact information you need in order to improve a product, service, or process.
Additionally, the higher the response rate, the more data we have to work with.

What can you learn from a survey

  • Illuminate improvement opportunities
  • Evaluate user satisfaction
  • Better understand your audience (language, pain points, prioritize)
  • Gauge interest
  • Prioritize feature list
  • Learn how users describe your organization's value
  • Monitor overall satisfaction with your product/service

When to conduct a user survey

  • At the start of any new project
  • When you need to identify or prioritize service offerings
  • To determine satisfaction
  • When making decisions about messaging or user priorities

Elements of a bad survey

  • Too many questions
  • Complex answers that are hard to understand
  • Answer choices that don’t line up with how users think
  • Questions that require long complex answers
  • Long-winded, unclear wording (be brief, be clear)
  • Questions full of absolutes like “every,” “always,” “all” – e.g., “Do you always eat breakfast?”
  • Unbalanced answer scales – offering mostly negative or mostly positive options

A good survey

  • Ask one thing at a time
  • Use simple, familiar words – no jargon
  • Stay away from 1-10 ratings; instead use:
    • Binary answers
    • Ranking of specific features/factors
  • Avoid ambiguous meanings
  • Don’t ask leading questions
  • Include at least one open-ended question – and ask it up front, to learn the user’s language before it is tinted by reading your questions
  • The fewer questions the better; more than 10 questions drastically reduces participation and completion

Constructing a survey

  • Define a clear, attainable goal for the survey
  • List objectives – what you want to learn and what you will do with that information
  • Make users self-identify
  • Create one survey for multiple audiences
  • Segment your data
  • Analyze the results for each group
  • Compare and contrast
  • Make sure the questions are written in a neutral way, without giving away your expectations
  • Add answers like “not applicable” and “don’t use” – to avoid skipped questions, fake answers, or people quitting
  • If you do not plan to act on the data – don’t ask about it
  • Realistically estimate the time required to fill out your survey
  • Use simple, one-line directions, and keep them on the left side of the screen
  • Remember that the order of answers matters: first and last answers attract more attention, and people tend to choose the first answer that sounds like it might fit
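One common way to soften this order bias is to shuffle the answer options per respondent while keeping neutral escape options at the end. A minimal Python sketch (the function name and the pin-at-the-end convention are illustrative assumptions, not a feature of any particular survey tool):

```python
import random

def randomized_options(options, seed=None):
    """Return a shuffled copy of the answer options for one respondent,
    leaving the original list untouched. Neutral options such as
    "Not applicable" or "Don't use" stay pinned at the end."""
    pinned = [o for o in options if o.lower() in {"not applicable", "don't use"}]
    movable = [o for o in options if o not in pinned]
    random.Random(seed).shuffle(movable)  # seeded for reproducibility in tests
    return movable + pinned

print(randomized_options(["Price", "Ease of use", "Support", "Not applicable"], seed=7))
```

Most survey platforms offer answer randomization as a built-in setting; the point of the sketch is simply that the shuffle happens per respondent, not once for the whole survey.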

Test your survey

  • Draft questions and get feedback internally within the team
  • Ask for feedback and comments after every question
  • Revise your questions for clarity and usefulness
  • Test with one respondent from the target audience
  • Review the survey to make sure that the answers or the order of questions don’t skew the overall balance
  • Remember that some people will quit before finishing all the questions – put the important questions in the first part of the survey, and make sure the system you’re using captures and records unfinished surveys as well
  • Test again – this time with a small group from the target audience, again collecting comments on each page
  • Examine the output to make sure the gathered data is usable

Who to survey

Existing users

  • Learn how they use the product or process
  • What features are (in their opinion) most important
  • User satisfaction
  • What value people see in the product
  • The language people use when talking about your product

General audience

  • The purest data you can get – from people who are not familiar with your brand/product
  • Learn how people describe your value
  • Reactions to messaging
  • Identify market gap and opportunities

Targeted audience

  • Useful for specific industries
  • Learn how people describe your value
  • Landscape of products
  • Reaction to messaging
  • Identify market gap and opportunities
  • Longitudinal comparison – how your product/service looks amongst specialized competition

Incentive or not?

Do not use incentives. Giving people a ‘reward’ corrupts the data, because it changes the motivation behind the answers. People tend to rush through in order to get the reward.

Software to use

  • SurveyMonkey
  • Alchemer (formerly SurveyGizmo)

Both of these have free plans to test them out, as well as advanced analytics.

Expected response rate

Depending on the size of the audience, the standard is 10-15%.
Well-targeted, properly prepared surveys can reach a 25-70% response rate.
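These rates translate directly into how many invitations you need to send. A quick sketch of the arithmetic (the helper function is illustrative, not part of any survey tool):

```python
import math

def invitations_needed(target_responses, response_rate):
    """How many invitations to send to expect a given number of
    completed surveys, assuming the stated response rate holds."""
    if not 0 < response_rate <= 1:
        raise ValueError("response_rate must be in (0, 1]")
    return math.ceil(target_responses / response_rate)

# At the standard 10-15% rate, 100 usable responses require:
print(invitations_needed(100, 0.10))  # 1000 invitations
print(invitations_needed(100, 0.15))  # 667 invitations
```

In other words, decide how many responses you need for meaningful segments first, then size the invitation list accordingly.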

Best time to send surveys

Tuesday – first email: introduction and invitation.
Follow up on Thursday – first reminder.
Then the next Tuesday – second reminder.
11 am and 2 pm are the best times for sending.
For specific industries these might vary.
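The cadence above is easy to compute from the launch date. A small sketch using Python's standard `datetime` module (the function name and dict keys are assumptions for illustration):

```python
from datetime import date, timedelta

def survey_schedule(launch):
    """Given the Tuesday of the initial invitation, return the
    Thursday first reminder and the next-Tuesday second reminder,
    following the Tuesday/Thursday/Tuesday cadence."""
    if launch.weekday() != 1:  # Monday=0, Tuesday=1
        raise ValueError("launch the survey on a Tuesday")
    return {
        "invitation": launch,
        "first_reminder": launch + timedelta(days=2),   # Thursday
        "second_reminder": launch + timedelta(days=7),  # next Tuesday
    }

print(survey_schedule(date(2024, 6, 4)))  # 2024-06-04 is a Tuesday
```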

Response time

Allow 1-2 weeks to collect responses, although most engagement usually comes right after the first email.

Effective email title

  • Include “Invitation:” or “Reminder:”
  • Mention the estimated time to complete the survey (or the number of questions)
  • Include the topic of the survey

Sample titles

  • Invitation: Tell us how you feel in 2 minutes.
  • Invitation: Happier students & parents in 4 minutes.
  • Reminder: Maximize student potential – a 3-minute survey.
  • Reminder: 8 questions for parents of children with special needs.

Exit surveys

‘Have you found what you were looking for?’ is quite a popular one; however, the most important part is the follow-up question:
‘What were you looking for?’
We can learn a lot from that one.


Take your data with a grain of salt – remember that this method doesn’t represent the whole target audience. These are the opinions of respondents, and users’ opinions very often do not reflect their actions.

When presenting the results, use charts, graphs, and tables – make it visual, with a summary explaining the learnings.

Image file formats (PNG/JPG) work better than linked or embedded data (broken links, faulty connections).

About The Author

Bart Nowak

UX Designer with digital design, user interface, and print experience, as well as front-end development. Recently focusing on UX research, design systems, and the process of creating a product – from product idea, service design, and business models through user experience and UI design all the way to dev implementation.
