Recently a friend asked me to help him gather some customer input for a new store that he’s building. I suggested that he put together a short (10 question) survey to find out how his target demographic reacts to some of the branding ideas that he’s thrown around. I’ve used this tool successfully with larger corporate projects but the results are useful regardless of the size of the business. After taking a look at his initial question set I thought that I’d write up a quick case study that could act as a “how to” based on my experience and general knowledge of behavior in UX.
Luckily, I had just started doing my own research on a psychology-related app idea. I sent out a survey to gauge whether that idea has an audience and what that audience might expect from the application.
SurveyMonkey is my go-to tool for this initial round of data collection. I’m trying to get a sense of whether the idea is worth more effort, so I don’t want to invest much time or money at this stage. A free account gives me up to 10 questions and 100 participants per survey. That’s a good amount of feedback, and the cap also forces me to break questions into blocks of importance. If the initial 10 questions suggest the project is worth moving forward, I can assemble another 10-question set with a higher level of fidelity.
The act of putting the survey together is itself a planning tool. If I can come up with clear questions, then the answers to those questions can define clear product goals. If I cannot come up with clear questions, then I either need some assistance or perhaps there’s no need for the product beyond my desire to make it. This leads to the first issue most survey makers face: what constitutes “good survey questions”?
1. Get the basics first
I usually put the basic demographic questions right at the front. Age, gender, income, education, etc. are all data points that we’ll need sooner rather than later. These also serve as “warm up” questions for people filling out the survey. Participants build confidence, comfort, and a feeling of investment the further they get into the survey, so a few “no brainer” questions at the start pull folks right in.
2. Balance precision and ambiguity
Remember that the purpose of the survey is to collect data relevant to a theory, not to prove a theory. I don’t want questions so specific that they begin to draw conclusions for the audience. At the same time, I don’t want questions so open-ended that people have trouble answering them accurately within a single sentence or selection.
3. Avoid asking questions that you already know the answer to
Questions like “Do you enjoy having fun?” don’t really do much for me. In fact, they usually make me look silly in the eyes of the participant and can cause people to bail out before finishing the survey. I ask myself whether each question can stand on its own.
4. Keep multiple choice selections short and/or entertaining
The faster people understand what they are reading, the more accurately they will answer. Say there’s a question with four possible answers listed underneath. If choice one is too long, the participant won’t make it to choice two. Some people literally select the first item that causes them fatigue in order to relieve the mental discomfort. I’m better off coming up with concise answer choices, but if they have to be long, I’ll try to make them enjoyable by inserting some humor where appropriate.
5. Save the best for last
Ultimately I want to know if this person is interested in my product or idea. Remember that people are not the most logical animals and will often answer a question based on how they feel in the moment instead of thinking it through. My initial questions should warm participants up a bit by making them think about the subject. This gets them into a mindful state before I drop the bomb at the end.