We spend a lot of time at Culture Amp (internally and with other people geeks) writing, editing and testing out survey questions. It is often said that writing good survey questions is both an art and a science and perhaps that is true. However, before we spend hours writing those perfect questions it can be useful to step back and think about some broader issues pertaining to our goals and how these might guide us in crafting our questions. Here are some of our thoughts on Survey Design.
1. Survey length
Most people will tire of answering your questions after about 10 minutes, and if you say a survey will take 10 minutes when it actually takes 20, they won't believe you next time (if they bother completing it at all). Additionally, the more questions you ask, the less time people will spend thinking through their responses (a phenomenon known as satisficing).
Implications: Although you might want answers to lots of questions to get the detail and coverage you think you need, longer surveys will decrease the quality of your data and your response rates. It won't matter how awesome your questions are if half your employees don't complete them or barely read them. Our data suggest that surveys taking longer than 10 minutes begin to show quickly rising abandonment rates. Because most people can answer around 50-60 rating-type questions in 10 minutes (if the questions are good and the format is kept simple), we suggest that as a good guide for the maximum number of questions. This will also help you work out how many questions you can ask on each topic you want to cover. For example, 10 topics/areas means about 5-6 questions for each one.
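The question budget above amounts to a quick back-of-envelope calculation. A minimal sketch (the per-minute rate is an assumption derived from the 50-60 questions in 10 minutes guideline, not a hard rule):

```python
# Rough question budget for a survey, assuming ~10 minutes of respondent
# attention and ~5.5 simple rating questions answered per minute.
minutes_available = 10
questions_per_minute = 5.5      # assumption: simple, consistent rating format

max_questions = int(minutes_available * questions_per_minute)   # 55 questions

topics = 10                     # hypothetical number of areas to cover
questions_per_topic = max_questions // topics                   # ~5 each
```

Adding a topic, or going deeper on one, means taking questions away from another; the budget makes that trade-off explicit.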
2. Survey UX
These days most employees have been exposed to hundreds of online applications designed to minimize cognitive load and be quickly 'usable'. We think this has made survey usability (UX) more important than ever (it has always been important). Good survey UX will improve response rates and allow more cognitive effort to be directed at the content of the survey rather than at figuring out what is going on or where to click.
Implications: Use a response format that doesn't require people to make too many irrelevant decisions. Research has shown that for rating scales, 5-7 response options are sufficient for good reliability and validity, and using fewer response options makes responding a lot easier and faster (especially on a phone). Other things that will slow people down and get them fidgety are switching response formats, making them rank too many options, and anything else that breaks the flow of their experience. We use a 5-point agreement scale as much as possible for this reason.
3. Survey outcomes and focus
Most surveys should have a primary focus or outcome in mind, even if it may be hard to pin down in some cases. Ask yourself what overall end point or outcome you want to tap into and are aiming to improve. You can then try to identify aspects or indications of it that your respondents will be able to see, feel or know about.
For example, in an employee engagement survey our primary outcome is the level of engagement employees feel with the company, and we use questions asking if they are motivated to do great work, would recommend the company, and intend to stick around for a while. The important thing is that you feel these are good overall indicators that things are going well. Importantly, these questions often tap into feelings, thoughts or behaviors that are hard or impossible to act on directly because they represent the outcomes of many other things. This does not matter, because we can use other questions and analytic tools to tell us what is most likely to impact them.
Implications: Try to clearly articulate what you think is the highest level outcome you are trying to get to or improve via your survey. Then try to come up with a range of statements that represent, or would be associated with, that purpose or goal being achieved. You can then assess which of these you think are best and which ones your respondents will likely be able to answer or provide feedback on. From there all the usual key rules apply:
- Statements should be fairly simple and avoid grammatical negatives as much as possible (e.g. not, do not, un-)
- While it should be simple it should also be believable (e.g. avoid using extreme language such as always, never, and best unless it is really appropriate)
- Try to avoid statements where people could often agree with one part but disagree with another aspect (e.g. I am happy and active at work)
- Try to ask things that people will know about (e.g. it's easier for someone to know if they feel something is 'fair' versus if it is 'accurate')
Once you've arrived at your 4-5 outcome questions, they can be grouped together into what is known as an index or factor, and together they will represent the key outcome of your survey. Factors provide greater score variation amongst respondents and tend to have better statistical properties than single questions.
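To make the factor idea concrete, here is a minimal sketch of scoring an index from its questions and checking its internal consistency with Cronbach's alpha, a standard reliability estimate for question groups like this. The response data are made up for illustration:

```python
import statistics

# Hypothetical responses to a 4-question engagement index on a 1-5
# agreement scale; each inner list is one respondent's answers.
responses = [
    [4, 5, 4, 5],
    [3, 3, 2, 3],
    [5, 4, 4, 4],
    [2, 2, 3, 2],
]

# Factor score per respondent: the mean of their answers to the
# questions in the index.
factor_scores = [statistics.mean(r) for r in responses]

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
# Values around 0.7 or higher are conventionally taken as acceptable.
k = len(responses[0])
item_vars = [statistics.pvariance(col) for col in zip(*responses)]
total_var = statistics.pvariance([sum(r) for r in responses])
alpha = k / (k - 1) * (1 - sum(item_vars) / total_var)
```

Because the four questions move together across respondents here, alpha comes out high; questions that don't hang together would drag it down and suggest they belong in a different factor.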
4. Survey coverage
After you've arrived at your outcome questions, it's time to make a list of all the different types of things you think might possibly impact your outcome. Start off with general categories. For employee engagement you might consider working conditions, leadership, recognition & reward, managerial practices, career opportunities, teamwork etc. From there you'll get a feeling for how many questions you will have room for in each section on average. You'll then also see that if you want to ask 10 questions on career development, for example, you'll have fewer questions you can add somewhere else. You'll also notice that as you get more specific, you'll need more questions to cover that area. Here is where you'll experience the tension between wanting detailed information and not wanting 100 questions either.
Implications: If you want to keep your survey to a set length, you will need to make some compromises. When you have to decide what to take out, focus on anything you think is not truly related to the outcome, or on things you won't realistically be able to change. Your aim should be coverage of every area that could be a candidate for impacting your outcome. The questions do not need to perfectly capture every aspect of each area; rather, they should be sufficient that if an area were a problem, the questions would pick this up in people's perceptions. You'll almost never be able to write questions that will tell you exactly what should be done.
5. Survey as conversation starter not conversation ender
If you accept the need for and advantages of a shorter survey, you'll quickly find that you have to sample each area with just a few questions to get the coverage you need. From there you'll find that your survey results will often direct you to the areas most strongly related to your outcomes (via driver analysis, for example). But just what to do about it, or why people responded the way they did, will usually require a discussion with people. Even with very specific questions this will still be the case. Imagine scoring low on a question such as: 'I receive adequate recognition for my daily code commits'. You've narrowed it down, but you'll still need to understand what sort of recognition is preferred, and you still won't know if it will impact your outcome.
Implications: Don't get hung up on writing hundreds of perfect, specifically actionable questions, and don't assume your results must tell you the precise and perfect thing to do to remedy a situation. This is impossible. Write questions that cover the major areas, and then discuss the results to home in on ideas for the specifics that may impact or improve perceptions in these areas. Your results can direct you to the most important areas to discuss and focus on, and help you avoid fixating on things that one or two louder voices might complain about; that is the true value. You can also follow up with a survey specifically focused on the areas identified, covering them in more detail. Don't try to do it all in one gigantic survey.
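In its simplest form, the driver analysis mentioned above just asks which areas' scores move together with the outcome index across respondents. A minimal sketch using Pearson correlation, with made-up scores and hypothetical area names:

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation between two equal-length score lists."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-respondent scores (1-5 scale): the engagement
# outcome index plus two candidate driver areas.
engagement  = [4.5, 2.8, 4.2, 2.2, 3.9]
recognition = [4.0, 2.5, 4.5, 2.0, 4.0]
parking     = [3.0, 4.0, 2.5, 3.5, 3.0]

# Rank candidate areas by how strongly they track the outcome.
drivers = {"recognition": recognition, "parking": parking}
ranked = sorted(drivers, key=lambda a: pearson(drivers[a], engagement),
                reverse=True)
```

Here recognition tracks engagement closely while parking does not, so the follow-up conversations should start with recognition. Real driver analysis typically uses regression over many areas at once, but the ranking idea is the same.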
To sum up: the best approach is to aim for good questions as part of an iterative process that involves follow-up discussions and ongoing data collection. This is central to all things agile, and surveys are no exception. The days of the behemoth annual survey are thankfully gone.