There are two types of branching logic: answer-based branching and demographic branching. Answer-based branching presents questions to participants based on their responses to previous questions; demographic branching presents questions to participants based on whether they belong to a particular demographic group.
It is important to be mindful of where branching logic will be most useful. We generally recommend sticking to a common set of questions for all survey takers in order to find patterns that exist across all groups. But we also recognize that there will be reasons to branch a distinct set of questions to some groups and not others. Here are some things to consider when deciding to apply branching logic to your next survey.
Avoid these two pitfalls
The first pitfall is making assumptions about differences across demographic groups and directing questions on topics relevant to everyone to one group while excluding the others. In deciding that a set of questions applies only to a select group, what assumptions are you making? How might you be tapping into stereotypes that are not truly representative of the group?
A good example of this might be surveying Millennials, but not surveying other generational groups, on their learning and development needs in order to inform the design of a leadership development program. This can have several negative consequences.
- Inhibits your company from benefiting from the broad range of skills and capabilities across your organization, e.g. limiting the pipeline of future leaders to Millennials alone.
- Prevents you from delivering programs that are applicable to all groups. What might you be missing if you survey only one demographic group and then apply the findings to the broader organization?
- Encourages a divisive 'us vs. them' culture, where one demographic group is set up to gain something, for example the opportunity to develop as future leaders of the organization, while other groups are set up to lose out on this opportunity from the beginning. This undermines the collaborative environment required to get work done effectively.
- Demotivates excluded groups by sending a message that the company undervalues their needs and interests. In the above example, the company communicates that it will not invest in their professional development and careers. Since development opportunities motivate people and drive their commitment to stay with an organization, we would expect engagement to drop and turnover to rise among those excluded from these kinds of opportunities.
- Raises questions for, and may confuse, the included group. They may wonder, "What makes me different that I have to have my own survey? Why would I not benefit from training programs already in place?"
While these outcomes are not intentional, they can be easy to overlook in an effort to complete the survey design process and get it out the door.
In cases such as these, a better approach would be to ask all groups for feedback and use report filters to understand how responses vary from group to group. From there, it is good practice to invite face-to-face conversations to understand the individual stories behind the numbers.
The second pitfall is survey scope creep. This happens when your leaders and stakeholders become overzealous about adding questions, resulting in an unwieldy and unfocused survey. In addition to engagement, you may get requests to include questions on multiple topics, e.g. new benefits offerings, the effectiveness of various training programs, company values, performance, and so on.
Leaders wanting to add survey questions is a good sign of engagement and involvement, and should be encouraged. But when a survey attempts to cover too many topics, it becomes difficult to communicate its purpose and to align and prioritize efforts on a few key areas. Help channel your leaders' enthusiasm in the right direction by advising them to limit the survey to no more than roughly 12 topics and 60 questions. Alternatively, consider running a separate survey.
Good uses of branching logic
The best use of branching is to drive awareness, action and alignment at both the organizational level and the local level, for example business unit, location or department. Include a few additional questions for a given group, while keeping your overall survey to 60 questions or fewer to avoid survey scope creep.
An example of this might be customizing and directing questions to people in different business units on initiatives that teams are driving locally. This approach has the following benefits.
- Gives local units relevant, context-specific examples to guide efforts and build momentum on initiatives they have already set in motion.
- Encourages a culture of learning and active participation by connecting local initiatives with broader survey and feedback efforts.
- Gives people the opportunity to contribute to decisions that affect their day-to-day experience.
- Gives you insight into opportunities for expanding local efforts into organization-wide practices.
Another good use of branching is to follow up on demographic differences that emerged from your last survey. For example, if the results of your last survey revealed a significant difference in experiences between retail sales associates and management, where scores were highly unfavorable for the former and highly favorable for the latter, you may wish to apply branching logic in your next survey to gain deeper insight into the unfavorable group's challenges.
- Be mindful of the assumptions you may be making with branching logic. Be sure you have data backing up your assumptions before you target a demographic group for deeper insights.
- Keep your surveys focused by ensuring your overall survey length, i.e. common questions plus branched questions, does not exceed roughly 12 topics and 60 questions.
- Build momentum for local efforts by using branching logic to communicate and gather feedback on business unit, location or department-based initiatives.