Surveys are everywhere. Millions go out every day to people responding as customers, employees, citizens, members, attendees, and in every other capacity. Seismic decisions rest on survey data; million-dollar sums hang in the balance. Yet while enormous effort goes into driving up response rates, far less attention is paid to getting higher-quality data.
What are the obstacles to improving the quality of survey data? And how can UX help to solve these problems?
Don’t forget the respondents
Perhaps the most important element in running a successful survey is also the most overlooked: the cognitive stress load the survey places on respondents.
While much is said about proper methods of sampling, delivery, and follow-up, minimizing respondents’ cognitive stress with well-written, well-designed surveys is in fact one of the most effective ways to improve data quality and raise response rates.
Cognitive stress refers to the level of internal apprehension and anxiety in users of a system (or in this case, respondents to a survey). High cognitive stress among survey respondents undermines the data by clouding it with inaccurate responses made in confusion and error. If that stress load gets too high, respondents are likely to drop out and leave the researcher with no data at all.
So what survey design mistakes are responsible for creating cognitive stress? While they manifest in a variety of ways, these mistakes generally fall into one of three broad categories:
1. Fatigue: Respondents become mentally exhausted due to length, repetitiveness, or overwhelming complexity.
2. Answerability: Respondents are unable to provide accurate responses due to insufficient knowledge or recall, unsatisfactory multiple-choice options, or an unwillingness to give an honest answer.
3. Clarity: Respondents misinterpret or fail to understand the questions, or are misled by poor word choice and question phrasing.
All three kinds of mistakes make the survey a more frustrating experience for the respondent and lead directly to lower-quality data riddled with inaccuracies. That’s where user experience thinking comes in handy.
Injecting user-centric thinking
Typically, UX solutions are applied to website and app design, but a survey is just as much a product, with a user experience that can be optimized.
Usability testing, so prominent in web and app design, lets designers see how real people interact with their product and where they become confused, frustrated, or tempted to give up. Applying the same research methods to survey design gives the researcher real insight into design flaws before the survey ever goes out, so the spots generating the most cognitive stress can be targeted for rethinking and rewriting.
Crowdsourced user testing makes this kind of research easy and scalable, so the solution is readily applied. A parallel development in the UX field, the rise of quantitative usability metrics, offers an even more thorough and systematic way to understand cognitive stress in survey design.
The Survey Respondent Score (SRS)
The Survey Respondent Score has been jointly developed by QuestionPro, Trymata, and MeasuringU to measure cognitive stress levels in survey respondents and paint a multi-dimensional portrait of design flaws relating to fatigue, answerability, and clarity.
Based loosely on widely used and respected metrics like the System Usability Scale (SUS), the SRS is a long-awaited key to understanding the survey field’s most overlooked issue.
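For context on what such a quantitative usability metric looks like in practice, here is a minimal sketch of how the System Usability Scale is conventionally scored. The SRS’s own items and formula aren’t detailed here, so this illustrates only the well-known SUS calculation that the SRS is loosely based on: ten 5-point Likert items with alternating positive and negative wording, normalized to a 0–100 scale.

```python
# A minimal sketch of conventional System Usability Scale (SUS) scoring,
# for context on the kind of metric the SRS draws on. The SRS's own items
# and formula are not described here, so nothing in this sketch should be
# read as the actual SRS calculation.

def sus_score(responses: list[int]) -> float:
    """Score one respondent's answers to the ten SUS items.

    Ratings run from 1 (strongly disagree) to 5 (strongly agree).
    Odd-numbered items are positively worded; even-numbered items
    are negatively worded, so their scale is reversed.
    """
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten ratings between 1 and 5")

    total = 0
    for item, rating in enumerate(responses, start=1):
        # Positive items contribute (rating - 1); negative items (5 - rating).
        total += (rating - 1) if item % 2 == 1 else (5 - rating)

    # The 0-40 raw sum is multiplied by 2.5 to give the familiar 0-100 scale.
    return total * 2.5


# Example: a moderately satisfied respondent scores 70.0.
print(sus_score([4, 2, 4, 2, 4, 3, 4, 2, 4, 3]))
```

A per-respondent score computed this way could then be averaged across a pilot panel, giving a single number for comparing survey drafts before launch.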
To learn more about the Survey Respondent Score and the integration of our usability testing tools into QuestionPro’s survey design process, watch the full SRS webinar recording with QuestionPro and Jeff Sauro of MeasuringU.