Experts or the crowd? It’s a debate that vexes numerous and diverse areas of thought, from sociology and psychology to government (authoritarianism or democracy?), economics (central planning or free markets?), information dissemination (Encyclopedia Britannica or Wikipedia?) and more.
So when it comes to UX, who can tell you more – the experts, or the crowd? The answer may not be as clear-cut as you think.
The wisdom of crowds
In 2004, James Surowiecki gave a name to the truth and accuracy of the aggregated many: “the wisdom of crowds.” It’s the idea, basically, that the collected knowledge of a large number of people tends to be remarkably correct.
The apple of this particular strain of thought fell on the head of a British scientist named Francis Galton, a perfectly stuffy elitist certain that proper breeding and the concentration of power in the hands of a suitable few were the key to a successful society. Observing a contest to guess the weight of a well-fattened ox at a country fair, Galton was inspired to run statistical tests on the participants’ responses, and discovered, to his surprise, that the average of all 787 responses deviated from the ox’s true weight by a single pound.
Surowiecki recounts Galton’s full story in The Wisdom of Crowds.
The wisdom of crowds lies in the great diversity of independent opinion: as overestimation, underestimation, opposition, endorsement, half-truths, and whole truths are averaged together, the voice of the crowd converges on correctness.
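The convergence described above is easy to demonstrate. Below is a minimal, purely illustrative simulation (the numbers are assumptions, not Galton’s actual data): many independent guesses, each individually noisy, average out to something very close to the truth.

```python
import random

# Illustrative simulation: 787 independent guesses of an ox's weight,
# each off by a random error, averaged together. The "true weight" here
# is an assumption chosen for the example.
random.seed(42)

TRUE_WEIGHT = 1198  # pounds (illustrative)
guesses = [TRUE_WEIGHT + random.gauss(0, 100) for _ in range(787)]

crowd_estimate = sum(guesses) / len(guesses)
error = abs(crowd_estimate - TRUE_WEIGHT)
print(f"crowd estimate: {crowd_estimate:.0f} lb (off by {error:.1f} lb)")
```

Even though individual guesses are routinely off by 100 pounds or more, the average lands within a few pounds: the overestimates and underestimates cancel, just as the paragraph above describes.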
Take Wikipedia, for example. The free-to-read, free-to-edit online encyclopedia has built a massive catalog of articles contributed piecemeal by millions of users. While the site has its share of detractors, studies by Nature, the Journal of Clinical Oncology, and others have found the source to have a level of reliability on par with Encyclopedia Britannica.
In other words, vast, anonymous crowds have compiled a thorough and reliable encyclopedia just about as well as a certified group of experts. And when it comes to breadth of topics covered, the free encyclopedia far outstrips its less agile rivals.
How does the wisdom of crowds apply to UX?
Remote usability testing is not so different from a guess-the-weight-of-the-ox contest. Sure, the participants aren’t actually competing, but each of them, with their varying knowledge, experience, and skill levels, contributes a new point of view that leads us closer to an accurate and precise evaluation of the subject at hand.
But are they better than experts? At some things, they certainly can be (after all, none of the cattle experts guessed within a pound of the prize ox’s true weight). Aldo Matteucci has this to say:
“Why are experts not that smart? Because experts tend to be and think alike, and thus do not reflect maximum diversity of opinions.”
That’s not to say that experts don’t have anything to offer; on the contrary, the wealth of deeper understanding, analytical thinking, and problem resolution skills that a UX expert brings to the table are great tools.
But they, too, are human, subject to their own personal biases and the biases of their field, caught in the bubble of their own minds. No individual, no matter their expertise, can match the crowd for breadth of perspective. There are too many angles for one person’s opinion to capture them all; aggregation will always produce a more complete picture.
The next step
How can we maximize what we learn from the crowd? TryMyUI will soon be introducing a brand new feature called UXCrowd, a system aimed at identifying and prioritizing usability stress points by harnessing the wisdom of crowds, and even using the crowd as a reservoir of innovative usability solutions.
Here’s how it works:
At the end of the test, each tester will be asked for 3 things they liked about the website, 3 things they didn’t like, and any other suggestions they would offer.
Then, they will be shown a compiled list of responses from everyone who has taken the same test, and have the ability to vote up or down on these responses based on whether they agree or disagree. They will also be able to comment on other responses if they desire.
Vote counts will not be visible to the testers, so as to avoid social influencing like groupthink or bandwagoning that undermines the wisdom of crowds. Test owners, of course, will see the complete list of answers ranked by total votes, essentially getting a prioritized to-do list for improving their website’s UX.
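The mechanics described in the three steps above can be sketched in a few lines. This is a hypothetical illustration, not TryMyUI’s actual implementation; the names and sample responses are invented. The key design point is preserved: vote tallies are recorded but only surfaced in the owner’s ranked view, never shown to testers.

```python
from collections import Counter

# Net vote tally per response id; testers never see these counts,
# which avoids the groupthink/bandwagon effects mentioned above.
votes = Counter()

def cast_vote(response_id: str, up: bool) -> None:
    """Record one tester's up or down vote on a response."""
    votes[response_id] += 1 if up else -1

def ranked_for_owner(responses: dict) -> list:
    """The test owner's view: response ids sorted by net votes, highest first."""
    return sorted(responses, key=lambda rid: votes[rid], reverse=True)

# Sample responses compiled from testers (illustrative).
responses = {
    "r1": "Checkout button hard to find",
    "r2": "Liked the clean homepage layout",
    "r3": "Search results load slowly",
}

cast_vote("r1", True)
cast_vote("r1", True)
cast_vote("r3", True)
cast_vote("r2", False)

for rid in ranked_for_owner(responses):
    print(votes[rid], responses[rid])
```

Sorting by net votes is what turns raw crowd feedback into the prioritized to-do list the test owner sees.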
That’s what listening to the crowd can achieve, and we’re excited to put it into action. So next time you decide to weigh your ‘ox’ and trim the fat, remember where wisdom lies.