How I Used Crowdsourcing The Wrong Way And What You Can Learn From It
We often hear the expression “wisdom of crowds,” and if you have read my articles, it will be apparent that I am an ardent fan of crowdsourcing. The premise of crowdsourcing is that aggregating information produced by a group results in decisions that are often better than those made by any single individual. However, to get better results, it is critical to use the right crowds in the right way.
I decided to use crowdsourcing to help develop the title for my book, which is being released next month. To better enable the group conversation, I first developed a large number of potential titles that I felt might be appropriate.
To provide some context, the book contains 40 counterintuitive and controversial strategies for making innovation a repeatable process in any organization.
One of the tips is titled “Hire People You Don’t Like.” Due to its seemingly counterintuitive perspective, the publisher thought this might make a good title. To test out their theory, they mocked up a cover design that was as provocative as the title itself (see the graphic). In large letters, they showcased the obvious viewpoint of “Fire People You Don’t Like,” but then crossed out the term “fire” and replaced it with the more surprising word-twist “hire.”
It was time to get input from the “crowd.” In this case, I turned to my 1,000 Facebook followers to solicit their opinions. I posted the above-mentioned cover along with my list containing a number of other potential titles and requested the feedback of my friends.
Despite the many options submitted for consideration, 95 percent of the people immediately gravitated toward “Hire People You Don’t Like,” quickly dismissing the rest.
In that moment, the title was determined. Or was it?
Upon further review, I noticed that the responding crowd was composed of long-time friends, fellow speakers, a few innovation experts and a broad range of other people.
Although the vast majority selected the “fire/hire” name, we determined that a title built on those specific words would appeal mainly to human resources professionals who focus on recruiting. The few innovation experts who responded duly noted that most companies looking to innovate would likely pass on this title. It would not appeal to my target audience: innovation experts. While provocative, it did not speak to their needs.
Had I asked a more specific and targeted crowd—innovation experts, book industry experts, book marketing experts—I might have gotten a very different answer. And perhaps a more useful one. However, at this point, that was not an option.
So we eliminated “Hire People You Don’t Like” from the list and went back to the crowd. Again, we asked them to vote for the titles that they liked best, but sadly there was little convergence. No one could agree on which title would work.
But based on the comments, we started to see an interesting pattern: there was convergence on which titles did NOT work. So instead of using the crowd to identify the winning title, we used it to eliminate the duds, a task at which it proved extremely effective.
This allowed us to reduce our long list to a much shorter one that could then be reviewed by a small yet select team of experts.
In the end, I enlisted the help of two individuals who had a solid understanding of book marketing, innovation and my objectives. Both independently agreed on one of the previously suggested titles: “Best Practices Are Stupid.” This still possessed the controversial edge we were seeking, but seemed to appeal more to my target market. The publisher agreed.
In this scenario, I had initially identified an inappropriate crowd for my needs. Although this particular group proved less effective at determining the best title, it was in fact quite helpful in eliminating the bad ones. This insight could lead to some very beneficial practices for businesses to consider, as many still succumb to crowdsourcing pitfalls similar to those I experienced.
When companies use internal voting systems, they are, in essence, asking a generic crowd for their opinions. Yes, employees may have some background on the organization, but these individuals often see only small slices of the big picture and may not be best at determining what will be most effective…