Boaty McBoatface: Lessons From Failed Crowdsourcing

In 2016, Britain’s Natural Environment Research Council (NERC) elected to crowdsource the name of its new $300 million, state-of-the-art research vessel. The public responded with 124,000 votes for “Boaty McBoatface.” As The Atlantic reported, “The people of the Internet had spoken emphatically, and they’d spoken like a five-year-old.”[i]

This episode highlighted an inherent downside of seeking the wisdom of the crowd. A similar outcome occurred when internet trolls, prompted by group chats on the website 4chan, voted a school for hearing-impaired children the winner of a VH1 contest to host a concert by pop icon Taylor Swift.[ii] Despite these well-known failures to harness the creative power of internet users, there have also been numerous successful campaigns: platforms that help launch new businesses (Kickstarter), continually crowdsource fashion designs (Threadless), and even a NASA project to make the International Space Station safer for astronauts.[iii]

If companies and organizations can ask the public for ideas without receiving ridicule in response, what explains the difference? How can companies like Kickstarter and Threadless thrive, while NERC was stuck explaining to its voters that the research vessel would in fact be named the RRS Sir David Attenborough, a name that placed fourth in its own contest, even though Boaty McBoatface had received three times as many votes as any other entry?[iv]

Fortunately, a number of Harvard Business Review articles have been written on the topic. Although they vary in their enthusiasm for crowdsourcing campaigns, a common thread of lessons can be extracted from them.

Vetting:

Although it is counterintuitive for an effort designed to garner the greatest breadth of contributions, some level of vetting of the participating community is always necessary. A simple request for feedback from anonymous respondents is likely to invite non-serious responses and enable the trolling campaigns of sites like 4chan. At the same time, intrusive screening will have the opposite effect, dissuading users from participating at all. Any successful crowdsourcing campaign must strike a delicate balance between outreach and vetting.[v]

Community Engagement:

Once a company or organization has reached out to its intended population for support or ideas, it must ensure that this is not a one-time interaction. Processes should be put in place for both the administrators of the discussion and the users themselves to reply to and interact with submissions. While sheer volume may make a thoughtful reply to every post or submission impractical, care should be taken to individualize feedback wherever possible, making participants feel engaged and valued by the organization.[vi]

Create and Align Incentives:

To encourage thoughtful and sincere participation in a crowdsourcing campaign, the designers should create an incentive system that rewards participants whose ideas are chosen.[vii] Doron Reuveni, writing for HBR, argues that these rewards fall into three categories of reward currency: money, reputation, and increased skills.[viii]

[i] https://www.theatlantic.com/international/archive/2016/04/boaty-mcboatface-britain-democracy/479088/

[ii] https://www.nytimes.com/2016/03/22/world/europe/boaty-mcboatface-what-you-get-when-you-let-the-internet-decide.html

[iii] https://hbr.org/2017/02/why-some-crowdsourcing-efforts-work-and-others-dont

[iv] https://www.theatlantic.com/international/archive/2016/05/boaty-mcboatface-parliament-lessons/482046/

[v] https://hbr.org/2013/08/setting-up-a-crowdsourcing-eff

[vi] https://hbr.org/2017/02/why-some-crowdsourcing-efforts-work-and-others-dont

[vii] https://hbr.org/2016/12/a-case-study-of-crowdsourcing-gone-wrong

[viii] https://hbr.org/2017/02/why-some-crowdsourcing-efforts-work-and-others-dont


Student comments on Boaty McBoatface: Lessons From Failed Crowdsourcing

  1. Interesting post, Jack! I’d never heard of this instance of failed crowdsourcing, but it brought to mind the idea of democracy (and perhaps the most recent election) as a failed crowdsourcing effort.

    It’s kind of a perverse thought experiment to treat democratic voting as subject to the same vetting, community engagement, and incentive-alignment rules as crowdsourced ideas (no one wants to advocate for actual “vetting” of voters in the US); however, one could argue that in a democracy, voters who are not properly informed are really not adding value to the process of electing an official (the equivalent of selecting the winning idea).
