
Crowd Innovation: Hype or Help?

Strategies for Leveraging Crowds

Linus Dahlander and Henning Piezunka

Keywords

Crowdsourcing, Innovation, Distant Search


Crowds are not inherently wise
It has been over a decade since it became popular to involve large groups of people beyond corporate boundaries in the creation of ideas for products or services. From technical problems to sports equipment, lifestyle products, and financial and public services, organizations increasingly sought to tap the knowledge of the crowd. The rapid growth of online platforms and the emergence of diverse online communities provided an ideal resource from which to generate new product ideas and business solutions.
Crowdsourcing success stories abound, but so do stories of failure. Lego’s use of a crowd-based innovation strategy played a crucial role in reviving the struggling toy manufacturer. Netflix likewise used crowdsourcing to improve the efficacy of its recommendation engine by 10%, attracting over 44,000 submissions. Starbucks launched MyStarbucksIdea.com in 2008 to get ideas from consumers; the company has so far received more than 100,000 submissions from consumers around the world. By contrast, the crowdsourcing platform Quirky went bankrupt in 2015 because it didn’t adequately vet the market potential of the ideas it backed, financing too many bizarre products (Wi-Fi-enabled egg trays, anyone?) with no commercial appeal. Another tricky field is the public contest, in which an organization invites the public to suggest names, flavors or advertising ideas. The unpredictable dynamics of crowds can lead to “crowdsourcing fails”, as in the Boaty McBoatface case that received global media coverage in 2016: the United Kingdom’s Natural Environment Research Council (NERC) had invited the public to choose the name of its newest polar research vessel, never anticipating the awkward moniker that won the online poll.

Crowds are effective under the right set of conditions
Obviously, crowds can be effective, but they are not always so. Crowds, after all, are composed of human beings and can display the same unpredictable tendencies as the individuals who comprise them. Using crowds effectively requires the alignment of several factors: crowd composition, the right question at the right time, and the right analytic method applied to the responses. Crowd-based creativity can be seen as a natural resource: it takes specific skills to acquire it, harness it effectively and sustainably, and transform it into offerings that markets value. Just as oil companies don’t randomly drill holes and hope for the best, companies should not attempt crowdsourcing without deploying a solid framework from inception to completion. Based on a comprehensive review of the existing research, we devised a crowdsourcing framework for the successful involvement of crowds in the innovation process. It consists of four stages: Define, Broadcast, Attract and Select, the “DBAS” framework, and in each stage some key questions need to be addressed (see Figure 1).

[Figure 1: The DBAS framework: Define, Broadcast, Attract and Select, with the key questions to address at each stage]

Each decision along the DBAS pathway matters, and choices made at one stage can reinforce or undercut those made at the others. From the initial stage of task definition onwards, companies must coordinate their steps through the maze of decisions that crowdsourcing entails.
For example, properly setting up the Broadcast stage demands that the problem first be well defined, so that the crowd can supply the best solutions it is capable of. Moving along the project pipeline, the Attract stage requires knowing what will motivate this crowd to become active and creative, information that should be collected during the preceding Broadcast stage. And at the Select stage, the required resources will depend on the size and nature of the contribution pool cultivated at Broadcast and Attract.
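To make these dependencies concrete, the sketch below (Python, purely illustrative: the Stage class, the questions and the outputs are our hypothetical examples, not part of the framework itself) models each DBAS stage as a step whose outputs feed the stages downstream.

```python
# Illustrative sketch only: the DBAS stages as a pipeline in which each
# stage's output becomes input for the stages that follow it. All names,
# questions and outputs are hypothetical examples.
from dataclasses import dataclass, field


@dataclass
class Stage:
    name: str
    key_questions: list[str]
    # Information this stage produces for downstream stages.
    outputs: dict[str, str] = field(default_factory=dict)


def dbas_pipeline() -> list[Stage]:
    define = Stage("Define", ["What problem should the crowd solve?",
                              "How novel do contributions need to be?"])
    broadcast = Stage("Broadcast", ["How is the task framed, and where is it posted?"])
    attract = Stage("Attract", ["What will motivate this crowd to contribute?"])
    select = Stage("Select", ["Who evaluates submissions, and with what capacity?"])

    # Decisions cascade: a poorly defined task undermines Broadcast; what
    # Broadcast reveals about the crowd informs Attract; the pool built by
    # Broadcast and Attract determines the effort Select requires.
    broadcast.outputs["task_framing"] = f"derived from: {define.key_questions[0]}"
    attract.outputs["motivators"] = "learned from who responded at Broadcast"
    select.outputs["review_capacity"] = "sized to the pool from Broadcast and Attract"

    return [define, broadcast, attract, select]


if __name__ == "__main__":
    for stage in dbas_pipeline():
        print(stage.name, stage.key_questions, stage.outputs)
```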

Critical success factors for crowdsourcing projects
To analyze the success and failure of virtual engagement tools and crowdsourcing projects, we collaborated with a private company to create a massive dataset that allowed us to study over 100,000 suggestions submitted to nearly 1,000 organizations. From this analysis we developed a set of guidelines. Below we describe critical factors that require special attention when implementing the DBAS framework.

Assessing the Level of Innovativeness
Although crowdsourcing is associated with creativity and innovation, not all crowdsourcing campaigns require innovative and novel contributions. It can be enough to take the pulse of a customer community or ask customers to choose among a small number of familiar options. If a company seeks a high level of innovation from the crowd, it should design and broadcast the project to keep the number of submissions within a manageable range. When crowdsourcing campaigns trigger a flood of responses, the more unusual ones are likely to be ignored: evaluators who feel overwhelmed by the volume of submissions tend to prefer recognizable, eminently practical ideas over novel, groundbreaking suggestions.


Paying Attention to Activity
Across all our data, the amount of attention crowdsourcing campaigners give to their contributors strongly predicts the success of their initiatives. The correlation is salient both for reactive attention, e.g. feedback to contributors, and for proactive attention, e.g. priming the flow of contributions with ideas submitted by the organizations themselves. Organizations that respond publicly to submitted suggestions (reactive attention) receive significantly more suggestions from external contributors than those that do not. Campaigns received significantly more contributions and higher-quality ideas when organizers were consistently generous with both varieties of attention throughout the process. But such cases were few in our dataset, and especially in slow-starting campaigns, attention-giving tended to begin too late.

Dealing with Rejection
Launching a successful crowdsourcing campaign means arousing many hopes that are destined to be disappointed: over 90 percent of ideas from the crowd will not be used, yet most companies fail to notify contributors about the fate of their submissions. In our studies, contributors who received information about idea rejection were far more likely to participate in future crowdsourcing campaigns managed by the same organization. When organizers took the time to respond in language that stylistically resembled the contributor’s own communications, the likelihood of future engagement was even higher. We concluded that far from pushing people away, rejections bonded recipients even more tightly to the host organization.


How to manage crowdsourcing projects successfully
Based on these findings, a few concrete recommendations can improve the success of crowd-based innovation projects.

  • Select your crowd carefully
    Companies should be selective about who they invite to participate. If they seek truly novel solutions, it makes sense to build a few hurdles into the process to deter less committed contributors, thereby limiting the number of submissions and increasing the chance that groundbreaking ideas get enough attention.
     
  • Give to get: Share your own ideas
    Instead of waiting for ideas to arrive, successful organizations foster engagement by posting ideas themselves and inviting people to discuss them. This proactive attention shows external contributors the direction the organization wants to pursue; it also builds trust by sharing internal information. Further, it empowers external contributors to evaluate the organization’s own ideas, which stimulates knowledge sharing and strengthens motivation. Proactivity is key to jump-starting the flow of ideas at the outset, especially for campaigns that are slow to gain traction.
     
  • Show you care: Respond publicly to submissions
    First-time participants have no way to know whether the organization will notice their ideas. Feedback validates external contributors and motivates further contributions. It also indicates what types of suggestions the organization values, and helps the crowd understand what is appropriate. Newcomers especially value this form of reactive attention. If they learn through the program’s responses that the organization cares, participants become motivated to make full use of their fresh perspectives and share their ideas more openly.
  • Improve your practices vis-à-vis rejections
    Based on our findings, participants whose suggestions could not be implemented should not be neglected. In the interest of maintaining participation over the long run, it pays to inform participants about the fate of their submissions. Crowdsourcing initiatives should therefore be designed to protect a resource more valuable than any single innovative idea: the loyalty of crowd-project participants.

To continually improve the odds of success, crowdsourcing should be treated as an iterative process, like the rapid innovation practices for which Silicon Valley tech firms are famous. All crowd projects are different, but each one provides an opportunity to learn what works and what doesn’t. The DBAS framework is therefore best thought of as a cycle; each misstep or victory contains lessons for the current campaign, and for all campaigns to come.

Authors

Henning Piezunka, Professor of Entrepreneurship and Family Enterprise, INSEAD, France, henning.piezunka@insead.edu

Linus Dahlander, Professor of Strategy, ESMT Berlin, Germany, linus.dahlander@esmt.org

Further Reading

Dahlander, L., & Piezunka, H. (2014): “Open to suggestions: How organizations elicit suggestions through proactive and reactive attention”, Research Policy, Vol. 43 (5), 812-827.

Piezunka, H., & Dahlander, L. (2015): “Distant search, narrow attention: How crowding alters organizations’ filtering of suggestions in crowdsourcing”, Academy of Management Journal, Vol. 58 (3), 856-880.

Dahlander, L., Jeppesen, L. B., & Piezunka, H. (2019): “How organizations manage crowds: Define, broadcast, attract and select”, Research in the Sociology of Organizations, Vol. 64, 239-270.

https://knowledge.insead.edu/strategy/define-broadcast-attract-and-selec...
https://hbr.org/2017/02/why-some-crowdsourcing-efforts-work-and-others-dont