Not all crowdsourcing challenges are created equal
New information technologies have allowed companies to tap into the creative potential, distributed work patterns, and expansive knowledge of huge online crowds. In various business fields, crowds can solve certain problems faster, better, and cheaper than companies can in-house. Today, according to a trend report published by the platform provider eYeka in 2015, 84% of the world’s top companies – including SAP, Dell, Google, General Electric, Fiat, LEGO, and Procter & Gamble – have started to build their own crowdsourcing platforms. The crowdsourced tasks, however, are highly diverse, as are crowdsourcing platforms. For instance, the Fiat Mio platform, where contributors collaborated to develop a new concept car, is completely different from the GE Ecomagination Challenge, where contributors compete against each other. In the case of Fiat Mio, contributions were small, taking the form of sharing, commenting on, editing, or integrating ideas to further develop the car in a collaborative fashion. In contrast, GE’s Ecomagination Challenge does not require substantial collaboration among contributors. It facilitates an innovation contest in which each contribution reflects an independent and exhaustive solution to a specific crowdsourced task. Naturally, the different nature of these tasks demands different governance mechanisms. While collaboration is an important issue for Fiat Mio, mechanisms that permit the control and evaluation of a large number of alternative contributions are a key challenge for GE.
Different types of crowdsourcing platforms
Crowdsourcing platforms fall into four categories, distinguished by the diversity and aggregation of their contributions (see Figure 1). The main goal of microtasking crowdsourcing platforms is the scalable and time-efficient batch processing of highly repetitive tasks, e.g., categorizing data or writing and translating small chunks of text. Crowdsourcing platforms for information pooling aggregate contributions such as votes, opinions, assessments, and forecasts through approaches such as averaging, summation, or visualization. Broadcast search platforms collect contributions to a task in order to gain alternative insights and solutions from people outside the organization. They are particularly suited to solving challenging technical, analytical, scientific, or creative problems. Frequently, broadcast search is used to run different kinds of innovation, design, or data science contests. Finally, open collaboration platforms invite contributors to team up to jointly solve a complex problem whose solution requires the integration of the distributed knowledge and skills of many contributors. The individual contributions are aggregated such that one or more solutions to the underlying problem can emerge. In practice, however, pure forms of these archetypes are rare. Frequently, crowdsourcing platforms combine traits of several of them.