Crowdsourcing, a concept that gained traction in the mid-2000s, has become a cornerstone of open innovation. It enables companies to extend their innovation capabilities by tapping into the collective intelligence of the public. In short, crowdsourcing is about “sourcing from the crowd.”
But recent research shows that not all crowdsourcing is created equal. To be effective, the type of crowdsourcing must match the nature of the problem. This is the central argument of the study by Gurca, Bagherzadeh, and Velayati (Technovation, 2022), titled “Aligning the Crowdsourcing Type with the Problem Attributes to Improve Solution Search Efficacy” (DOI: https://doi.org/10.1016/j.technovation.2022.102613).
Understanding Crowdsourcing: The Fundamental Questions
To properly design a crowdsourcing initiative, companies must answer two key questions:
- What are we sourcing? → Typically: ideas, solutions, feedback, or even simple insights.
- From whom are we sourcing? → From a large, diverse group of people. Without the crowd, there is no crowdsourcing.
Real-World Examples of Crowdsourcing
Here are several modern use cases demonstrating crowdsourcing in action:
1. Product Innovation
Example: LEGO Ideas
The LEGO Ideas platform lets fans propose new product designs. Projects with 10,000 votes are reviewed by LEGO and can be turned into official sets.
2. Advertising Campaigns
Example: Doritos – “Crash the Super Bowl”
Doritos launched an open contest asking fans to submit commercials for the Super Bowl. The best ads aired during the event, with prizes reaching $1 million.
3. Strategic Communication
Example: Mozilla Builders
Mozilla’s Builders Program encouraged technologists to propose tools for a healthier internet. The initiative leveraged collective input to reimagine web-based communication.
4. Brand Reputation & Social Impact
Example: Patagonia Action Works
Through Patagonia Action Works, the company connects its customers with local environmental NGOs, strengthening its eco-conscious brand image.
Study Spotlight: Matching Crowdsourcing Type with Problem Attributes
The study by Gurca et al. (2022) analyzes why the classic crowdsourcing approach—labeled “fishing”—often fails, and introduces two alternative models: collective production and hunting.
“Fishing” Crowdsourcing – The Classic Approach
Fishing-type crowdsourcing broadcasts a problem widely and waits for voluntary contributions. Though widely used (63% of firms in the U.S. rely on this model), it has serious limitations:
- Poor fit for broad or complex problems (e.g., climate change, pandemics)
- Too slow for urgent situations
- Ineffective for highly technical problems that require expert knowledge
Case Example: BP Deepwater Horizon Oil Spill
In 2010, BP received over 43,000 public submissions for solving the oil spill. The result? Few usable ideas and a massive screening burden that slowed the response.
“Collective Production” – Co-Creating Complex Solutions
This model encourages knowledge exchange between participants, making it ideal for multi-faceted or undefined problems.
Case Example: Landcare Research (New Zealand)
Landcare launched a 10-day online challenge for pest control. Citizens contributed diverse insights, resulting in a wide range of solutions—from drones to biological methods.
“Hunting” – Proactively Seeking Experts
Instead of waiting, hunting involves actively identifying and contacting experts, using data mining tools and academic/patent databases.
Case Example: Alstom’s Leaf-Fall Problem on Train Tracks
Alstom worked with an open innovation intermediary to identify scientists capable of solving traction loss issues caused by fallen leaves. Two university research teams delivered viable solutions.
How to Choose the Right Type of Crowdsourcing
The study proposes a decision framework based on three key attributes:
| Problem Attribute | Recommended Crowdsourcing Type |
|---|---|
| Broad/Complex | Collective Production |
| Urgent | Hunting |
| Highly Technical | Hunting |
| Simple & Low-Tech | Fishing |
Best practice: Use crowdsourcing sequentially. Start with collective production to define the problem, then apply hunting or fishing for implementation.
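The selection logic in the table above can be sketched as a simple decision function. This is only an illustrative encoding of the framework, not code from the study itself; the attribute names and the priority order (complexity first, then urgency or technicality) are assumptions made for this sketch.

```python
def recommend_crowdsourcing_type(broad_or_complex: bool,
                                 urgent: bool,
                                 highly_technical: bool) -> str:
    """Map problem attributes to a recommended crowdsourcing type,
    following the decision framework summarized in the table above."""
    if broad_or_complex:
        # Multi-faceted or ill-defined problems benefit from
        # participants building on each other's contributions.
        return "collective production"
    if urgent or highly_technical:
        # Time pressure or deep expertise requirements call for
        # proactively identifying and contacting specific experts.
        return "hunting"
    # Simple, low-tech problems can be broadcast to the open crowd.
    return "fishing"

# Example: an urgent but well-defined technical problem
print(recommend_crowdsourcing_type(False, True, True))  # hunting
```

In practice the attributes are judgment calls rather than booleans, and the sequential best practice above means the recommendation may change as collective production clarifies the problem.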
The Strategic Benefits of Crowdsourcing
- Cost-effective: Prize incentives can be tailored and kept under control.
- Access to diversity: Thousands of contributors yield unexpected insights.
- Stronger engagement: Crowdsourcing fosters emotional investment in the brand.
- Visibility boost: Campaigns often gain traction across social media and press.
Conclusion
Crowdsourcing is not a one-size-fits-all solution. To unlock its full potential, the type of crowdsourcing must align with the problem attributes. Thanks to the work of Gurca et al., companies can now make more informed decisions and avoid the pitfalls of mismatched strategies.