Crowdsourcing is fast emerging as a mainstream innovation channel for companies. It seems like the crowd has an answer to all sorts of innovation problems – they can come up with ideas for new toys and generate solutions to pressing scientific challenges. In theory, the crowd holds tremendous potential: A large, diverse group of people, consisting of experts and others from all over the world, should have fresh perspectives to bring about breakthrough insights on a given problem.
In practice, however, most crowdsourcing initiatives end up with an overwhelming number of useless ideas. Consider BP’s crowdsourcing initiative. When a 2010 explosion on the Deepwater Horizon rig caused the largest marine oil spill in history, a desperate BP turned to the public to find ways to clean it up. The company received around 123,000 ideas from more than 100 countries within just a few weeks. Sifting through these ideas was an enormous undertaking, and most of them turned out to be completely unusable; the initiative was later described as “a lot of effort for little result.”
Dealing with a full submission box is not only extremely time-consuming and costly, but it also biases how ideas are selected: When firms receive too many ideas, they tend to focus on those that are already familiar to them, defeating the entire purpose of crowdsourcing, which is to surface new thinking.
Why do many crowdsourced ideas turn out so bad, and what can firms do about it? My recent research finds that it comes down to understanding the motivations of crowd members.
The research drew on qualitative and quantitative data from InnoCentive, one of the largest global crowdsourcing platforms for innovation. I started by conducting a series of interviews with platform members and employees of the company, and by analyzing content created by members online (e.g., on the company’s blog, forum, and social media pages). I then surveyed 646 members who had recently submitted at least one solution, asking them about themselves (e.g., education level, expertise, etc.) and their motivations for submitting solutions. I matched survey responses with data the company collected on the quality of their solutions and on other aspects of each crowdsourcing challenge (e.g., duration, number of submissions, prize size).
I found that crowd members differ greatly in terms of why they participate. Some take part because they genuinely love creative problem-solving (what’s called “intrinsic motivation”). Others participate because they want to learn new things (“learning motivation”), make a positive impact on others (“prosocial motivation”), or be part of a social community (“social motivation”). Not surprisingly, some members focus predominantly on winning the prize money or other benefits such as recognition and better career prospects (“extrinsic motivation”).
The results also showed that these motivations have different effects on solution quality. Intrinsic and extrinsic motivations were associated with higher-quality solutions, whereas learning and prosocial motivations were negatively related to solution quality. Social motivation was not a significant predictor of the quality of ideas.
One explanation for these findings is that some motivations direct more attention and effort toward addressing constraints of the problem (like technical requirements and specified goals), and this leads to more valuable solutions. For example, intrinsic motivation may bring about a stronger focus on the problem and its details, as the problem itself is the main reason for engagement. Likewise, extrinsically motivated people may attend to the problem details because winning rewards often depends on meeting problem constraints. In contrast, focusing on learning, doing good, or being a part of a community might lead to solutions that are off the mark, as the focus is on other aspects aside from the main problem.
So to avoid a flood of useless ideas, firms should consider designing their crowdsourcing initiatives in a way that encourages participants with intrinsic and extrinsic motivations, while making prosocial and learning benefits less salient.
To promote intrinsic motivation, managers could, for example, highlight the joy of problem solving, provide positive feedback, and set the right level of constraints in their crowdsourcing initiatives. They could also offer various extrinsic rewards. A sufficiently large cash prize is important, but it also pays to think beyond money: Non-pecuniary extrinsic benefits, such as recognition and career advancement, are also important drivers of quality. Examples include featuring winning solutions and members on various outlets (e.g., the platform itself, company blogs, social media pages, etc.) or using gamification tools such as leaderboards and status badges to acknowledge successful members.
While managers could downplay the learning and social opportunities (e.g., getting access to experts, mentors, or resources) involved in crowdsourcing, it is important to consider the overall goal of the initiative before discounting these motivations altogether. Learning motivation is not necessarily counterproductive when an initiative mainly relies on repeated participation of members over time (e.g., the Lego Ideas platform), because members can use what they’ve learned to develop better ideas in future activities. Likewise, despite generating low-quality ideas, prosocially motivated members may still create value when ideas are generated collectively (e.g., the OpenIDEO platform), as they are likely to go the extra mile to help others improve their ideas.
Implementing these insights can help managers get an ideal solution set from their crowdsourcing efforts: one small enough to be manageable, yet still including any solution with a real chance of offering a breakthrough for the innovation problem at hand.
Next time you design a crowdsourcing initiative, keep in mind what makes the crowd tick and what that means for their ideas. The key to harnessing the innovative potential of crowdsourcing is not motivating everyone and collecting the largest possible number of ideas, but designing an incentive structure that attracts the right people. Less may well be more when it comes to getting the best value out of crowdsourcing.