The Open Innovation Formula?

July 29, 2009  

Today we have a guest blogger. In an earlier entry, I held a contest to see who could guess how many ideas were submitted as part of the LG Electronics competition. One of the respondents provided a detailed assumption analysis and complex equations. Although the assumptions proved inaccurate (read the actual answers), I felt that his thought process provided an interesting perspective on crowdsourcing and wanted to share it with you.

Introducing Rishi Bhalerao…

—————————————————

Steve and I have exchanged e-mails on the methodology to estimate the number of entries and related metrics in the LG challenge. Even though my numbers ended up quite far from the observed data, Steve thought the process was worth describing because it raised some interesting questions. Here is the CliffsNotes version of the method; a more detailed explanation is included in the attached PDF file.

Steve asked for an estimate of the number of entries and the number of unique participants. Here are the steps I followed:

  1. I assumed that the larger the payoff, the more entries a contest will attract. The public data available on crowdSPRING (the online platform used to run the LG Electronics competition) is too limited to allow a regression analysis, so let’s simplify and just assume “Pro” projects—those with a minimum $1000 payoff—attract more entries than “Non Pro” projects with smaller payoffs.
  2. The blog entry crowdSPRING By The Numbers indicates that, since launch, 4700 projects have attracted 370,002 entries (average: 79 entries/project). The website lists active projects (158, when I checked on July 23) and also indicates the number of entries received by each Non Pro project. From this sample, we can calculate the average number of entries per Non Pro project. Since the overall 79-entry average blends Pro and Non Pro projects, knowing the Non Pro average and the mix of project types lets us back out the Pro average; assuming the sample is representative, a Pro project attracts ~97 entries on average.
  3. The next big assumption (which turned out to be flat wrong) is that each payoff attracts a separate set of entries. So, if this project has 43 payoffs, it will attract 43X the number of entries compared to an average Pro project with a single payoff. There were 3 big prizes and 40 consolation prizes. First place received $20,000, second place received $10,000, third place received $5,000 and 40 entries got $1,000 plus an LG phone.
  4. The number of unique participants was trickier, because the average entries/participant cannot easily be guessed from the sample. I took a SWAG from my experience in customer service: a single customer generates 1.2 transactions. I didn’t have a better benchmark, particularly not one relevant to crowd-sourced graphic design. It turns out the actual figure is ~2.6 entries per participant (834 entries from 324 participants), which suggests the marginal cost of a second entry is much lower than that of the first.
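For readers who want to retrace the arithmetic, the four steps above can be sketched in a few lines of Python. All figures are quoted from the post itself; the 97-entry Pro average and the 1.2 entries-per-participant ratio are the post's own assumptions, not observed data.

```python
# A sketch of the four estimation steps, using only figures quoted in the
# post. The 1.2 entries-per-participant ratio is a customer-service SWAG.

TOTAL_PROJECTS = 4700            # from "crowdSPRING By The Numbers"
TOTAL_ENTRIES = 370_002
overall_avg = TOTAL_ENTRIES / TOTAL_PROJECTS    # ~79 entries/project

# Step 2: the Pro-project average, backed out from the Non Pro sample.
PRO_AVG = 97

# Step 3: assume each of the 43 payoffs (3 big prizes + 40 consolation
# prizes) attracts its own full set of entries.
NUM_PAYOFFS = 3 + 40
estimated_entries = NUM_PAYOFFS * PRO_AVG       # 43 * 97 = 4,171

# Step 4: convert entries to unique participants with the assumed ratio.
ENTRIES_PER_PARTICIPANT = 1.2
estimated_participants = estimated_entries / ENTRIES_PER_PARTICIPANT

print(round(overall_avg), estimated_entries, round(estimated_participants))
# prints: 79 4171 3476
```

Comparing the sketch to the actual outcome (834 entries from 324 participants) shows the estimate overshooting entries by roughly 5X and participants by roughly 10X, which is exactly the gap the questions below try to explain.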

How accurate were the assumptions? Not very! But, they do raise interesting questions:

  • What explains the relatively low number of entries? The max $20K payoff was 45X the average payoff ($444, i.e. $2 million across 4,500 completed projects), yet it attracted just 11X the average participation (824 entries instead of 79). The 324 participants represent less than 1% of the 36K user base advertised on the site.
  • How does the number of payoffs affect participation? Clearly, entrants weren’t motivated by 43 separate payoffs. More likely, they were looking at the top two or three prizes. Could LG have received the same number of entries with a significantly smaller payoff (e.g. by not offering the 40 $1K prizes)? Do such contests have value beyond solving the stated design challenge? E.g. brand affinity, perception of engagement, mind share, etc?
  • What lessons should crowd-sourcing platforms draw on size of participant networks vs. level of engagement? Are more entries always better?

—————————————————

Rishi, thanks for sharing your thoughts on this. I love your thinking and want to add a few thoughts of my own.

  • Challenge Complexity – The complexity of the challenge affects the number of submissions. Logo competitions receive a lot of entries because a logo can be created in a matter of minutes. A phone design is much more complex and might be perceived as too time intensive. As a parallel, although I don’t know this to be true, I suspect that the $1 million InnoCentive challenge to find an ALS marker has had fewer entries than challenges with a $10K reward. The complexity-to-reward ratio must be considered.
  • Solver Base – Many of the people registered on crowdSPRING are logo/graphic designers. They may not be equipped to solve more complex problems. Although the 36K solvers might be great at logos, they may be less skilled at phone design, leading to fewer submissions. This may account for why InnoCentive has such a high solve rate on technical challenges that are in their “sweet spot”: they have 180,000 highly technical solvers. However, if you posted a logo design on InnoCentive, I suspect you would get very few entries.
  • Awareness - Of course, awareness of the challenge must be factored in. If people are not aware of the challenge, they cannot respond. LG Electronics did a fantastic job of spreading the word. Having said that, I know of many people who would have entered, but unfortunately were not aware of the competition until it was too late.
  • Challenge Definition – There are other factors that need to be considered. For example, when I ran a logo design competition on 99designs (a competitor of crowdSPRING), I had a few entries in the beginning. But as I provided feedback and people saw what I liked, the number of entries exploded. The fuzzier the challenge, the harder it is for people to get their arms around it. This is why InnoCentive invests A LOT of time helping frame a challenge before posting.
  • Intangibles – One thought I had is that incentives can sometimes be a barrier to innovation. I saw a fantastic comment on a discussion thread about crowdSPRING from yongfook. He said, “The quality of open source software is high, not because of crowdsourcing, but because of personal motivation to be involved with a project you have some emotional connection to. With (crowdSPRING) the motivation is more monetary… pretty much every professional designer I know wouldn’t touch any of these freelancing sites with a barge pole.” Interesting.

Ok, what do all of you think? What factors are we missing? Can you develop an accurate formula to predict participation in a crowdsourcing/open innovation challenge?


3 Responses to “The Open Innovation Formula?”

  1. Vincent Carbone on July 29th, 2009 7:24 pm

    what does the formula generate for this open innovation site: ideas.acrobat.com

    I would be happy to provide some numbers to help validate your equation.

  2. Steve Shapiro on August 3rd, 2009 8:45 am

    Vincent. Thanks for your comment. Rishi and I are playing around with the idea of refining the formula. Clearly there are flaws in it right now. But maybe when we have the next generation ready, we can plug in the Acrobat data. Or maybe, if you share this with us, we can use it to help refine the equation… Thanks!

  3. uberVU - social comments on March 18th, 2010 8:44 pm

    Social comments and analytics for this post…

    This post was mentioned on Twitter by 2rz: RT @stephenshapiro: The Open Innovation Formula?: Today we have a guest blogger. http://bit.ly/zPQlq (me!)…