Crowdsourcing
From Wikipedia, the free encyclopedia
Crowdsourcing is a neologism for the act of taking a task traditionally performed by an employee or contractor, and outsourcing it to an undefined, generally large group of people or community in the form of an open call. For example, the public may be invited to develop a new technology, carry out a design task (also known as community-based design[1] and distributed participatory design), refine or carry out the steps of an algorithm (see Human-based computation), or help capture, systematize or analyze large amounts of data (see also citizen science).
The term has become popular with business authors and journalists as shorthand for the trend of leveraging the mass collaboration enabled by Web 2.0 technologies to achieve business goals. However, both the term and its underlying business models have attracted controversy and criticism.
History
The word was coined by Jeff Howe in a June 2006 Wired magazine article.[2] Though the term is new, projects have been run on similar models for some time. More recently, the Internet has been used to publicize and manage crowdsourcing projects.
Overview
Crowdsourcing is a distributed problem-solving and production model. Problems are broadcast to an unknown group of solvers in the form of an open call for solutions. Users, also known as the crowd, typically form into online communities and submit solutions. The crowd also sorts through the solutions, finding the best ones. These best solutions are then owned by the entity that broadcast the problem in the first place (the crowdsourcer), and the winning individuals in the crowd are sometimes rewarded. In some cases, this labor is well compensated, either monetarily, with prizes, or with recognition. In other cases, the only rewards may be kudos or intellectual satisfaction. Crowdsourcing may produce solutions from amateurs or volunteers working in their spare time, or from experts or small businesses that were unknown to the initiating organization.[3]
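As a rough illustration of the workflow just described (an open call, crowd submission, crowd ranking, and selection of a winner), the following Python sketch models the process under simplifying assumptions; the class and method names (OpenCall, Solution, submit, vote, best) are hypothetical and do not correspond to any particular platform.

```python
# Illustrative sketch only: a toy model of the open-call workflow described above.
# All names here are hypothetical, not any real crowdsourcing platform's API.
from dataclasses import dataclass, field

@dataclass
class Solution:
    author: str
    content: str
    votes: int = 0

@dataclass
class OpenCall:
    problem: str          # the task broadcast by the crowdsourcer
    reward: str           # e.g. money, a prize, or simple recognition
    solutions: list = field(default_factory=list)

    def submit(self, author, content):
        """Anyone in the crowd may respond to the open call."""
        self.solutions.append(Solution(author, content))

    def vote(self, index):
        """The crowd also sorts through the solutions by voting on them."""
        self.solutions[index].votes += 1

    def best(self):
        """The crowdsourcer takes the top-rated solution and rewards its author."""
        return max(self.solutions, key=lambda s: s.votes, default=None)

# Example: broadcast a design task, collect two entries, and pick the winner.
call = OpenCall(problem="Design a new logo", reward="$500 prize")
call.submit("alice", "logo-a.svg")
call.submit("bob", "logo-b.svg")
call.vote(1)
winner = call.best()   # Solution(author='bob', content='logo-b.svg', votes=1)
```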
Perceived benefits of crowdsourcing include:
- Problems can be explored at comparatively little cost, and often very quickly.
- Payment is by results.
- The organization can tap a wider range of talent than might be present in its own organization.
The difference between crowdsourcing and ordinary outsourcing is that a task or problem is outsourced to an undefined public rather than a specific other body. The difference between crowdsourcing and open source is that open source production is a cooperative activity initiated and voluntarily undertaken by members of the public. In crowdsourcing, the activity is initiated by a client, and the work may be undertaken individually or in groups.[4] Other differences between open source and crowdsourced production relate to the motivations of individuals to participate.[5][6]
Though a business model, crowdsourcing also has potential as a problem-solving mechanism for government and non-profit use.[7] Urban and transit planning are prime areas for crowdsourcing;[8] a project to test crowdsourcing of the public participation process for transit planning in Salt Lake City, funded by a U.S. Federal Transit Administration grant, was underway in 2008-2009.[9] Another notable application of crowdsourcing to government problem solving is the Peer to Patent Community Patent Review project for the U.S. Patent and Trademark Office.[10]
Recent examples
- Distributed Proofreaders (commonly abbreviated as DP or PGDP) is a web-based project launched by Project Gutenberg that supports the development of e-texts for Project Gutenberg by allowing many people to work together in proofreading drafts of e-texts for errors.
- InnoCentive, started in 2002, crowdsources research and development for biomedical and pharmaceutical companies, as well as companies in other industries. InnoCentive provides connection and relationship management services between "Seekers" and "Solvers." Seekers are the companies searching for solutions to critical challenges. Solvers are the 125,000 registered members of the InnoCentive crowd who volunteer their solutions to the Seekers. Anyone with interest and Internet access can become an InnoCentive Solver. Solvers whose solutions are selected by the Seekers are compensated for their ideas by InnoCentive, which acts as broker of the process. InnoCentive recently partnered with the Rockefeller Foundation to target solutions from InnoCentive's Solver crowd for orphan diseases and other philanthropic social initiatives.[11]
- Emporis, a provider of building data, has run the Emporis Community (a website where members can submit building information) since May 2000. Today, more than 1,000 members contribute building data throughout the world.
- The ESP Game by Luis von Ahn (later acquired by Google and renamed Google Image Labeler) was launched in 2004 and gets people to label images as a side-effect of playing a game. The image labels can be used to improve image search on the Web. This game led to the concept of Games with a purpose.
- reCAPTCHA digitizes old texts by presenting words that OCR software cannot decipher properly to end users of a CAPTCHA spam filter, the squiggly words that users must type on the Web to "prove" they are human. reCAPTCHA is helping to digitize over 30 million words per day from the Internet Archive and the New York Times archive. Over 200 million people have helped digitize at least one word using this system.[12] A simplified sketch of the underlying control-word mechanism appears after this list.
- Since 2004, MoveOn.org has applied crowdsourcing to a variety of challenges related to organizing a political movement, including phonebanking, field organizing via house parties, and the creation of ads against opponents.
- In mid-2008, Oxfam Novib (Netherlands) launched a crowdsourcing initiative named Doeners.net, meant for people to support the organisation's campaigning activities.
- In 2005, Amazon.com launched the Amazon Mechanical Turk, a platform on which crowdsourcing tasks called "HITs" (Human Intelligence Tasks) can be created and publicized, and people can execute the tasks and be paid for doing so. Dubbed "Artificial Artificial Intelligence", it was named after The Turk, an 18th-century chess-playing "machine".
- Stardust@Home is an ongoing citizen science project, begun in 2006, utilizing internet volunteer "clickworkers" to find interstellar dust samples by inspecting 3D images from the Stardust spacecraft.
- Innovation Exchange is an open innovation vendor which emphasizes community diversity; it sources solutions to business problems from both experts and novices. Companies sponsor challenges which are responded to by individuals, people working in ad-hoc teams, or by small and midsize businesses. In contrast to sites focused primarily on innovation in the physical sciences, Innovation Exchange fosters product, service, process, and business model innovation.
- In 2006, the American online DVD rental company Netflix announced a $1,000,000 prize for anybody who could improve its existing rating-prediction system by at least 10%. Contest participants can download vast amounts of anonymized data from Netflix to test their proposals. In addition to the grand prize, Netflix offers annual progress prizes of $50,000. So far, 17,000 attempts have been submitted, the best showing an improvement of 9.65% over Netflix's current system.
- The Democratic National Committee launched FlipperTV in November 2007 and McCainpedia in May 2008, placing video gathered by Democratic trackers and research compiled by DNC staff in the hands of the public to use as they choose, whether for a blog post, a YouTube video, or other purposes.[13]
- Shutterstock uses crowdsourcing to generate its library of over 5.7 million royalty-free stock images.[14]
- Cambrian House applies a crowdsourcing model to identify and develop software and web-based businesses. Using a simple voting model, they try to find sticky software ideas that can be developed using a combination of internal and crowdsourced skills and effort.
- Threadless, an Internet-based clothing retailer, sells T-shirts which have been designed and rated by members of the public.
- Galaxy Zoo is a citizen science project that lets members of the public classify a million galaxies from the Sloan Digital Sky Survey.
- The British organisation MyFootballClub announced, on 17 November 2007, that it had reached an agreement in principle, pending a due diligence investigation and a members' vote, to purchase Ebbsfleet United F.C. of the Blue Square Premier. The group will let its members vote on the composition of the team, the starting lineup, player transfers, and other matters.[15]
- The Canadian gold mining group Goldcorp made 400 megabytes of geological survey data on its Red Lake, Ontario, property available to the public over the Internet. It offered a $575,000 prize to anyone who could analyze the data and suggest places where gold could be found. The company claims that the contest produced 110 targets, over 80% of which proved productive, yielding 8 million ounces of gold worth more than $3 billion. The prize was won by a small consultancy in Perth, Western Australia, called Fractal Graphics.
- In June 2007, Chicago Public Radio launched Vocalo.org, which generates its content from contributions by an online community.
- In January 2008, the State of Texas announced it would install 200 mobile cameras along the Texas-Mexico border, to enable anyone with an Internet connection to watch the border and report sightings of alleged illegal immigrants to border patrol agents.[16]
- Zeros 2 Heroes Media, a Canadian crowdsourcing site, allows unpublished comic book writers to pitch projects for selection and production. Crowdsourcing on the site also led to the relaunch of the ReBoot animated TV series in comic form. As of June 2008, 16 projects from various writers had been successfully pitched and selected based on user votes.
- Wikipedia is often cited as a successful example of crowdsourcing,[17] despite objections by co-founder Jimmy Wales to the term.[18]
- In the search for aviator Steve Fossett, whose plane went missing in Nevada in 2007, up to 50,000 people examined high-resolution satellite imagery from DigitalGlobe made available via Amazon Mechanical Turk. The search was ultimately unsuccessful,[19][20] and Fossett's remains were eventually located by more traditional means.[21]
- RYZ is a crowdsourcing footwear company launched in June 2008. The company allows any member to submit designs and critique and vote on other members' designs. Top vote-getters are produced as actual shoes.[22]
- Foldit invites the general public to play protein folding games to discover folding strategies.
- Video communities such as YouTube rely on crowdsourcing for the contribution of videos.[23]
- 99designs, started in 2008, is an online marketplace for crowdsourcing custom design work such as logos, websites, business cards, and other graphical elements.
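The reCAPTCHA entry above relies on pairing each word that OCR could not read with a control word the system already knows: if a user types the control word correctly, their transcription of the unknown word is trusted and aggregated with other users' answers. The sketch below is a simplified illustration of that control-word idea, not reCAPTCHA's actual implementation; the function names, data structures, and agreement threshold are assumptions for illustration only.

```python
# Simplified illustration of a reCAPTCHA-style control-word scheme (not the real API).
import random
from collections import Counter, defaultdict

KNOWN_WORDS = {"society", "morning"}                 # words OCR already read correctly (control words)
UNKNOWN_IMAGES = ["scan_017.png", "scan_042.png"]    # word images OCR could not decipher

votes = defaultdict(Counter)                         # collected transcriptions per unknown word image

def make_challenge():
    """Pair one known control word with one unknown word image."""
    return random.choice(sorted(KNOWN_WORDS)), random.choice(UNKNOWN_IMAGES)

def submit(control_word, unknown_image, typed_control, typed_unknown):
    """Trust the unknown-word answer only if the control word was typed correctly."""
    if typed_control.strip().lower() != control_word:
        return False                                 # failed the human test; discard the answer
    votes[unknown_image][typed_unknown.strip().lower()] += 1
    return True

def digitized(unknown_image, threshold=3):
    """A word counts as digitized once enough independent users agree on it."""
    if not votes[unknown_image]:
        return None
    word, count = votes[unknown_image].most_common(1)[0]
    return word if count >= threshold else None
```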
Controversy
The ethical, social, and economic implications of crowdsourcing are subject to wide debate. For example, author and media critic Douglas Rushkoff, in an interview published in Wired News, expressed ambivalence about the term and its implications.[24] Wikipedia founder Jimmy Wales is also a vocal critic of the term.[25]
Some reports have focused on the negative effects of crowdsourcing on business owners, particularly in regard to how a crowdsourced project can sometimes end up costing a business more than a traditionally outsourced project.
Some possible pitfalls of crowdsourcing include:
- Added costs to bring a project to an acceptable conclusion.
- Increased likelihood that a crowdsourced project will fail due to lack of monetary motivation, too few participants, lower quality of work, lack of personal interest in the project, global language barriers, or difficulty managing a large-scale, crowdsourced project.
- Below-market wages, or no wages at all; barter agreements are often associated with crowdsourcing.[26]
- No written contracts, non-disclosure agreements, employee agreements, or agreed terms with crowdsourced workers.
- Difficulties maintaining a working relationship with crowdsourced workers throughout the duration of a project.
- Susceptibility to faulty results caused by targeted, malicious work efforts.
Historical examples
- The Alkali Prize
- The Longitude Prize
- Fourneyron's Turbine
- Montyon Prizes
- Nicolas Appert and food preservation
- Loebner Prize
- Millennium Prize Problems
See also
- Buzzwords
- Citizen science
- Clickworkers
- Collective intelligence
- Configuration system
- Crowdcasting
- Distributed Computing
- Human Computation
- The Long Tail
- Mass Collaboration
- Mass Customization
- Open Innovation
- Social commerce
- Toolkits for User Innovation
- Tuangou
- Wikinomics
- Wisdom of Crowds
- The Power of Many
References
- ^ Crowd Sourcing Turns Business On Its Head
- ^ David Whitford (2007-03-22). "Hired Guns on the Cheap". Fortune Small Business. http://money.cnn.com/magazines/fsb/fsb_archive/2007/03/01/8402019/index.htm. Retrieved on 2007-08-07.
- ^ Jeff Howe (June 2006). "The Rise of Crowdsourcing". Wired. http://www.wired.com/wired/archive/14.06/crowds.html. Retrieved on 2007-03-17.
- ^ Daren C. Brabham. (2008). "Crowdsourcing as a Model for Problem Solving: An Introduction and Cases", Convergence: The International Journal of Research into New Media Technologies, 14(1), pp. 75-90.
- ^ Daren C. Brabham. (2008). "Crowdsourcing as a Model for Problem Solving: An Introduction and Cases", Convergence: The International Journal of Research into New Media Technologies, 14(1), pp. 75-90.
- ^ Daren C. Brabham. (2008). "Moving the Crowd at iStockphoto: The Composition of the Crowd and Motivations for Participation in a Crowdsourcing Application", First Monday, 13(6), available online at http://www.uic.edu/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2159/1969.
- ^ Daren C. Brabham. (2008). "Crowdsourcing as a Model for Problem Solving: An Introduction and Cases", Convergence: The International Journal of Research into New Media Technologies, 14(1), pp. 75-90.
- ^ Daren C. Brabham. (in press). "Crowdsourcing the Public Participation Process for Planning Projects", Planning Theory.
- ^ U.S. Department of Transportation Federal Transit Administration Public Transportation Participation Pilot Program. "PTP-3 FY 2008 Projects: Crowdsourcing Public Participation in Transit Planning", available online at http://www.fta.dot.gov/planning/programs/planning_environment_8711.html.
- ^ Peer to Patent Community Patent Review Project. "Peer to Patent Community Patent Review", at http://www.peertopatent.org/.
- ^ "The Rockefeller-InnoCentive Partnership". 2007. http://www.rockfound.org/initiatives/innovation/innocentive.shtml. Retrieved on 2007-11-17. The Rockefeller Foundation-InnoCentive partnership brings the benefits of InnoCentive model to those working on innovation challenges faced by poor or vulnerable people. The Rockefeller Foundation will pay access, posting and service fees on behalf of these new class of “seekers” to InnoCentive, as well as funding the awards to "problem solvers."
- ^ The reCAPTCHA Website
- ^ DNC. "McCainPedia". DNC. http://www.mccainpedia.org. Retrieved on 2008-05-19.
- ^ Howe, Jeff (2006-06-01). "Wired 6.06". Wired. http://www.wired.com/wired/archive/14.06/crowds.html. Retrieved on 2009-02-02.
- ^ Perry, Alex and Sinnott, John (2007-11-13). "Website agrees Ebbsfleet takeover". BBC Sport. http://news.bbc.co.uk/sport1/hi/football/teams/g/gravesend_and_northfleet/7089473.stm. Retrieved on 2007-11-13.
- ^ "Texas Governor finds $3 million for border cameras". 2007. http://www.khou.com/news/state/stories/khou071119_rm_bordercameras.1b1f3f6b.html. Retrieved on 2007-11-27.
- ^ Libert, Barry; Jon Spector (2008). We are Smarter than Me. Upper Saddle River, New Jersey: Wharton School Publishing. p. 3. ISBN 978-0-13-24479-4.
- ^ Lee, Ellen (2007-11-30). "As Wikipedia moves to S.F., founder discusses planned changes". San Francisco Chronicle. http://www.sfgate.com/cgi-bin/article.cgi?f=/c/a/2007/11/30/BUOMTKNJA.DTL&hw=jimmy+wales&sn=001&sc=1000. Retrieved on 2008-02-19. "One of my rants is against the term "crowdsourcing," which I think is a vile, vile way of looking at that world. This idea that a good business model is to get the public to do your work for free - that's just crazy. It disrespects the people. It's like you're trying to trick them into doing work for free."
- ^ Steve Friess, 50,000 Volunteers Join Distributed Search For Steve Fossett, Wired News, 2007-09-11
- ^ Steve Friess, Online Fossett Searchers Ask, Was It Worth It?, Wired.com, 2007-11-06
- ^ http://www.guardian.co.uk/world/2008/oct/02/usa2
- ^ Raine, George (2008-07-20). "More businesses considering 'wisdom of crowds'". San Francisco Chronicle. Hearst Communications. http://www.sfchroniclemarketplace.com/cgi-bin/article.cgi?f=/c/a/2008/07/20/BUAF11OT6T.DTL. Retrieved on 2008-07-29.
- ^ Huberman Bernardo A., Romero Daniel M. and Wu Fang, Crowdsourcing, Attention and Productivity; Working Paper HP Labs, 2008-09-11
- ^ Cove, Sarah (2007-07-12). "What Does Crowdsourcing Really Mean?". Wired News (Assignment Zero). http://www.wired.com/techbiz/media/news/2007/07/crowdsourcing?currentPage=1. Retrieved on 2008-02-19.
- ^ McNichol, Tom (2007-07-02). "The Wales Rules for Web 2.0". Business 2.0. http://money.cnn.com/galleries/2007/biz2/0702/gallery.wikia_rules.biz2/index.html. Retrieved on 2008-02-19. "I find the term 'crowdsourcing' incredibly irritating," Wales says. "Any company that thinks it's going to build a site by outsourcing all the work to its users not only disrespects the users but completely misunderstands what it should be doing. Your job is to provide a structure for your users to collaborate, and that takes a lot of work."
- ^ Sherwood Stranieri (October 2006). "Beer Money: Mechanical Turk on Campus". Paylancers. http://paylancers.blogspot.com/2006/10/beer-money-mechanical-turk-on-campus.html. Retrieved on 2008-03-14.
External links
- A list of Crowdsourcing Sites
- Crowdsourcing: A People Business - An article discussing concepts from Jeff Howe's book Crowdsourcing from The Economist print edition, September 25, 2008.
- Crowdsourcing: tracking the rise of the amateur - crowdsourcing blog by Jeff Howe
- Moving the Crowd at iStockphoto: The Composition of the Crowd and Motivations for Participation in a Crowdsourcing Application, by Daren C. Brabham, First Monday, June 2, 2008.
- Crowdsourcing: consumers as creators, by Paul Boutin, Business Week, July 13, 2006.
- Secure Distributed Human Computation by Craig Gentry, Zulfikar Ramzan, and Stuart Stubblebine. Proceedings of the 6th ACM Conference on Electronic Commerce, 2005.
- Innovation in the Age of Mass Collaboration, by Don Tapscott and Anthony D. Williams, Business Week, February 1, 2007.
- Randy Burge: Internet allows us to resource the crowd, Albuquerque Tribune, April 9, 2007.
- Assignment Zero First Take: Wiki Innovators Rethink Openness: Citizendium, by Michael Ho for Assignment Zero and Wired, May 3, 2007.
- InnoCentive: Crowdsourcing Diversity: What starts with the crowd ends in research and development, Randy Burge interviews Alph Bingham, cofounder of InnoCentive, for Assignment Zero and Wired (magazine), May 18, 2007.
- The Hopkinson Report, Episode 19 - interview with Jeff Howe on his book Crowdsourcing