Crowdsourcing

Crowdsourcing involves a large group of dispersed participants contributing or producing goods or services—including ideas, votes, micro-tasks, and finances—for payment or as volunteers. Contemporary crowdsourcing often involves digital platforms to attract and divide work between participants to achieve a cumulative result. Crowdsourcing is not limited to online activity, however, and there are various historical examples of crowdsourcing. The word crowdsourcing is a portmanteau of "crowd" and "outsourcing".[1][2][3] In contrast to outsourcing, crowdsourcing usually involves less specific and more public groups of participants.[4][5][6]

Advantages of using crowdsourcing include lowered costs, improved speed, improved quality, increased flexibility, and/or increased scalability of the work, as well as promoting diversity.[7][8] Crowdsourcing methods include competitions, virtual labor markets, open online collaboration and data donation.[8][9][10][11] Some forms of crowdsourcing, such as in "idea competitions" or "innovation contests" provide ways for organizations to learn beyond the "base of minds" provided by their employees (e.g. LEGO Ideas).[12][13][promotion?] Commercial platforms, such as Amazon Mechanical Turk, match microtasks submitted by requesters to workers who perform them. Crowdsourcing is also used by nonprofit organizations to develop common goods, such as Wikipedia.[14]

Definitions

The term crowdsourcing was coined in 2006 by two editors at Wired, Jeff Howe and Mark Robinson, to describe how businesses were using the Internet to "outsource work to the crowd", which quickly led to the portmanteau "crowdsourcing".[15] Howe published a definition for the term in a blog post in June 2006:[16]

Simply defined, crowdsourcing represents the act of a company or institution taking a function once performed by employees and outsourcing it to an undefined (and generally large) network of people in the form of an open call. This can take the form of peer-production (when the job is performed collaboratively), but is also often undertaken by sole individuals. The crucial prerequisite is the use of the open call format and the large network of potential laborers.

Daren C. Brabham defined crowdsourcing as an "online, distributed problem-solving and production model."[17] Kristen L. Guth and Brabham found that the performance of ideas offered on crowdsourcing platforms is affected not only by their quality, but also by the communication among users about the ideas and their presentation on the platform itself.[18] After studying more than 40 definitions of crowdsourcing in the scientific and popular literature, Enrique Estellés-Arolas and Fernando González Ladrón-de-Guevara, researchers at the Technical University of Valencia, developed an integrating definition that synthesizes the common elements of the term.[3]

Despite the multiplicity of definitions for crowdsourcing, one constant has been the broadcasting of problems to the public and an open call for contributions to help solve them.[original research?] Members of the public submit solutions that are then owned by the entity that originally broadcast the problem. In some cases, the contributor of the solution is compensated monetarily or with prizes or public recognition; in other cases, the only rewards may be praise or intellectual satisfaction. Crowdsourcing may produce solutions from amateurs or volunteers working in their spare time, from experts, or from small businesses.[15]

Historical examples

While the term "crowdsourcing" was popularized online to describe Internet-based activities,[17] some earlier projects can, in retrospect, be described as crowdsourcing.

Timeline of crowdsourcing examples

  • 618–907 – The Tang dynasty introduced the joint-stock company, an early form of crowdfunding.
  • 1567 – King Philip II of Spain offered a cash prize for calculating the longitude of a vessel whilst at sea.
  • 1714 – The longitude rewards: When the British government was trying to find a way to measure a ship's longitudinal position, it offered a monetary prize to whoever came up with the best solution.[19]
  • 1783 – King Louis XVI offered an award to the person who could "make the alkali" by decomposing sea salt by the "simplest and most economic method".[19]
  • 1848 – Matthew Fontaine Maury distributed 5000 copies of his Wind and Current Charts free of charge on the condition that sailors returned a standardized log of their voyage to the U.S. Naval Observatory. By 1861, he had distributed 200,000 copies free of charge, on the same conditions.[20]
  • 1849 – A network of some 150 volunteer weather observers all over the USA was set up as a part of the Smithsonian Institution's Meteorological Project started by the Smithsonian's first Secretary, Joseph Henry, who used the telegraph to gather volunteers' data and create a large weather map, making new information available to the public daily. For instance, volunteers tracked a tornado passing through Wisconsin and sent the findings via telegraph to the Smithsonian. Henry's project is considered the origin of what later became the National Weather Service. Within a decade, the project had more than 600 volunteer observers and had spread to Canada, Mexico, Latin America, and the Caribbean.[21]
  • 1884 – Publication of the Oxford English Dictionary: 800 volunteers catalogued words to create the first fascicle of the OED.[19]
  • 1916 – Planters Peanuts contest: The Mr. Peanut logo was designed by a 14-year-old boy who won the Planters Peanuts logo contest.[19]
  • 1957 – Jørn Utzon was selected as winner of the design competition for the Sydney Opera House.[19]
  • 1970 – French amateur photo contest C'était Paris en 1970 ("This Was Paris in 1970") was sponsored by the city of Paris, France-Inter radio, and the Fnac: 14,000 photographers produced 70,000 black-and-white prints and 30,000 color slides of the French capital to document the architectural changes of Paris. Photographs were donated to the Bibliothèque historique de la ville de Paris.[22]
  • 1979 – Robert Axelrod invited academics online to submit FORTRAN algorithms to play the repeated Prisoner's Dilemma; the tit-for-tat algorithm took first place.[23]
  • 1991 – Linus Torvalds began work on the Linux operating system and invited programmers around the world to contribute code.[dubious]
  • 1996 – The Hollywood Stock Exchange was founded: it allowed players to buy and sell virtual shares in movies and celebrities.[19]
  • 1997 – British rock band Marillion raised $60,000 from their fans to help finance their U.S. tour.[19]
  • 1999 – SETI@home was launched by the University of California, Berkeley. Volunteers could contribute to the search for signals from extraterrestrial intelligence by installing a program that used idle computer time to analyze chunks of data recorded by radio telescopes involved in the SERENDIP program.
  • 2000 – JustGiving was established: This online platform allows the public to help raise money for charities.[19]
  • 2000 – UNV Online Volunteering service launched: Connecting people who commit their time and skills over the Internet to help organizations address development challenges.[24]
  • 2000 – iStockPhoto was founded: The free stock imagery website allowed members of the public to contribute images and receive commissions for their contributions.[25]
  • 2001 – Launch of Wikipedia: "Free-access, free content Internet encyclopedia".[26]
  • 2001 – Topcoder, a crowdsourcing software development company, was founded.[27][28]
  • 2004 – OpenStreetMap, a collaborative project to create a free editable map of the world, was launched.[29]
  • 2004 – Toyota's first "Dream car art" contest: Children were asked globally to draw their "dream car of the future".[30]
  • 2005 – Kodak's "Go for the Gold" contest: Kodak asked anyone to submit a picture of a personal victory.[30]
  • 2005 – Amazon Mechanical Turk (MTurk) was launched publicly on November 2, 2005. It enables businesses to hire remotely located "crowdworkers" to perform discrete on-demand tasks that computers are currently unable to do.[31]
  • 2006 – Waze (then named FreeMap Israel), a community-oriented GPS app, was created. It allows users to submit road information and route data based on location, such as reports of car accidents or traffic, and integrates that data into its routing algorithms for all users of the app.
  • 2010 – The 1947 Partition Archive, an oral history project that asked community members around the world to document oral histories from aging witnesses of a significant but under-documented historical event, the 1947 Partition of India, was founded.
  • 2011 – Casting of Flavours ("Do Us a Flavor" in the USA) – a campaign launched by PepsiCo's Lay's in Spain to create a new flavor for the snack, with consumers directly involved in its creation.[32]

Early competitions

Crowdsourcing has often been used in the past as a competition to discover a solution. The French government proposed several of these competitions, often rewarded with Montyon Prizes.[33] These included the Alkali Prize, a reward offered for a method of producing alkali from sea salt, which led to the Leblanc process, and the prize competition that produced Fourneyron's turbine, the first commercial hydraulic turbine.[34]

In response to a challenge from the French government, Nicolas Appert won a prize for inventing a new way of preserving food that involved sealing it in air-tight jars.[35] The British government provided a similar reward for an easy way to determine a ship's longitude through the Longitude Prize. During the Great Depression, out-of-work clerks tabulated higher mathematical functions in the Mathematical Tables Project as a work-relief project.[36][unreliable source?] One of the largest crowdsourcing campaigns was a public design contest hosted in 2010 by the Indian government's finance ministry to create a symbol for the Indian rupee. Thousands of people sent in entries before the government settled on the final symbol, based on the Devanagari letter "Ra".[37]

Applications

A number of motivations exist for businesses to use crowdsourcing to accomplish their tasks. These include the ability to offload peak demand, access cheap labor and information, generate better results, access a wider array of talent than what is present in one organization, and undertake problems that would have been too difficult to solve internally.[38] Crowdsourcing allows businesses to submit problems on which contributors can work—on topics such as science, manufacturing, biotech, and medicine—optionally with monetary rewards for successful solutions. Although crowdsourcing complicated tasks can be difficult, simple work tasks[specify] can be crowdsourced cheaply and effectively.[39]

Crowdsourcing also has the potential to be a problem-solving mechanism for government and nonprofit use.[40] Urban and transit planning are prime areas for crowdsourcing. For example, from 2008 to 2009, a crowdsourcing project for transit planning in Salt Lake City was created to test the public participation process.[41] Another notable application of crowdsourcing for government problem-solving is Peer-to-Patent, which was an initiative to improve patent quality in the United States through gathering public input in a structured, productive manner.[42]

Researchers have used crowdsourcing systems such as Amazon Mechanical Turk or CloudResearch to aid their research projects by crowdsourcing some aspects of the research process, such as data collection, parsing, and evaluation to the public. Notable examples include using the crowd to create speech and language databases,[43][44] to conduct user studies,[45] and to run behavioral science surveys and experiments.[46] Crowdsourcing systems provided researchers with the ability to gather large amounts of data, and helped researchers to collect data from populations and demographics they may not have access to locally.[47][failed verification]

Artists have also used crowdsourcing systems. In a project called the Sheep Market, Aaron Koblin used Mechanical Turk to collect 10,000 drawings of sheep from contributors around the world.[48] Artist Sam Brown leveraged the crowd by asking visitors of his website explodingdog to send him sentences to use as inspirations for his paintings.[49] Art curator Andrea Grover argues that individuals tend to be more open in crowdsourced projects because they are not being physically judged or scrutinized.[50] As with other types of uses, artists use crowdsourcing systems to generate and collect data. The crowd also can be used to provide inspiration and to collect financial support for an artist's work.[51]

In navigation systems, INRIX uses crowdsourced data from 100 million drivers to collect users' driving times, providing better GPS routing and real-time traffic updates.[52]
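
INRIX's actual pipeline is proprietary; the sketch below is only a minimal illustration of how crowdsourced travel-time reports could be aggregated into per-segment estimates for a router (the segment names and numbers are invented):

```python
from collections import defaultdict
import statistics

# Invented probe reports: (road_segment_id, observed_travel_time_seconds).
# In a real system these would stream in from drivers' GPS devices.
reports = [
    ("I-80:exit12-exit13", 95), ("I-80:exit12-exit13", 110),
    ("I-80:exit12-exit13", 240),  # one driver stuck behind an incident
    ("Main-St:1st-2nd", 40), ("Main-St:1st-2nd", 45),
]

def estimated_times(reports):
    """Aggregate crowd reports into one travel-time estimate per segment.

    The median is used so that a few outliers (e.g. a driver who
    stopped for gas) do not skew the estimate.
    """
    by_segment = defaultdict(list)
    for segment, seconds in reports:
        by_segment[segment].append(seconds)
    return {seg: statistics.median(ts) for seg, ts in by_segment.items()}

# The estimates would then serve as edge weights in a shortest-path
# router (e.g. Dijkstra's algorithm) to produce traffic-aware routes.
print(estimated_times(reports))  # {'I-80:exit12-exit13': 110, 'Main-St:1st-2nd': 42.5}
```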

In science

Astronomy

Crowdsourcing in astronomy was used in the early 19th century by astronomer Denison Olmsted. After being awakened late on a November night by a meteor shower, Olmsted noticed a pattern in the shooting stars and wrote a brief report of the shower in the local newspaper. "As the cause of 'Falling Stars' is not understood by meteorologists, it is desirable to collect all the facts attending this phenomenon, stated with as much precision as possible", Olmsted wrote to readers, in a report subsequently picked up and pooled to newspapers nationwide. Responses poured in from many states, along with scientists' observations sent to the American Journal of Science and Arts.[53] These responses helped him make a series of scientific breakthroughs, including the observations that meteor showers are seen nationwide and that meteors fall from space under the influence of gravity. The responses also allowed him to approximate a velocity for the meteors.[citation needed]

A more recent version of crowdsourcing in astronomy is NASA's photo organizing project,[54] which asked internet users to browse photos taken from space and try to identify the location the picture is documenting.[55]

Behavioral science

In the field of behavioral science, crowdsourcing is often used to gather data and insights on human behavior and decision making. Researchers may create online surveys or experiments that are completed by a large number of participants, allowing them to collect a diverse and potentially large amount of data.[46] Crowdsourcing can also be used to gather real-time data on behavior, such as through the use of mobile apps that track and record users' activities and decision making.[56] The use of crowdsourcing in behavioral science has the potential to greatly increase the scope and efficiency of research, and has been used in studies on topics such as psychology research,[57] political attitudes,[58] and social media use.[59]

Energy system research

Energy system models require large and diverse datasets, increasingly so given the trend towards greater temporal and spatial resolution.[60] In response, there have been several initiatives to crowdsource this data. Launched in December 2009, OpenEI is a collaborative website run by the US government that provides open energy data.[61][62] While much of its information is from US government sources, the platform also seeks crowdsourced input from around the world.[63] The semantic wiki and database Enipedia also publishes energy systems data using the concept of crowdsourced open information. Enipedia went live in March 2011.[64][65]: 184–188 

Genealogy research

Genealogical research used crowdsourcing techniques long before personal computers were common. Beginning in 1942, the Church of Jesus Christ of Latter-day Saints encouraged its members to submit information about their ancestors. The submitted information was gathered together into a single collection. In 1969, to encourage more participation, the church started the three-generation program, in which church members were asked to prepare documented family group record forms for the first three generations. The program was later expanded to encourage members to research at least four generations and became known as the four-generation program.[66]

Institutes that have records of interest to genealogical research have used crowds of volunteers to create catalogs and indices to records.[citation needed]

Genetic genealogy research

Genetic genealogy is a combination of traditional genealogy with genetics. The rise of personal DNA testing after the turn of the century, by companies such as Gene by Gene, FTDNA, GeneTree, 23andMe, and Ancestry.com, has led to public and semi-public databases of DNA test results built using crowdsourcing techniques. Citizen science projects have included support, organization, and dissemination of personal DNA (genetic) testing. Similar to amateur astronomy, citizen scientists encouraged by volunteer organizations like the International Society of Genetic Genealogy[67] have provided valuable information and research to the professional scientific community.[68] The Genographic Project, which began in 2005, is a research project carried out by the National Geographic Society's scientific team to reveal patterns of human migration using crowdsourced DNA testing and reporting of results.[69]

Ornithology

Another early example of crowdsourcing occurred in the field of ornithology. On 25 December 1900, Frank Chapman, an early officer of the National Audubon Society, initiated a tradition dubbed the "Christmas Day Bird Census". The project called on birders from across North America to count and record the number of birds of each species they witnessed on Christmas Day. The project was successful, and the records from 27 different contributors were compiled into one bird census, which tallied around 90 species of birds.[70] This large-scale collection of data constituted an early form of citizen science, the premise upon which crowdsourcing is based. In the 2012 census, more than 70,000 individuals participated across 2,369 bird count circles.[71] Christmas 2014 marked the National Audubon Society's 115th annual Christmas Bird Count.

Seismology

The European-Mediterranean Seismological Centre (EMSC) has developed a seismic detection system by monitoring the traffic peaks on its website and analyzing keywords used on Twitter.[72]
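
The EMSC's detection logic is not detailed here; the following is a minimal sketch, with invented thresholds, of the general idea of flagging a possible felt earthquake from a sudden surge of website visitors:

```python
def traffic_spike(counts, window=10, factor=5.0):
    """Flag a possible felt earthquake from per-minute visit counts.

    counts: visits per minute, oldest first.
    Returns True if the latest minute exceeds `factor` times the
    average of the preceding `window` minutes. The window size and
    factor are illustrative, not EMSC's actual parameters.
    """
    if len(counts) < window + 1:
        return False
    baseline = sum(counts[-window - 1:-1]) / window
    return counts[-1] > factor * max(baseline, 1.0)

# Quiet traffic, then a sudden surge of visitors looking for
# "did anyone else feel that?" information.
minutes = [42, 39, 44, 41, 40, 38, 43, 45, 41, 40, 390]
print(traffic_spike(minutes))  # True
```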

In journalism

Crowdsourcing is increasingly used in professional journalism. Journalists organize crowdsourced information by fact-checking it, and then use what they have gathered in their articles as they see fit.[citation needed] A daily newspaper in Sweden successfully used crowdsourcing to investigate home loan interest rates in the country in 2013–2014, resulting in over 50,000 submissions.[73] A daily newspaper in Finland crowdsourced an investigation into stock short-selling in 2011–2012, and the crowdsourced information led to revelations of a tax evasion system at a Finnish bank. The bank executive was fired and policy changes followed.[74] TalkingPointsMemo in the United States asked its readers to examine 3,000 emails concerning the firing of federal prosecutors in 2008. The British newspaper The Guardian crowdsourced the examination of hundreds of thousands of documents in 2009.[75]

Data donation

Data donation is a crowdsourcing approach to gather digital data. It is used by researchers and organizations to gain access to data from online platforms, websites, search engines and apps and devices. Data donation projects usually rely on participants volunteering their authentic digital profile information. Examples include:

  • DataSkop developed by Algorithm Watch, a non-profit research organization in Germany, which accessed data on social media algorithms and automated decision-making systems.[76][77]
  • Mozilla Rally, from the Mozilla Foundation, is a browser extension for adult participants in the US[78] to provide access to their data for research projects.[79]
  • The Australian Search Experience and Ad Observatory projects, set up in 2021 by researchers at the ARC Centre of Excellence for Automated Decision-Making and Society (ADM+S) in Australia, used data donations to analyze how Google personalizes search results and to examine how Facebook's algorithmic advertising model works.[80][81]
  • The Citizen Browser Project, developed by The Markup, was designed to measure how disinformation traveled across social media platforms over time.[82]

In public policy

Crowdsourcing public policy and the production of public services is also referred to as citizen sourcing. While some scholars argue that crowdsourcing of this kind is a policy tool[83] or a definite means of co-production,[84] others question that view and argue that crowdsourcing should be considered just a technological enabler that increases the speed and ease of participation.[85] Crowdsourcing can also play a role in democratization.[86]

The first conference focusing on Crowdsourcing for Politics and Policy took place at Oxford University, under the auspices of the Oxford Internet Institute, in 2014. Since 2012,[87] research has emerged focusing on the use of crowdsourcing for policy purposes.[88][89] This includes experimental investigation of virtual labor markets for policy assessment[90] and an assessment of the potential for citizen involvement in process innovation for public administration.[91]

Governments across the world are increasingly using crowdsourcing for knowledge discovery and civic engagement.[citation needed] Iceland crowdsourced its constitutional reform process in 2011, and Finland has crowdsourced several law reform processes to address its off-road traffic laws. The Finnish government allowed citizens to go on an online forum to discuss problems and possible resolutions regarding some off-road traffic laws.[citation needed] The crowdsourced information and resolutions would then be passed on to legislators to refer to when making decisions, allowing citizens to contribute to public policy in a more direct manner.[92][93] Palo Alto has crowdsourced feedback for its Comprehensive City Plan update in a process started in 2015.[94] The House of Representatives in Brazil has used crowdsourcing in policy reforms.[95]

NASA used crowdsourcing to analyze large sets of images. As part of the Open Government Initiative of the Obama Administration, the General Services Administration collected and amalgamated suggestions for improving federal websites.[95]

For part of the Obama and Trump Administrations, the We the People system collected signatures on petitions, which were entitled to an official response from the White House once a certain number had been reached. Several U.S. federal agencies ran inducement prize contests, including NASA and the Environmental Protection Agency.[96][95]

Language-related data

Crowdsourcing has been used extensively for gathering language-related data.

For dictionary work, crowdsourcing was applied over a hundred years ago by the Oxford English Dictionary editors, using paper and postage. It has also been used for collecting examples of proverbs on a specific topic (e.g. religious pluralism) for a printed journal.[97] Crowdsourcing language-related data online has proven very effective, and many dictionary compilation projects have used crowdsourcing. It is used particularly for specialist topics and languages that are not well documented, such as the Oromo language.[98] Software programs have been developed for crowdsourced dictionaries, such as WeSay.[99] A slightly different form of crowdsourcing for language data was the online creation of scientific and mathematical terminology for American Sign Language.[100]

In linguistics, crowdsourcing strategies have been applied to estimate word knowledge, vocabulary size, and word origin.[101] Implicit crowdsourcing on social media has also been used to approximate sociolinguistic data efficiently. Reddit conversations in various location-based subreddits were analyzed for the presence of grammatical forms unique to a regional dialect, which were then used to map the extent of the speaker population. The results could roughly approximate large-scale surveys on the subject without engaging in field interviews.[102]
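
As a rough illustration of the approach, the sketch below counts a single dialect construction ("needs + past participle", as in "the car needs washed") in hypothetical comments from location-based subreddits; the subreddit names, comments, and regular expression are invented for illustration, not taken from the cited study:

```python
import re

# Hypothetical comments grouped by location-based subreddit.
comments = {
    "r/pittsburgh": ["The car needs washed before winter.",
                     "The grass needs mowed."],
    "r/boston":     ["The car needs to be washed.",
                     "Wicked cold out today."],
}

# Matches "needs <verb>ed" but not "needs to be <verb>ed".
NEEDS_PP = re.compile(r"\bneeds (?!to\b)\w+ed\b", re.IGNORECASE)

def feature_rate(posts):
    """Fraction of comments containing the dialect construction."""
    hits = sum(bool(NEEDS_PP.search(p)) for p in posts)
    return hits / len(posts)

for sub, posts in comments.items():
    print(sub, feature_rate(posts))  # r/pittsburgh 1.0, r/boston 0.0
# Mapping these rates across many location subreddits approximates
# the geographic extent of the construction without field interviews.
```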

Proverb collection is also being done via crowdsourcing on the Web, most notably for the Pashto language of Afghanistan and Pakistan.[103][104][105] Crowdsourcing has been used extensively to collect high-quality gold standards for building automatic systems in natural language processing (e.g. named entity recognition, entity linking).[106]

In product design

LEGO allows users to submit new product designs while conducting requirements testing. Any user can propose a design, and other users can vote on it. Once a submission has received 10,000 votes, it is formally reviewed in stages and, if no impediments such as legal flaws are identified, goes into production. The creator receives royalties from the net income.[107]

In business

Homeowners can use Airbnb to list their accommodation or unused rooms, setting their own nightly, weekly, and monthly rates. The business, in turn, charges both guests and hosts a fee: guests pay a booking fee every time they book a room, usually spending between $9 and $15 in fees,[108] while the host pays a service fee on the amount due. The company has more than 1,500,000 listings in 34,000 cities in more than 190 countries.[citation needed]

In market research

Crowdsourcing is frequently used in market research as a way to gather insights and opinions from a large number of consumers.[109] Companies may create online surveys or focus groups that are open to the general public, allowing them to gather a diverse range of perspectives on their products or services. This can be especially useful for companies seeking to understand the needs and preferences of a particular market segment or to gather feedback on the effectiveness of their marketing efforts. The use of crowdsourcing in market research allows companies to quickly and efficiently gather a large amount of data and insights that can inform their business decisions.[110]

Other examples

  • Geography — Volunteered geographic information (VGI) is geographic information generated through crowdsourcing, as opposed to traditional methods of professional geographic information (PGI).[111] In describing the built environment, VGI has many advantages over PGI, primarily perceived currency,[112] accuracy,[113] and authority.[114] OpenStreetMap is an example of a crowdsourced mapping project.[29]
  • Engineering — Many companies are introducing crowdsourcing to grow their engineering capabilities, find solutions to unsolved technical challenges, and meet the need to adopt new technologies such as 3D printing and the Internet of Things (IoT).[citation needed]
  • Libraries, museums and archives — Newspaper text correction at the National Library of Australia was an early, influential example of crowdsourced text transcription in cultural heritage institutions.[115] The Steve Museum project provided a prototype for categorizing artworks.[116] Crowdsourcing is used in libraries for OCR correction on digitized texts, for tagging, and for funding, especially in the absence of financial and human means. Volunteers can contribute explicitly, with conscious effort, or implicitly, without knowing it, by turning the text of raw newspaper images into human-corrected digital form.[117]
  • Agriculture — Crowdsourced research also applies to the field of agriculture. Crowdsourcing can help farmers and experts identify different types of weeds[118] in the fields and provide assistance in removing them.
  • Cheating in bridge — Boye Brogeland initiated a crowdsourced investigation of cheating by top-level bridge players that found several players guilty, leading to their suspension.[119]
  • Software development — Open-source software and crowdsourced software development have been used extensively in this domain.
  • Healthcare — Research has emerged that outlines the use of crowdsourcing techniques in the public health domain.[120][121][122] The collective intelligence outcomes from crowdsourcing are being generated in three broad categories of public health care: health promotion,[121] health research,[123] and health maintenance.[124] Crowdsourcing also enables researchers to move from small homogeneous groups of participants to large heterogeneous groups,[125] beyond convenience samples such as students or highly educated people. The SESH group focuses on using crowdsourcing to improve health.

Methods

Internet and digital technologies have massively expanded the opportunities for crowdsourcing. However, the effect of user communication and platform presentation can have a major bearing on the success of an online crowdsourcing project.[18] The crowdsourced problem can range from huge tasks (such as finding alien life or mapping earthquake zones) to very small ones (such as identifying images). Some examples of successful crowdsourcing themes are problems that bug people, things that make people feel good about themselves, projects that tap into niche knowledge of proud experts, and subjects that people find sympathetic.[126]

Crowdsourcing can either take an explicit or an implicit route:

  • Explicit crowdsourcing lets users work together to evaluate, share, and build different specific tasks, while implicit crowdsourcing means that users solve a problem as a side effect of something else they are doing. With explicit crowdsourcing, users can evaluate particular items like books or webpages, or share by posting products or items. Users can also build artifacts by providing information and editing other people's work.[citation needed]
  • Implicit crowdsourcing can take two forms: standalone and piggyback. Standalone allows people to solve problems as a side effect of the task they are actually doing, whereas piggyback takes users' information from a third-party website to gather information.[127] This is also known as data donation.

In his 2013 book, Crowdsourcing, Daren C. Brabham puts forth a problem-based typology of crowdsourcing approaches:[128]

  • Knowledge discovery and management is used for information management problems where an organization mobilizes a crowd to find and assemble information. It is ideal for creating collective resources.
  • Distributed human intelligence tasking (HIT) is used for information management problems where an organization has a set of information in hand and mobilizes a crowd to process or analyze it. It is ideal for processing large data sets that computers cannot easily handle. Amazon Mechanical Turk uses this approach.
  • Broadcast search is used for ideation problems where an organization mobilizes a crowd to come up with a solution to a problem that has an objective, provable right answer. It is ideal for scientific problem-solving.
  • Peer-vetted creative production is used for ideation problems, where an organization mobilizes a crowd to come up with a solution to a problem which has an answer that is subjective or dependent on public support. It is ideal for design, aesthetic, or policy problems.

Ivo Blohm identifies four types of crowdsourcing platforms: microtasking, information pooling, broadcast search, and open collaboration. They differ in the diversity and aggregation of the contributions that are created: the diversity of information collected can be either homogeneous or heterogeneous, and the aggregation of information can be either selective or integrative.[definition needed][129] Some common categories of crowdsourcing that have been used effectively in the commercial world include crowdvoting, crowdsolving, crowdfunding, microwork, creative crowdsourcing, crowdsource workforce management, and inducement prize contests.[130]

Crowdvoting

Crowdvoting occurs when a website gathers a large group's opinions and judgments on a certain topic. Some crowdsourcing tools and platforms allow participants to rank each other's contributions, e.g. in answer to the question "What is one thing we can do to make Acme a great company?" One common method for ranking is "like" counting, where the contribution with the most "like" votes ranks first. This method is simple and easy to understand, but it privileges early contributions, which have more time to accumulate votes.[citation needed] In recent years, several crowdsourcing companies have begun to use pairwise comparisons backed by ranking algorithms. Ranking algorithms do not penalize late contributions.[citation needed] They also produce results quicker. Ranking algorithms have proven to be at least 10 times faster than manual stack ranking.[131] One drawback, however, is that ranking algorithms are more difficult to understand than vote counting.
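
The companies' actual ranking algorithms are not described here; the sketch below illustrates the general idea with a minimal Elo-style rating update over pairwise votes (the idea names and votes are invented). Because each comparison moves ratings immediately, a strong late entry can overtake earlier ones, unlike raw "like" counts:

```python
from collections import defaultdict

def elo_rank(pairwise_votes, k=32):
    """Rank contributions from pairwise crowd votes via Elo updates.

    pairwise_votes: iterable of (winner, loser) pairs, in arrival order.
    """
    rating = defaultdict(lambda: 1500.0)
    for winner, loser in pairwise_votes:
        # Expected probability that the winner would win, given ratings.
        expected = 1.0 / (1.0 + 10 ** ((rating[loser] - rating[winner]) / 400))
        rating[winner] += k * (1.0 - expected)
        rating[loser] -= k * (1.0 - expected)
    return sorted(rating.items(), key=lambda kv: -kv[1])

votes = [("idea_B", "idea_A"), ("idea_B", "idea_C"), ("idea_A", "idea_C"),
         ("idea_D", "idea_B")]  # late arrival idea_D beats the leader
print(elo_rank(votes))  # idea_D ranks first despite arriving last
```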

The Iowa Electronic Market is a prediction market that gathers crowds' views on politics and tries to ensure accuracy by having participants pay money to buy and sell contracts based on political outcomes.[132] Some of the most famous examples have made use of social media channels: Domino's Pizza, Coca-Cola, Heineken, and Sam Adams have crowdsourced a new pizza, bottle design, beer, and song, respectively.[133] The website Threadless selects the T-shirts it sells by having users submit designs and vote on the ones they like, which are then printed and made available for purchase.[17]

The California Report Card (CRC), a program jointly launched in January 2014 by the Center for Information Technology Research in the Interest of Society[134] and Lt. Governor Gavin Newsom, is an example of modern-day crowd voting. Participants access the CRC online and vote on six timely issues. Through principal component analysis, the users are then placed into an online "café" in which they can present their own political opinions and grade the suggestions of other participants. This system aims to effectively involve the greater public in relevant political discussions and highlight the specific topics with which people are most concerned.
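
As an illustration of the PCA step, the sketch below projects made-up six-issue grade vectors into two dimensions; participants who land close together graded similarly and could be seated in the same "café". The numbers are invented, and the CRC's actual grouping method may differ:

```python
import numpy as np

# Each row: one participant's grades on the six CRC issues (1-10).
# Values are made up for illustration.
votes = np.array([
    [9, 8, 2, 3, 9, 8],
    [8, 9, 1, 2, 8, 9],
    [2, 1, 9, 9, 2, 1],
    [3, 2, 8, 9, 1, 2],
], dtype=float)

# PCA via SVD on mean-centered data: keep the two components that
# capture most of the variance in how people graded the issues.
centered = votes - votes.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
coords = centered @ vt[:2].T

# The first two rows cluster together and the last two cluster
# together, suggesting two "cafes" of like-minded participants.
print(np.round(coords, 2))
```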

Crowdvoting's value in the movie industry was shown when in 2009 a crowd accurately predicted the success or failure of a movie based on its trailer,[135][136] a feat that was replicated in 2013 by Google.[137]

On Reddit, users collectively rate web content, discussions and comments as well as questions posed to persons of interest in "AMA" and AskScience online interviews.[cleanup needed]

In 2017, Project Fanchise purchased a team in the Indoor Football League and created the Salt Lake Screaming Eagles, a fan-run team. Using a mobile app, the fans voted on the day-to-day operations of the team, the mascot name, the signing of players, and even offensive play-calling during games.[138]

Crowdfunding

Crowdfunding is the process of funding projects by a multitude of people contributing a small amount to attain a certain monetary goal, typically via the Internet.[139] Crowdfunding has been used for both commercial and charitable purposes.[140] The crowdfunding model that has been around the longest is rewards-based crowdfunding, in which people can prepurchase products, buy experiences, or simply donate. While this funding may in some cases go towards helping a business, funders are not allowed to invest and become shareholders via rewards-based crowdfunding.[141]

Individuals, businesses, and entrepreneurs can showcase their businesses and projects by creating a profile, which typically includes a short video introducing their project, a list of rewards per donation, and illustrations through images.[citation needed] Funders make monetary contribution for numerous reasons:

  1. They connect to the greater purpose of the campaign, such as being a part of an entrepreneurial community and supporting an innovative idea or product.[142]
  2. They connect to a physical aspect of the campaign like rewards and gains from investment.[142]
  3. They connect to the creative display of the campaign's presentation.
  4. They want to see new products before the public.[142]

As of 2012, equity crowdfunding in the US was caught up in a refinement process for the regulations of the Securities and Exchange Commission, which had until 1 January 2013 to tweak the fundraising methods. The regulators were overwhelmed trying to regulate Dodd-Frank and all the other rules and regulations involving public companies and the way they traded. Advocates of regulation claimed that crowdfunding would open up the floodgates for fraud, called it the "wild west" of fundraising, and compared it to the 1980s days of penny stock "cold-call cowboys". The process allowed for up to $1 million to be raised without some of the regulations being involved. Under the then-current proposal, companies would have exemptions available and be able to raise capital from a larger pool of persons, with lower thresholds for investor criteria, whereas the old rules required that the person be an "accredited" investor. These people are often recruited from social networks, and the funds can be acquired through an equity purchase, loan, donation, or ordering. The amounts collected have become quite high, with some requests exceeding a million dollars, as with Trampoline Systems, which used crowdfunding to finance the commercialization of its new software.[citation needed]

Inducement prize contests

Web-based idea competitions or inducement prize contests often consist of generic ideas, cash prizes, and an Internet-based platform to facilitate easy idea generation and discussion. An example is IBM's 2006 "Innovation Jam", which was attended by over 140,000 international participants and yielded around 46,000 ideas.[143][144] Another example is the Netflix Prize in 2009, which asked people to come up with a recommendation algorithm more accurate than Netflix's existing one. The grand prize of US$1,000,000 went to a team that designed an algorithm that beat Netflix's own algorithm for predicting ratings by 10.06%.[citation needed]

Another example of competition-based crowdsourcing is the 2009 DARPA balloon experiment, in which DARPA placed 10 balloon markers across the United States and challenged teams to compete to be the first to report the locations of all the balloons. Collaboration was required to complete the challenge quickly, and in addition to the competitive motivation of the contest as a whole, the winning team (MIT, in less than nine hours) established its own "collaborapetitive" environment to generate participation.[145] A similar challenge was the Tag Challenge, funded by the US State Department, which required locating and photographing individuals in five cities in the US and Europe within 12 hours based only on a single photograph. The winning team managed to locate three of the suspects by mobilizing volunteers worldwide, using an incentive scheme similar to the one used in the balloon challenge.[146]
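
The MIT team reportedly used a recursive incentive scheme that split the $4,000 per balloon geometrically up the referral chain: the finder received $2,000, the person who recruited the finder $1,000, and so on. The sketch below works through why such payouts can never exceed the prize, whatever the chain length:

```python
def referral_payouts(prize=4000.0, chain_length=5):
    """Payouts for one balloon under a halving referral scheme:
    the finder gets prize/2, their recruiter prize/4, and so on.
    """
    return [prize / 2 ** (i + 1) for i in range(chain_length)]

payouts = referral_payouts()
print(payouts)       # [2000.0, 1000.0, 500.0, 250.0, 125.0]
print(sum(payouts))  # 3875.0 -- the geometric series sums below 4000,
                     # so arbitrarily long chains stay within the prize.
```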

Using open innovation platforms is an effective way to crowdsource people's thoughts and ideas for research and development. The company InnoCentive is a crowdsourcing platform for corporate research and development where difficult scientific problems are posted for crowds of solvers to discover the answer and win a cash prize that ranges from $10,000 to $100,000 per challenge.[17] InnoCentive, of Waltham, Massachusetts, and London, England, provides access to millions of scientific and technical experts from around the world. The company claims a success rate of 50% in providing successful solutions to previously unsolved scientific and technical problems. The X Prize Foundation creates and runs incentive competitions offering between $1 million and $30 million for solving challenges. Local Motors is another example of crowdsourcing, and it is a community of 20,000 automotive engineers, designers, and enthusiasts that compete to build off-road rally trucks.[147]

Implicit crowdsourcing

Implicit crowdsourcing is less obvious because users do not necessarily know they are contributing, yet can still be very effective in completing certain tasks.[citation needed] Rather than users actively participating in solving a problem or providing information, implicit crowdsourcing involves users doing another task entirely where a third party gains information for another topic based on the user's actions.[17]

A good example of implicit crowdsourcing is the ESP game, where users find words to describe Google images, which are then used as metadata for the images. Another popular use of implicit crowdsourcing is through reCAPTCHA, which asks people to solve CAPTCHAs to prove they are human, and then provides CAPTCHAs from old books that cannot be deciphered by computers, to digitize them for the web. Like many tasks solved using the Mechanical Turk, CAPTCHAs are simple for humans, but often very difficult for computers.[127]
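
reCAPTCHA's production logic is not public; the sketch below only illustrates the aggregation idea, accepting a transcription for an unknown word once enough users (who already passed the paired control word, and so are presumed human) agree on it:

```python
from collections import Counter

def accept_transcription(answers, min_votes=3):
    """Accept a crowd transcription for an unknown word on consensus.

    answers: transcriptions from users who already solved the paired
    control word correctly. Returns the agreed word, or None if there
    is no consensus yet. The threshold of 3 is illustrative.
    """
    if len(answers) < min_votes:
        return None
    word, count = Counter(a.strip().lower() for a in answers).most_common(1)[0]
    return word if count >= min_votes else None

print(accept_transcription(["morning", "morning", "Morning"]))  # 'morning'
print(accept_transcription(["cat", "cot"]))                     # None
```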

Piggyback crowdsourcing is seen most frequently on websites such as Google that data-mine users' search histories and browsing behavior to discover keywords for ads, spelling corrections, and synonyms. In this way, users unintentionally help to refine existing systems, such as Google Ads.[45]
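
As a toy illustration of this kind of piggyback mining, the sketch below extracts spelling-correction candidates from hypothetical query-reformulation logs; the sessions and threshold are invented, and real search engines use far more signals:

```python
from collections import Counter

# Hypothetical search sessions: consecutive queries by the same user.
# A quick reformulation hints that the second spelling corrects the first.
sessions = [
    ["crowdsorcing", "crowdsourcing"],
    ["crowdsorcing", "crowdsourcing"],
    ["crowdsourcing"],
    ["crowdsorcing", "crowd sourcing"],
]

def correction_candidates(sessions, min_support=2):
    """Mine likely spelling corrections from query reformulations."""
    pairs = Counter()
    for queries in sessions:
        for first, second in zip(queries, queries[1:]):
            if first != second:
                pairs[(first, second)] += 1
    return {a: b for (a, b), n in pairs.items() if n >= min_support}

print(correction_candidates(sessions))
# {'crowdsorcing': 'crowdsourcing'} -- users' own retyping, mined
# implicitly, becomes the "did you mean" suggestion.
```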

Other types

  • Creative crowdsourcing involves sourcing people for creative projects such as graphic design, crowdsourcing architecture, product design,[12] apparel design, movies,[148] writing, company naming,[149] illustration, etc.[150][151] While crowdsourcing competitions have been used for decades in some creative fields such as architecture, creative crowdsourcing has proliferated with the recent development of web-based platforms where clients can solicit a wide variety of creative work at lower cost than by traditional means.[citation needed]
  • Crowdshipping (crowd-shipping) is a peer-to-peer shipping service, usually conducted via an online platform or marketplace.[152] There are several methods that have been categorized as crowd-shipping:
    • Travelers heading in the direction of the buyer, and are willing to bring the package as part of their luggage for a reward.[153]
    • Truck drivers whose route lies along the buyer's location and who are willing to take extra items in their truck.[154]
    • Community-based platforms that connect international buyers and local forwarders, by allowing buyers to use forwarder's address as purchase destination, after which forwarders ship items further to the buyer.[155]
  • Crowdsolving is a collaborative and holistic way of solving a problem through many people, communities, groups, or resources. It is a type of crowdsourcing with focus on complex and intellectually demanding problems requiring considerable effort, and the quality or uniqueness of contribution.[156]
    • Problem–idea chains are a form of idea crowdsourcing and crowdsolving in which individuals are asked to submit ideas to solve problems, and then problems that can be solved with those ideas. The aim is to encourage individuals to find practical, well-thought-through solutions to problems.[157]
  • Macrowork tasks typically have these characteristics: they can be done independently, they take a fixed amount of time, and they require special skills. Macro-tasks could be part of specialized projects or could be part of a large, visible project where workers pitch in wherever they have the required skills. The key distinguishing factors are that macro-work requires specialized skills and typically takes longer, while microwork requires no specialized skills.
  • Microwork is a form of crowdsourcing in which users complete small tasks for which computers lack aptitude, for low amounts of money. Amazon's Mechanical Turk has created many different projects for users to participate in, where each task requires very little time and offers a very small payment.[15] When choosing tasks, since only certain users "win", users learn to submit later and pick less popular tasks to increase the likelihood of getting their work chosen.[158] An example of a Mechanical Turk project is when users searched satellite images for a boat in order to find Jim Gray, a missing computer scientist.[127]
  • Mobile crowdsourcing involves activities that take place on smartphones or mobile platforms that are frequently characterized by GPS technology.[159] This allows for real-time data gathering and gives projects greater reach and accessibility. However, mobile crowdsourcing can lead to an urban bias, and can have safety and privacy concerns.[160][161][162]
  • Simple projects are those that require more time and skill than micro- and macro-work. While an example of macro-work would be writing survey feedback, simple projects include activities like writing a basic line of code or programming a database, which require a larger time commitment and higher skill level. These projects are usually not found on sites like Amazon Mechanical Turk, and are instead posted on platforms like Upwork that call for specific expertise.[163]
  • Complex projects generally take the most time, have higher stakes, and call for people with very specific skills. These are generally "one-off" projects that are difficult to accomplish and can include projects such as designing a new product that a company hopes to patent. Such projects are considered to be complex because design is a meticulous process that requires a large amount of time to perfect, and people completing the project must have specialized training in design to effectively complete the project. These projects usually pay the highest, yet are rarely offered.[164]

Demographics of the crowd

The crowd is an umbrella term for the people who contribute to crowdsourcing efforts. Though it is sometimes difficult to gather data about the demographics of the crowd as a whole, several studies have examined various specific online platforms. Amazon Mechanical Turk has received a great deal of attention in particular. A study in 2008 by Ipeirotis found that users at that time were primarily American, young, female, and well-educated, with 40% earning more than $40,000 per year. In November 2009, Ross found a very different Mechanical Turk population, 36% of which was Indian. Two-thirds of Indian workers were male, and 66% had at least a bachelor's degree. Two-thirds had annual incomes less than $10,000, with 27% sometimes or always depending on income from Mechanical Turk to make ends meet.[165] More recent studies have found that U.S. Mechanical Turk workers are approximately 58% female, and nearly 67% of workers are in their 20s and 30s.[46][166][167][168] Close to 80% are White, and 9% are Black. MTurk workers are less likely to be married or have children than the general population: in the US population over 18, 45% are unmarried, while the proportion of unmarried workers on MTurk is around 57%. Additionally, about 55% of MTurk workers do not have any children, which is significantly higher than in the general population. Approximately 68% of U.S. workers are employed, compared to 60% in the general population. MTurk workers in the U.S. are also more likely to have a four-year college degree (35%) than the general population (27%). Politically, the U.S. MTurk sample skews liberal, with 46% Democrats, 28% Republicans, and 26% "other". MTurk workers are also less religious than the U.S. population, with 41% religious, 20% spiritual, 21% agnostic, and 16% atheist.

The demographics of Microworkers.com differ from those of Mechanical Turk: the US and India together account for only 25% of workers, and 197 countries are represented among users, with Indonesia (18%) and Bangladesh (17%) contributing the largest shares. However, 28% of employers are from the US.[169]

Another study of the demographics of the crowd at iStockphoto found a crowd that was largely white, middle- to upper-class, and highly educated, worked in so-called "white-collar jobs", and had high-speed Internet connections at home.[170] In a 30-day crowdsourcing diary study in Europe, the participants were predominantly highly educated women.[125]

Studies have also found that crowds are not simply collections of amateurs or hobbyists. Rather, crowds are often professionally trained in a discipline relevant to a given crowdsourcing task and sometimes hold advanced degrees and many years of experience in the profession.[170][171][172][173] Claiming that crowds are amateurs, rather than professionals, is both factually untrue and may lead to marginalization of crowd labor rights.[174]

Gregory Saxton et al. studied the role of community users, among other elements, in a content analysis of 103 crowdsourcing organizations. They developed a taxonomy of nine crowdsourcing models (intermediary model, citizen media production, collaborative software development, digital goods sales, product design, peer-to-peer social financing, consumer report model, knowledge base building model, and collaborative science project model) in which to categorize the roles of community users, such as researcher, engineer, programmer, journalist, or graphic designer, and the products and services developed.[175]

Motivations

Contributors

Many researchers suggest that both intrinsic and extrinsic motivations cause people to contribute to crowdsourced tasks and these factors influence different types of contributors.[93][170][171][173][176][177][178][179] For example, people employed in a full-time position rate human capital advancement as less important than part-time workers do, while women rate social contact as more important than men do.[177]

Intrinsic motivations are broken down into two categories: enjoyment-based and community-based motivations. Enjoyment-based motivations refer to motivations related to the fun and enjoyment contributors experience through their participation. These motivations include: skill variety, task identity, task autonomy, direct feedback from the job, and taking the job as a pastime.[citation needed] Community-based motivations refer to motivations related to community participation, and include community identification and social contact. In crowdsourced journalism, the motivation factors are intrinsic: the crowd is driven by a possibility to make social impact, contribute to social change, and help their peers.[176]

Extrinsic motivations are broken down into three categories: immediate payoffs, delayed payoffs, and social motivations. Immediate payoffs, through monetary payment, are the immediately received compensations given to those who complete tasks. Delayed payoffs are benefits that can be used to generate future advantages, such as training skills and being noticed by potential employers. Social motivations are the rewards of behaving pro-socially,[180] such as the altruistic motivations of online volunteers. Chandler and Kapelner found that US users of the Amazon Mechanical Turk were more likely to complete a task when told they were going to help researchers identify tumor cells, than when they were not told the purpose of their task. However, of those who completed the task, quality of output did not depend on the framing.[181]

Motivation in crowdsourcing is often a mix of intrinsic and extrinsic factors.[182] In a crowdsourced law-making project, the crowd was motivated by both: intrinsic motivations included fulfilling civic duty, affecting the law for sociotropic reasons, and deliberating with and learning from peers; extrinsic motivations included changing the law for financial gain or other benefits. Participation in crowdsourced policy-making was an act of grassroots advocacy, whether to pursue one's own interest or more altruistic goals, such as protecting nature.[183] Participants in online research studies report their motivation as both intrinsic enjoyment and monetary gain.[184][185][167]

Another form of social motivation is prestige or status. The International Children's Digital Library recruited volunteers to translate and review books. Because all translators receive public acknowledgment for their contributions, Kaufman and Schulz cite this as a reputation-based strategy to motivate individuals who want to be associated with institutions that have prestige. The Mechanical Turk uses reputation as a motivator in a different sense, as a form of quality control. Crowdworkers who frequently complete tasks in ways judged to be inadequate can be denied access to future tasks, whereas workers who pay close attention may be rewarded by gaining access to higher-paying tasks or being on an "Approved List" of workers. This system may incentivize higher-quality work.[186] However, this system only works when requesters reject bad work, which many do not.[187]

Despite the potential global reach of IT applications online, recent research illustrates that differences in location[which?] affect participation outcomes in IT-mediated crowds.[188]

Limitations and controversies

At least six major topics cover the limitations and controversies about crowdsourcing:

  1. Impact of crowdsourcing on product quality
  2. Entrepreneurs contribute less capital themselves
  3. Increased number of funded ideas
  4. The value and impact of the work received from the crowd
  5. The ethical implications of low wages paid to workers
  6. Trustworthiness and informed decision making

Impact of crowdsourcing on product quality

Crowdsourcing allows anyone to participate, allowing for many unqualified participants and resulting in large quantities of unusable contributions. Companies, or additional crowdworkers, then have to sort through these low-quality contributions. The task of sorting through crowdworkers' contributions, along with the necessary job of managing the crowd, requires companies to hire actual employees, thereby increasing management overhead.[189] Results can also be susceptible to targeted, malicious work efforts. Since crowdworkers completing microtasks are paid per task, a financial incentive often causes workers to complete tasks quickly rather than well.[46] Verifying responses is time-consuming, so employers often depend on having multiple workers complete the same task to correct errors. However, having each task completed multiple times increases time and monetary costs.[190] Some companies, like CloudResearch, control data quality by repeatedly vetting crowdworkers to ensure they are paying attention and providing high-quality work.[187]

Crowdsourcing quality is also affected by task design. Lukyanenko et al.[191] argue that the prevailing practice of modeling crowdsourcing data-collection tasks in terms of fixed classes (options) unnecessarily restricts quality. Their results demonstrate that information accuracy depends on the classes used to model domains, with participants providing more accurate information when classifying phenomena at a more general level (which is typically less useful to sponsor organizations, and hence less common).[clarification needed] Further, greater overall accuracy is expected when participants can provide free-form data compared to tasks in which they select from constrained choices. In behavioral science research, it is often recommended to include open-ended responses, in addition to other forms of attention checks, to assess data quality.[192][193]

Just as limiting, the crowd often lacks the skills or expertise needed to successfully accomplish the desired task. While this scenario does not affect "simple" tasks such as image labeling, it is particularly problematic for more complex tasks, such as engineering design or product validation. A comparison between expert evaluations of business models and those of an anonymous online crowd showed that the crowd could not evaluate business models to the same level as experts.[194] In these cases, it may be difficult or even impossible to find enough qualified people in the crowd, as their responses represent only a small fraction of the workers compared to consistent but incorrect crowd members.[195] However, if the task is of "intermediate" difficulty, estimating crowdworkers' skills and intentions and leveraging them to infer true responses works well,[196] albeit with an additional computation cost.[citation needed]
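
The cited approach estimates worker skill jointly with the answers; the sketch below is a heavily simplified stand-in (one round of majority voting, then an accuracy-weighted revote) rather than a full EM algorithm such as Dawid-Skene, and the workers and labels are invented:

```python
from collections import Counter, defaultdict

def weighted_consensus(labels):
    """Infer answers from redundant labels, weighting workers by skill.

    labels: dict mapping task -> {worker: answer}.
    Step 1: a plain majority vote gives provisional answers.
    Step 2: each worker's agreement rate with those answers becomes a
    weight, and a weighted revote yields the final answers.
    """
    provisional = {task: Counter(ans.values()).most_common(1)[0][0]
                   for task, ans in labels.items()}
    agreement = defaultdict(list)
    for task, answers in labels.items():
        for worker, answer in answers.items():
            agreement[worker].append(answer == provisional[task])
    skill = {w: sum(hits) / len(hits) for w, hits in agreement.items()}

    final = {}
    for task, answers in labels.items():
        score = defaultdict(float)
        for worker, answer in answers.items():
            score[answer] += skill[worker]
        final[task] = max(score, key=score.get)
    return final, skill

labels = {
    "img1": {"ann": "cat", "bob": "cat", "spam": "dog"},
    "img2": {"ann": "dog", "bob": "dog", "spam": "cat"},
    "img3": {"ann": "cat", "spam": "dog"},  # a two-way tie
}
final, skill = weighted_consensus(labels)
print(final)  # img3's tie resolves toward the higher-skill worker: 'cat'
print(skill)  # 'spam', who disagreed everywhere, ends up with weight 0.0
```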

Crowdworkers are a nonrandom sample of the population. Many researchers use crowdsourcing to conduct studies quickly and cheaply, with larger sample sizes than would otherwise be achievable. However, because of limited Internet access, participation in less-developed countries is relatively low. Participation in highly developed countries is similarly low, largely because the low pay is not a strong motivation for most users in those countries. These factors bias the participant pool toward users in countries of medium development, as measured by the Human Development Index.[197] Participants in these countries sometimes masquerade as U.S. participants to gain access to certain tasks. This led to the "bot scare" on Amazon Mechanical Turk in 2018, when researchers thought bots were completing research surveys because of the lower quality of responses originating from medium-developed countries.[193][198]

The likelihood that a crowdsourced project fails due to lack of monetary motivation or too few participants increases over the course of the project. Tasks that are not completed quickly may be forgotten, buried by filters and search procedures, which results in a long-tail, power-law distribution of completion times.[199] Additionally, low-paying research studies online have higher rates of attrition, with participants not completing a study once started.[47] Even when tasks are completed, crowdsourcing does not always produce quality results: when Facebook began its localization program in 2008, it encountered criticism for the low quality of its crowdsourced translations.[200] Another problem with crowdsourced products is the lack of interaction between the crowd and the client. Usually little information is known about the final product, and workers rarely interact with the final client in the process. This can decrease product quality, because client interaction is considered a vital part of the design process.[201]
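
To make the long-tail claim concrete (with a hypothetical exponent, not one estimated in the cited study), a power-law model for the fraction of tasks still incomplete after time $t$ is

$$\Pr(T > t) \propto t^{-\alpha}, \qquad \alpha > 0,$$

so with, say, $\alpha = 1$, doubling the waiting time only halves the share of unfinished tasks, whereas a short-tailed (e.g., exponential) model would predict far fewer long-lingering stragglers.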

An additional cause of the decrease in product quality that can result from crowdsourcing is the lack of collaboration tools. In a typical workplace, coworkers are organized in such a way that they can work together and build upon each other's knowledge and ideas. Furthermore, the company often provides employees with the necessary information, procedures, and tools to fulfill their responsibilities. However, in crowdsourcing, crowdworkers are left to depend on their own knowledge and means to complete tasks.[189]

A crowdsourced project is usually expected to be unbiased because it incorporates a large population of participants with diverse backgrounds. However, much crowdsourced work is done by people who are paid or who directly benefit from the outcome (e.g., much of the open-source work on Linux). In many other cases, the end product is the outcome of a single person's endeavor: one contributor creates the majority of the product, while the crowd participates only in minor details.[202]

Entrepreneurs contribute less capital themselves

The first component needed to turn an idea into reality is capital. Depending on the scope and complexity of the crowdsourced project, the amount of necessary capital can range from a few thousand dollars to hundreds of thousands, if not more. The capital-raising process can take days or months, depending on variables such as the entrepreneur's network and the amount of initial self-generated capital.[citation needed]

The crowdsourcing process allows entrepreneurs to access a wide range of investors who can take different stakes in the project.[203] In effect, crowdsourcing simplifies the capital-raising process and allows entrepreneurs to spend more time on the project itself and on reaching milestones, rather than on getting it started. Overall, the simplified access to capital can save time in starting projects and potentially increase their efficiency.[citation needed]

Others argue that easier access to capital through a large number of smaller investors can hurt the project and its creators. With a simplified capital-raising process involving many investors with smaller stakes, investors can be more risk-seeking because they take on only an investment size with which they are comfortable.[203] Entrepreneurs, in turn, lose the experience of convincing investors who are wary of potential risks, because they no longer depend on any single investor for the survival of their project: instead of being forced to assess risks and persuade large institutional investors that the project can succeed, they can simply replace wary investors with others willing to take on the risk.

Some translation companies and consumers of translation tools claim to use crowdsourcing as a means of drastically cutting costs, instead of hiring professional translators. This practice has been systematically denounced by IAPTI and other translator organizations.[204]

Increased number of funded ideas

Both the raw number of ideas that get funded and the quality of those ideas are major points of controversy surrounding crowdsourcing.

Proponents argue that crowdsourcing is beneficial because it allows the formation of startups around niche ideas that would not survive venture capital or angel investment, oftentimes the primary sources of startup funding. Many ideas are scrapped in their infancy due to insufficient support and lack of capital, but crowdsourcing allows such ideas to get started if an entrepreneur can find a community to take interest in the project.[205]

Crowdsourcing allows those who would benefit from a project to fund it and become part of it, which is one way for small niche ideas to get started.[206] However, as the number of projects grows, the number of failures also increases. Crowdsourcing assists the development of niche and high-risk projects because a select few perceive a need for the product. With high risk and small target markets, the pool of crowdsourced projects faces a greater possible loss of capital, lower returns, and lower levels of success.[207]

Concerns

Because crowdworkers are considered independent contractors rather than employees, they are not guaranteed minimum wage. In practice, workers using Amazon Mechanical Turk generally earn less than minimum wage. In 2009, it was reported that Mechanical Turk users in the United States earned an average of $2.30 per hour for tasks, while users in India earned an average of $1.58 per hour, which is below minimum wage in the United States (but not in India).[165][208] In 2018, a survey of 2,676 Amazon Mechanical Turk workers doing 3.8 million tasks found that the median hourly wage was approximately $2 per hour, and only 4% of workers earned more than the federal minimum wage of $7.25 per hour.[209] Some researchers who have considered using Mechanical Turk to recruit participants for research studies have argued that the wage conditions might be unethical.[47][210]

However, according to other research, workers on Amazon Mechanical Turk do not feel exploited and are ready to participate in crowdsourcing activities in the future.[211] A more recent study using stratified random sampling to reach a representative sample of Mechanical Turk workers found that the U.S. MTurk population is financially similar to the general population.[167] Workers tend to participate in tasks as a form of paid leisure and to supplement their primary income, and only 7% view it as a full-time job. Overall, workers rated MTurk as less stressful than other jobs. Workers also earn more than previously reported, about $6.50 per hour. They see MTurk as part of the solution to their financial situation, report rare upsetting experiences, and perceive requesters on MTurk as fairer and more honest than employers outside the platform.[167]
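
The hourly figures above are effective wages, obtained by dividing task rewards by the time actually spent; a toy version of that bookkeeping (the task log and numbers are illustrative, not the cited studies' data):

```python
from statistics import median

# Hypothetical task log: (reward_usd, minutes_worked) per completed task.
task_log = [(0.05, 2.0), (0.25, 6.5), (0.10, 1.5), (1.00, 20.0)]

hourly_rates = [reward / (minutes / 60) for reward, minutes in task_log]
print(f"median effective wage: ${median(hourly_rates):.2f}/hour")

FEDERAL_MINIMUM = 7.25  # U.S. federal minimum wage, $/hour
share_above = sum(r >= FEDERAL_MINIMUM for r in hourly_rates) / len(hourly_rates)
print(f"share of tasks paying at least minimum wage: {share_above:.0%}")
```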

When Facebook began its localization program in 2008, it received criticism for using free labor in crowdsourcing the translation of site guidelines.[200]

Typically, no written contracts, nondisclosure agreements, or employee agreements are made with crowdworkers. For users of Amazon Mechanical Turk, this means that employers decide whether users' work is acceptable and reserve the right to withhold pay if it does not meet their standards.[212] Critics say that crowdsourcing arrangements exploit individuals in the crowd, and a call has been made for crowds to organize for their labor rights.[213][174][214]

Collaboration between crowd members can also be difficult or even discouraged, especially in the context of competitive crowdsourcing. The crowdsourcing site InnoCentive allows organizations to solicit solutions to scientific and technological problems; only 10.6% of respondents there reported working in a team on their submission.[171] Amazon Mechanical Turk workers collaborated with academics to create WeAreDynamo.org, a platform that let them organize and run campaigns to better their work situation, but the site is no longer running.[215] Another platform run by Amazon Mechanical Turk workers and academics, Turkopticon, continues to operate and provides worker reviews of Amazon Mechanical Turk employers.[216]

America Online settled the case Hallissey et al. v. America Online, Inc. for $15 million in 2009, after unpaid moderators sued to be paid the minimum wage as employees under the U.S. Fair Labor Standards Act.

See also

References

  1. ^ Schenk, Eric; Guittard, Claude (1 January 2009). "Crowdsourcing What can be Outsourced to the Crowd and Why". Center for Direct Scientific Communication. Retrieved 1 October 2018 – via HAL. {{cite journal}}: Cite journal requires |journal= (help)
  2. ^ Hirth, Matthias; Hoßfeld, Tobias; Tran-Gia, Phuoc (2011). (PDF). 2011 Fifth International Conference on Innovative Mobile and Internet Services in Ubiquitous Computing. pp. 322–329. doi:10.1109/IMIS.2011.89. ISBN 978-1-61284-733-7. S2CID 12955095. Archived from the original (PDF) on 22 November 2015. Retrieved 5 September 2015.
  3. ^ a b Estellés-Arolas, Enrique; González-Ladrón-de-Guevara, Fernando (2012), "Towards an Integrated Crowdsourcing Definition" (PDF), Journal of Information Science, 38 (2): 189–200, doi:10.1177/0165551512437638, hdl:10251/56904, S2CID 18535678
  4. ^ Brabham, D. C. (2013). Crowdsourcing. Cambridge, Massachusetts; London, England: The MIT Press.
  5. ^ Brabham, D. C. (2008). "Crowdsourcing as a Model for Problem Solving an Introduction and Cases". Convergence: The International Journal of Research into New Media Technologies. 14 (1): 75–90. CiteSeerX 10.1.1.175.1623. doi:10.1177/1354856507084420. S2CID 145310730.
  6. ^ Prpić, J., & Shukla, P. (2016). Crowd Science: Measurements, Models, and Methods. In Proceedings of the 49th Annual Hawaii International Conference on System Sciences, Kauai, Hawaii: IEEE Computer Society
  7. ^ Buettner, Ricardo (2015). A Systematic Literature Review of Crowdsourcing Research from a Human Resource Management Perspective. 48th Annual Hawaii International Conference on System Sciences. Kauai, Hawaii: IEEE. pp. 4609–4618. doi:10.13140/2.1.2061.1845. ISBN 978-1-4799-7367-5.
  8. ^ a b Prpić, John; Taeihagh, Araz; Melton, James (September 2015). "The Fundamentals of Policy Crowdsourcing". Policy & Internet. 7 (3): 340–361. arXiv:1802.04143. doi:10.1002/poi3.102. S2CID 3626608.
  9. ^ Afuah, A.; Tucci, C. L. (2012). "Crowdsourcing as a Solution to Distant Search". Academy of Management Review. 37 (3): 355–375. doi:10.5465/amr.2010.0146.
  10. ^ de Vreede, T., Nguyen, C., de Vreede, G. J., Boughzala, I., Oh, O., & Reiter-Palmon, R. (2013). A Theoretical Model of User Engagement in Crowdsourcing. In Collaboration and Technology (pp. 94-109). Springer Berlin Heidelberg
  11. ^ Sarin, Supheakmungkol; Pipatsrisawat, Knot; Pham, Khiêm; Batra, Anurag; Valente, Luis (2019). "Crowdsource by Google: A Platform for Collecting Inclusive and Representative Machine Learning Data" (PDF). AAAI Hcomp 2019.
  12. ^ a b Liu, Wei; Moultrie, James; Ye, Songhe (4 May 2019). "The Customer-Dominated Innovation Process: Involving Customers as Designers and Decision-Makers in Developing New Product". The Design Journal. 22 (3): 299–324. doi:10.1080/14606925.2019.1592324. ISSN 1460-6925. S2CID 145931864.
  13. ^ Schlagwein, Daniel; Bjørn-Andersen, Niels (2014), "Organizational Learning with Crowdsourcing: The Revelatory Case of LEGO" (PDF), Journal of the Association for Information Systems, 15 (11): 754–778, doi:10.17705/1jais.00380
  14. ^ Taeihagh, Araz (19 June 2017). "Crowdsourcing, Sharing Economies, and Development". Journal of Developing Societies. 33 (2): 0169796X1771007. arXiv:1707.06603. doi:10.1177/0169796x17710072. S2CID 32008949.
  15. ^ a b c Howe, Jeff (2006). "The Rise of Crowdsourcing". Wired.
  16. ^ Howe, Jeff (2 June 2006). "Crowdsourcing: A Definition". Crowdsourcing Blog. Retrieved 2 January 2013.
  17. ^ a b c d e Brabham, Daren (2008), (PDF), Convergence: The International Journal of Research into New Media Technologies, 14 (1): 75–90, CiteSeerX 10.1.1.175.1623, doi:10.1177/1354856507084420, S2CID 145310730, archived from the original (PDF) on 2 August 2012
  18. ^ a b Guth, Kristen L.; Brabham, Daren C. (4 August 2017). "Finding the diamond in the rough: Exploring communication and platform in crowdsourcing performance". Communication Monographs. 84 (4): 510–533. doi:10.1080/03637751.2017.1359748. ISSN 0363-7751. S2CID 54045924.
  19. ^ a b c d e f g h . Crowdsourcing.org. 18 March 2012. Archived from the original on 3 July 2015. Retrieved 2 July 2015.
  20. ^ Hern, Chester G.(2002). Tracks in the Sea, p. 123 & 246. McGraw Hill. ISBN 0-07-136826-4.
  21. ^ "Smithsonian Crowdsourcing Since 1849". Smithsonian Institution Archives. 14 April 2011. Retrieved 24 August 2018.
  22. ^ Clark, Catherine E. (25 April 1970). "'C'était Paris en 1970'". Études Photographiques (31). Retrieved 2 July 2015.
  23. ^ Axelrod R. (1980), "'Effective choice in the Prisoner's Dilemma'", Journal of Conflict Resolution, 24 (1): 3−25, doi:10.1177/002200278002400101, S2CID 143112198
  24. ^ . Onlinevolunteering.org. Archived from the original on 2 July 2015. Retrieved 2 July 2015.
  25. ^ "Wired 14.06: The Rise of Crowdsourcing". Archive.wired.com. 4 January 2009. Retrieved 2 July 2015.
  26. ^ Lih, Andrew (2009). The Wikipedia revolution: how a bunch of nobodies created the world's greatest encyclopedia (1st ed.). New York: Hyperion. ISBN 978-1401303716.
  27. ^ Lakhani KR, Garvin DA, Lonstein E (January 2010). "TopCoder (A): Developing Software through Crowdsourcing". Harvard Business School Case: 610–032.
  28. ^ Phadnisi, Shilpa (21 October 2016). "Appirio's TopCoder too is a big catch for Wipro". The Times of India. Retrieved 30 April 2018.
  29. ^ a b "For The Love Of Open Mapping Data". TechCrunch. 9 August 2014. Retrieved 23 July 2019.
  30. ^ a b . Archived from the original on 29 November 2014.[better source needed]
  31. ^ "Amazon Mechanical Turk". www.mturk.com. Retrieved 25 November 2022.
  32. ^ Garrigos-Simon, Fernando J.; Gil-Pechuán, Ignacio; Estelles-Miguel, Sofia (2015). Advances in Crowdsourcing. Springer. ISBN 9783319183411.
  33. ^ "Antoine-Jean-Baptiste-Robert Auget, Baron de Montyon". New Advent. Retrieved 25 February 2012.
  34. ^ "It Was All About Alkali". Chemistry Chronicles. Retrieved 25 February 2012.
  35. ^ "Nicolas Appert". John Blamire. Retrieved 25 February 2012.
  36. ^ "9 Examples of Crowdsourcing, Before 'Crowdsourcing' Existed". MemeBurn. 15 September 2011. Retrieved 25 February 2012.
  37. ^ Pande, Shamni (25 May 2013). "The People Know Best". Business Today. India: Living Media India Limited.
  38. ^ Noveck, Beth Simone (2009), Wiki Government: How Technology Can Make Government Better, Democracy Stronger, and Citizens More Powerful, Brookings Institution Press
  39. ^ Sarasua, Cristina; Simperl, Elena; Noy, Natalya F. (2012), "Crowdsourcing Ontology Alignment with Microtasks" (PDF), Institute AIFB. Karlsruhe Institute of Technology: 2
  40. ^ Hollow, Matthew (20 April 2013). "Crowdfunding and Civic Society in Europe: A Profitable Partnership?". Open Citizenship. Retrieved 29 April 2013.
  41. ^ , U.S. Department of Transportation, archived from the original on 7 January 2009
  42. ^ Peer-to-Patent Community Patent Review Project, Peer-to-Patent Community Patent Review Project
  43. ^ Callison-Burch, C.; Dredze, M. (2010), (PDF), Human Language Technologies Conference: 1–12, archived from the original (PDF) on 2 August 2012, retrieved 28 February 2012
  44. ^ McGraw, I.; Seneff, S. (2011), "Growing a Spoken Language Interface on Amazon Mechanical Turk" (PDF), Interspeech: 3057–3060, doi:10.21437/Interspeech.2011-765
  45. ^ a b Kittur, A.; Chi, E.H.; Sun, B. (2008), "Crowdsourcing user studies with Mechanical Turk" (PDF), Chi 2008
  46. ^ a b c d Litman, Leib; Robinson, Jonathan (2020). Conducting Online Research on Amazon Mechanical Turk and Beyond. SAGE Publications. ISBN 978-1506391137.
  47. ^ a b c Mason, W.; Suri, S. (2010), "Conducting Behavioral Research on Amazon's Mechanical Turk", Behavior Research Methods, SSRN 1691163
  48. ^ Koblin, A. (2008), "The sheep market", Creativity and Cognition: 451–452, doi:10.1145/1640233.1640348, ISBN 9781605588650, S2CID 20609292
  49. ^ "explodingdog 2015". Explodingdog.com. Retrieved 2 July 2015.
  50. ^ DeVun, Leah (19 November 2009). . Wired News. Archived from the original on 24 October 2012. Retrieved 26 February 2012.
  51. ^ Linver, D. (2010), , archived from the original on 14 July 2014, retrieved 28 February 2012
  52. ^ . INRIX.com. 13 September 2014. Archived from the original on 12 October 2014. Retrieved 2 July 2015.
  53. ^ Vergano, Dan (30 August 2014). "1833 Meteor Storm Started Citizen Science". National Geographic. StarStruck. Retrieved 18 September 2014.
  54. ^ "Gateway to Astronaut Photography of Earth". NASA.
  55. ^ McLaughlin, Elliot. "Image Overload: Help us sort it all out, NASA requests". Cnn.com. CNN. Retrieved 18 September 2014.
  56. ^ Liu, Huiying; Xie, Qian Wen; Lou, Vivian W. Q. (1 April 2019). "Everyday social interactions and intra-individual variability in affect: A systematic review and meta-analysis of ecological momentary assessment studies". Motivation and Emotion. 43 (2): 339–353. doi:10.1007/s11031-018-9735-x. ISSN 1573-6644. S2CID 254827087.
  57. ^ Luong, Raymond; Lomanowska, Anna M. (2021). "Evaluating Reddit as a Crowdsourcing Platform for Psychology Research Projects". Teaching of Psychology. 49 (4): 329–337. doi:10.1177/00986283211020739. ISSN 0098-6283. S2CID 236414676.
  58. ^ Brown, Joshua K.; Hohman, Zachary P. (2022). "Extreme party animals: Effects of political identification and ideological extremity". Journal of Applied Social Psychology. 52 (5): 351–362. doi:10.1111/jasp.12863. ISSN 0021-9029. S2CID 247077069.
  59. ^ Vaterlaus, J. Mitchell; Patten, Emily V.; Spruance, Lori A. (26 May 2022). "#Alonetogether: An Exploratory Study of Social Media Use at the Beginning of the COVID-19 Pandemic". The Journal of Social Media in Society. 11 (1): 27–45. ISSN 2325-503X.
  60. ^ Després, Jacques; Hadjsaid, Nouredine; Criqui, Patrick; Noirot, Isabelle (1 February 2015). "Modelling the impacts of variable renewable sources on the power sector: reconsidering the typology of energy modelling tools". Energy. 80: 486–495. doi:10.1016/j.energy.2014.12.005. ISSN 0360-5442.
  61. ^ "OpenEI — Energy Information, Data, and other Resources". OpenEI. Retrieved 26 September 2016.
  62. ^ Garvin, Peggy (12 December 2009). "New Gateway: Open Energy Info". SLA Government Information Division. Dayton, Ohio, USA. Retrieved 26 September 2016.[permanent dead link]
  63. ^ Brodt-Giles, Debbie (2012). (PDF). Golden, Colorado, USA: National Renewable Energy Laboratory (NREL). Archived from the original (PDF) on 9 October 2016. Retrieved 24 September 2016.
  64. ^ Davis, Chris; Chmieliauskas, Alfredas; Dijkema, Gerard; Nikolic, Igor. "Enipedia". Delft, The Netherlands: Energy and Industry group, Faculty of Technology, Policy and Management, TU Delft. Archived from the original on 10 June 2014. Retrieved 7 October 2016.{{cite web}}: CS1 maint: unfit URL (link)
  65. ^ Davis, Chris (2012). Making sense of open data: from raw data to actionable insight — PhD thesis. Delft, The Netherlands: Delft University of Technology. Retrieved 2 October 2018.Chapter 9 discusses in depth the initial development of Enipedia.
  66. ^ "What Is the Four-Generation Program?". The Church of Jesus Christ of Latter-day Saints. Retrieved 30 January 2012.
  67. ^ King, Turi E.; Jobling, Mark A. (2009). "What's in a name? Y chromosomes, surnames and the genetic genealogy revolution". Trends in Genetics. 25 (8): 351–60. doi:10.1016/j.tig.2009.06.003. hdl:2381/8106. PMID 19665817. The International Society of Genetic Genealogy (www.isogg.org) advocates the use of genetics as a tool for genealogical research, and provides a support network for genetic genealogists. It hosts the ISOGG Y-haplogroup tree, which has the virtue of being regularly updated.
  68. ^ Mendez, Fernando; et al. (28 February 2013). "An African American Paternal Lineage Adds an Extremely Ancient Root to the Human Y Chromosome Phylogenetic Tree". The American Journal of Human Genetics. 92 (3): 454–459. doi:10.1016/j.ajhg.2013.02.002. PMC 3591855. PMID 23453668.
  69. ^ Wells, Spencer (2013). . Southern California Genealogical Society (SCGS). Archived from the original on 10 July 2013. Retrieved 10 July 2013.
  70. ^ "History of the Christmas Bird Count | Audubon". Birds.audubon.org. 22 January 2015. Retrieved 2 July 2015.
  71. ^ . Audubon. 5 October 2017. Archived from the original on 24 August 2014.
  72. ^ (PDF). iscram2015.uia.no. Archived from the original (PDF) on 17 October 2016. Retrieved 14 October 2016.
  73. ^ Aitamurto, Tanja (2015). "Motivation Factors in Crowdsourced Journalism: Social Impact, Social Change and Peer-Learning". International Journal of Communication. 9: 3523–3543.
  74. ^ Aitamurto, Tanja (2016). "Crowdsourcing as a Knowledge-Search Method in Digital Journalism: Ruptured Ideals and Blended Responsibility". Digital Journalism. 4 (2): 280–297. doi:10.1080/21670811.2015.1034807. S2CID 156243124.
  75. ^ Aitamurto, Tanja (2013). "Balancing between open and closed: co-creation in magazine journalism". Digital Journalism. 1 (2): 229–251. doi:10.1080/21670811.2012.750150. S2CID 62882093.
  76. ^ "Algorithm Watch". Algorithm Watch. 2022. Retrieved 18 May 2022.
  77. ^ "Overview in English". DataSkop. 2022. Retrieved 18 May 2022.
  78. ^ "FAQs". Mozilla Rally. Retrieved 14 March 2023. Mozilla Rally is currently available to US residents who are age 19 and older
  79. ^ "It's your data. Use it for change". Mozilla Rally. Retrieved 14 March 2023.
  80. ^ Angus, Daniel (16 February 2022). "A data economy: the case for doing and knowing more about algorithms". Crikey. Retrieved 24 March 2022.
  81. ^ Burgess, Jean; Angus, Daniel; Carah, Nicholas; Andrejevic, Mark; Hawker, Kiah; Lewis, Kelly; Obeid, Abdul; Smith, Adam; Tan, Jane; Fordyce, Robbie; Trott, Verity (8 November 2021). "Critical simulation as hybrid digital method for exploring the data operations and vernacular cultures of visual social media platforms". SocArXiv. doi:10.31235/osf.io/2cwsu. S2CID 243837581.
  82. ^ The Markup (2022). "The Citizen Browser Project—Auditing the Algorithms of Disinformation". The Markup. Retrieved 18 May 2022.
  83. ^ Smith, Graham; Richards, Robert C.; Gastil, John (12 May 2015). "The Potential of Participedia as a Crowdsourcing Tool for Comparative Analysis of Democratic Innovations" (PDF). Policy & Internet. 7 (2): 243–262. doi:10.1002/poi3.93. ISSN 1944-2866.
  84. ^ Moon, M. Jae (2018). "Evolution of co-production in the information age: crowdsourcing as a model of web-based co-production in Korea". Policy and Society. 37 (3): 294–309. doi:10.1080/14494035.2017.1376475. ISSN 1449-4035. S2CID 158440300.
  85. ^ Taeihagh, Araz (8 November 2017). "Crowdsourcing: a new tool for policy-making?". Policy Sciences. 50 (4): 629–647. arXiv:1802.03113. doi:10.1007/s11077-017-9303-3. ISSN 0032-2687. S2CID 27696037.
  86. ^ Diamond, Larry; Whittington, Zak (2009). "Social Media". In Welzel, Christian; Haerpfer, Christian W.; Bernhagen, Patrick; Inglehart, Ronald F. (eds.). Democratization (2 ed.). Oxford: Oxford University Press (published 2018). p. 256. ISBN 9780198732280. Retrieved 4 March 2021. Another way that social media can contribute to democratization is by 'crowdsourcing' information. This elicits the knowledge and wisdom of the 'crowd' [...].
  87. ^ Aitamurto, Tanja (2012). Crowdsourcing for Democracy: New Era In Policy–Making. Committee for the Future, Parliament of Finland. pp. 10–30. ISBN 978-951-53-3459-6.
  88. ^ Prpić, J.; Taeihagh, A.; Melton, J. (2014). (PDF). Humancomputation.com. Archived from the original (PDF) on 24 June 2015. Retrieved 2 July 2015.
  89. ^ Prpić, J.; Taeihagh, A.; Melton, J. (2014). "A Framework for Policy Crowdsourcing. Oxford Internet Institute, University of Oxford - IPP 2014 - Crowdsourcing for Politics and Policy" (PDF). Ipp.oxii.ox.ac.uk. Retrieved 2 October 2018.
  90. ^ Prpić, J.; Taeihagh, A.; Melton, J. (2014). "Experiments on Crowdsourcing Policy Assessment. Oxford Internet Institute, University of Oxford - IPP 2014 - Crowdsourcing for Politics and Policy" (PDF). Ipp.oii.ox.ac.uk. Retrieved 2 July 2015.
  91. ^ Thapa, B.; Niehaves, B.; Seidel, C.; Plattfaut, R. (2015). "Citizen involvement in public sector innovation: Government and citizen perspectives". Information Polity. 20 (1): 3–17. doi:10.3233/IP-150351.
  92. ^ Aitamurto and Landemore (4 February 2015). "Five design principles for crowdsourced policymaking: Assessing the case of crowdsourced off-road traffic law reform in Finland". Journal of Social Media for Organizations (1): 1–19.
  93. ^ a b Aitamurto, Landemore, Galli (2016). "Unmasking the Crowd: Participants' Motivation Factors, Profile and Expectations for Participation in Crowdsourced Policymaking". Information, Communication & Society. 20 (8): 1239–1260. doi:10.1080/1369118x.2016.1228993. S2CID 151989757 – via Routledge.{{cite journal}}: CS1 maint: multiple names: authors list (link)
  94. ^ Aitamurto, Chen, Cherif, Galli and Santana (2016). "Making Sense of Crowdsourced Civic Data with Big Data Tools". ACM Digital Archive: Academic Mindtrek. doi:10.1145/2994310.2994366. S2CID 16855773 – via ACM Digital Archive.{{cite journal}}: CS1 maint: multiple names: authors list (link)
  95. ^ a b c Aitamurto, Tanja (31 January 2015). Crowdsourcing for Democracy: New Era in Policymaking. Committee for the Future, Parliament of Finland. ISBN 978-951-53-3459-6.
  96. ^ "Home". challenge.gov.
  97. ^ Stan Nussbaum. 2003. Proverbial perspectives on pluralism. Connections: the journal of the WEA Missions Committee October, pp. 30, 31.
  98. ^ "Oromo dictionary project". OromoDictionary.com. Retrieved 3 February 2014.
  99. ^ Albright, Eric; Hatton, John (2007). Chapter 10. WeSay, a Tool for Engaging Native Speakers in Dictionary Building. hdl:10125/1368. ISBN 978-0-8248-3309-1.
  100. ^ "Developing ASL vocabulary for science and math". Washington.edu. 7 December 2012. Retrieved 3 February 2014.
  101. ^ Keuleers; et al. (February 2015). "Word knowledge in the crowd: Measuring vocabulary size and word prevalence in a massive online experiment". Quarterly Journal of Experimental Psychology. 68 (8): 1665–1692. doi:10.1080/17470218.2015.1022560. PMID 25715025. S2CID 4894686.
  102. ^ a b Bill, Jeremiah; Gong, He; Hamilton, Brooke; Hawthorn, Henry; et al. "The extension of (positive) anymore". Google Docs. Retrieved 27 September 2020.
  103. ^ . AfghanProverbs.com. Archived from the original on 4 February 2014. Retrieved 3 February 2014.
  104. ^ "Comparing methods of collecting proverbs" (PDF). gial.edu.
  105. ^ Edward Zellem. 2014. Mataluna: 151 Afghan Pashto Proverbs. Tampa, Florida: Culture Direct.
  106. ^ Zhai, Haijun; Lingren, Todd; Deleger, Louise; Li, Qi; Kaiser, Megan; Stoutenborough, Laura; Solti, Imre (2013). "Web 2.0-based crowdsourcing for high-quality gold standard development in clinical Natural Language Processing". Journal of Medical Internet Research. 15 (4): e73. doi:10.2196/jmir.2426. PMC 3636329. PMID 23548263.
  107. ^ Martin, Fred; Resnick, Mitchel (1993), "LEGO/Logo and Electronic Bricks: Creating a Scienceland for Children", Advanced Educational Technologies for Mathematics and Science, Berlin, Heidelberg: Springer Berlin Heidelberg, pp. 61–89, doi:10.1007/978-3-662-02938-1_2, ISBN 978-3-642-08152-1, retrieved 26 July 2022
  108. ^ Reinhold, Stephan; Dolnicar, Sara (December 2017), "How Airbnb Creates Value", Peer-to-Peer Accommodation Networks, Goodfellow Publishers, doi:10.23912/9781911396512-3602, ISBN 9781911396512, retrieved 26 July 2022
  109. ^ "Prime Panels by CloudResearch | Online Research Panel Recruitment". CloudResearch. Retrieved 12 January 2023.
  110. ^ Nunan, Daniel (2020). Marketing research : applied insight. David F. Birks, Naresh K. Malhotra (6th ed.). Harlow, United Kingdom. ISBN 978-1-292-30872-2. OCLC 1128061550.
  111. ^ Parker, Christopher J.; May, Andrew; Mitchell, Val (November 2013). "The role of VGI and PGI in supporting outdoor activities". Applied Ergonomics. 44 (6): 886–894. doi:10.1016/j.apergo.2012.04.013. ISSN 0003-6870. PMID 22795180. S2CID 12918341.
  112. ^ Parker, Christopher J.; May, Andrew; Mitchell, Val (15 May 2014). "User-centred design of neogeography: the impact of volunteered geographic information on users' perceptions of online map 'mashups'". Ergonomics. 57 (7): 987–997. doi:10.1080/00140139.2014.909950. ISSN 0014-0139. PMID 24827070. S2CID 13458260.
  113. ^ Brown, Michael; Sharples, Sarah; Harding, Jenny; Parker, Christopher J. (November 2013). (PDF). Applied Ergonomics. 44 (6): 855–865. doi:10.1016/j.apergo.2012.10.013. PMID 23177775. S2CID 26412254. Archived from the original (PDF) on 19 July 2018. Retrieved 20 August 2019.
  114. ^ Parker, Christopher J.; May, Andrew; Mitchell, Val (August 2012). "Understanding Design with VGI using an Information Relevance Framework". Transactions in GIS. 16 (4): 545–560. doi:10.1111/j.1467-9671.2012.01302.x. ISSN 1361-1682. S2CID 20100267.
  115. ^ Holley, Rose (March 2010). "Crowdsourcing: How and Why Should Libraries Do It?". D-Lib Magazine. 16 (3/4). doi:10.1045/march2010-holley. Retrieved 21 May 2021.
  116. ^ Trant, Jennifer (2009). (PDF). Archives & Museum Informatics. Archived from the original (PDF) on 10 February 2010. Retrieved 21 May 2021.
  117. ^ Andro, M. (2018). Digital libraries and crowdsourcing, Wiley / ISTE. ISBN 9781786301611.
  118. ^ Rahman, Mahbubur; Blackwell, Brenna; Banerjee, Nilanjan; Dharmendra, Saraswat (2015), "Smartphone-based hierarchical crowdsourcing for weed identification", Computers and Electronics in Agriculture, 113: 14–23, doi:10.1016/j.compag.2014.12.012, retrieved 12 August 2015
  119. ^ Primarily on the Bridge Winners website
  120. ^ Tang, Weiming; Han, Larry; Best, John; Zhang, Ye; Mollan, Katie; Kim, Julie; Liu, Fengying; Hudgens, Michael; Bayus, Barry (1 June 2016). "Crowdsourcing HIV Test Promotion Videos: A Noninferiority Randomized Controlled Trial in China". Clinical Infectious Diseases. 62 (11): 1436–1442. doi:10.1093/cid/ciw171. ISSN 1537-6591. PMC 4872295. PMID 27129465.
  121. ^ a b Zhang, Ye; Kim, Julie A.; Liu, Fengying; Tso, Lai Sze; Tang, Weiming; Wei, Chongyi; Bayus, Barry L.; Tucker, Joseph D. (November 2015). "Creative Contributory Contests to Spur Innovation in Sexual Health: 2 Cases and a Guide for Implementation". Sexually Transmitted Diseases. 42 (11): 625–628. doi:10.1097/OLQ.0000000000000349. ISSN 1537-4521. PMC 4610177. PMID 26462186.
  122. ^ Créquit, Perrine (2018). "Mapping of Crowdsourcing in Health: Systematic Review". Journal of Medical Internet Research. 20 (5): e187. doi:10.2196/jmir.9330. PMC 5974463. PMID 29764795.
  123. ^ van der Krieke; et al. (2015). "HowNutsAreTheDutch (HoeGekIsNL): A crowdsourcing study of mental symptoms and strengths" (PDF). International Journal of Methods in Psychiatric Research. 25 (2): 123–144. doi:10.1002/mpr.1495. PMC 6877205. PMID 26395198.
  124. ^ Prpić, J. (2015). "Health Care Crowds: Collective Intelligence in Public Health. Collective Intelligence 2015. Center for the Study of Complex Systems, University of Michigan". Papers.ssrn.com. SSRN 2570593. {{cite journal}}: Cite journal requires |journal= (help)
  125. ^ a b van der Krieke, L; Blaauw, FJ; Emerencia, AC; Schenk, HM; Slaets, JP; Bos, EH; de Jonge, P; Jeronimus, BF (2016). "Temporal Dynamics of Health and Well-Being: A Crowdsourcing Approach to Momentary Assessments and Automated Generation of Personalized Feedback (2016)" (PDF). Psychosomatic Medicine. 79 (2): 213–223. doi:10.1097/PSY.0000000000000378. PMID 27551988. S2CID 10955232.
  126. ^ Ess, Henk van, "Crowdsourcing: how to find a crowd", ARD ZDF Akademie 2010, Berlin, p. 99.
  127. ^ a b c Doan, A.; Ramarkrishnan, R.; Halevy, A. (2011), "Crowdsourcing Systems on the World Wide Web" (PDF), Communications of the ACM, 54 (4): 86–96, doi:10.1145/1924421.1924442, S2CID 207184672
  128. ^ Brabham, Daren C. (2013), Crowdsourcing, MIT Press, p. 45
  129. ^ Blohm, Ivo; Zogaj, Shkodran; Bretschneider, Ulrich; Leimeister, Jan Marco (2018). "How to Manage Crowdsourcing Platforms Effectively" (PDF). California Management Review. 60 (2): 122–149. doi:10.1177/0008125617738255. S2CID 73551209.
  130. ^ Howe, Jeff (2008), (PDF), The International Achievement Institute, archived from the original (PDF) on 23 September 2015, retrieved 9 April 2012
  131. ^ "Crowdvoting: How Elo Limits Disruption". thevisionlab.com. 25 May 2017.
  132. ^ Robson, John (24 February 2012). . Canoe.ca. Archived from the original on 7 April 2012. Retrieved 31 March 2012.
  133. ^ . digitalagencymarketing.com. 2012. Archived from the original on 1 April 2012. Retrieved 29 March 2012.
  134. ^ Goldberg, Ken; Newsom, Gavin (12 June 2014). "Let's amplify California's collective intelligence". Citris-uc.org. Retrieved 14 June 2014.
  135. ^ Escoffier, N. and B. McKelvey (2014). "Using "Crowd-Wisdom Strategy" to Co-Create Market Value: Proof-of-Concept from the Movie Industry." in International Perspective on Business Innovation and Disruption in the Creative Industries: Film, Video, Photography, P. Wikstrom and R. DeFillippi, eds., UK: Edward Elgar Publishing Ltd, Chap. 11.
  136. ^ Block, A. B. (21 April 2010). "How boxoffice trading could flop". The Hollywood Reporter.
  137. ^ Chen, A. and R. Panaligan (2013). "Quantifying movie magic with Google search." Google White Paper, Industry Perspectives+User Insights
  138. ^ Williams, Jack (17 February 2017). "An Indoor Football Team Has Its Fans Call the Plays". The New York Times. ISSN 0362-4331. Retrieved 7 February 2018.
  139. ^ Prive, Tanya. "What Is Crowdfunding And How Does It Benefit The Economy". Forbes.com. Retrieved 2 July 2015.
  140. ^ Choy, Katherine; Schlagwein, Daniel (2016), "Crowdsourcing for a better world: On the relation between IT affordances and donor motivations in charitable crowdfunding", Information Technology & People, 29 (1): 221–247, doi:10.1108/ITP-09-2014-0215
  141. ^ Barnett, Chance. "Crowdfunding Sites In 2014". Forbes.com. Retrieved 2 July 2015.
  142. ^ a b c Agrawal, Ajay, Christian Catalini, and Avi Goldfarb. "Some Simple Economics of Crowdfunding." National Bureau of Economic Research (2014): 63–97
  143. ^ Leimeister, J.M.; Huber, M.; Bretschneider, U.; Krcmar, H. (2009), "Leveraging Crowdsourcing: Activation-Supporting Components for IT-Based Ideas Competition", Journal of Management Information Systems, 26 (1): 197–224, doi:10.2753/mis0742-1222260108, S2CID 17485373
  144. ^ Ebner, W.; Leimeister, J.; Krcmar, H. (2009), "Community Engineering for Innovations: The Ideas Competition as a method to nurture a Virtual Community for Innovations", R&D Management, 39 (4): 342–356, doi:10.1111/j.1467-9310.2009.00564.x, S2CID 16316321[dead link]
  145. ^ . DARPA Network Challenge. Archived from the original on 11 August 2011. Retrieved 28 November 2011.
  146. ^ "Social media web snares 'criminals'". New Scientist. Retrieved 4 April 2012.
  147. ^ "Beyond XPrize: The 10 Best Crowdsourcing Tools and Technologies". 20 February 2012. Retrieved 30 March 2012.
  148. ^ Cunard, C. (2010). "The Movie Research Experience gets audiences involved in filmmaking." The Daily Bruin, (19 July)
  149. ^ MacArthur, Kate. "Squadhelp wants your company to crowdsource better names (and avoid Boaty McBoatface)". chicagotribune.com. Retrieved 28 August 2017.
  150. ^ "Compete To Create Your Dream Home". FastCoexist.com. 4 June 2013. Retrieved 3 February 2014.
  151. ^ "Designers, clients forge ties on web". Boston Herald. 11 June 2012. Retrieved 3 February 2014.
  152. ^ Dolan, Shelagh, , Business Insider, archived from the original on 22 May 2018, retrieved 21 May 2018
  153. ^ Murison, Malek (19 April 2018), "LivingPackets uses IoT, crowdshipping to transform deliveries", Internet of Business, retrieved 19 April 2018
  154. ^ Biller, David; Sciaudone, Christina (19 June 2018), "Goldman Sachs, Soros Bet on the Uber of Brazilian Trucking", Bloomberg, retrieved 11 March 2019
  155. ^ Tyrsina, Radu, , Tehcnology Personalised, archived from the original on 3 October 2015, retrieved 1 October 2015
  156. ^ Geiger D, Rosemann M, Fielt E. Crowdsourcing information systems: a systems theory perspective. InProceedings of the 22nd Australasian Conference on Information Systems (ACIS 2011) 2011.
  157. ^ D, Powell (2015). "A new tool for crowdsourcing". МИР (Модернизация. Инновации. Развитие). 6 (2-2 (22)). ISSN 2079-4665.
  158. ^ Yang, J.; Adamic, L.; Ackerman, M. (2008), (PDF), Proceedings of the 9th ACM Conference on Electronic Commerce, doi:10.1145/1386790.1386829, S2CID 15553154, archived from the original (PDF) on 29 July 2020, retrieved 28 February 2012
  159. ^ "Mobile Crowdsourcing". Clickworker. Retrieved 10 December 2014.
  160. ^ Thebault-Spieker, Terveen, & Hecht. Avoiding the South Side and the Suburbs: The Geography of Mobile Crowdsourcing Markets.{{cite book}}: CS1 maint: multiple names: authors list (link)
  161. ^ Chatzimiloudis, Konstantinidis & Laoudias, Zeinalipour-Yazti. "Crowdsourcing with smartphones" (PDF). {{cite magazine}}: Cite magazine requires |magazine= (help)
  162. ^ Arkian, Hamid Reza; Diyanat, Abolfazl; Pourkhalili, Atefe (2017). "MIST: Fog-based data analytics scheme with cost-efficient resource provisioning for IoT crowdsensing applications". Journal of Network and Computer Applications. 82: 152–165. doi:10.1016/j.jnca.2017.01.012.
  163. ^ Felstiner, Alek (August 2011). "Working the Crowd: Employment and Labor Law in the Crowdsourcing Industry" (PDF). Berkeley Journal of Employment & Labor Law. 32: 150–151.
  164. ^ "View of Crowdsourcing: Libertarian Panacea or Regulatory Nightmare?". online-shc.com. Retrieved 26 May 2017.[permanent dead link]
  165. ^ a b Ross, J.; Irani, L.; Silberman, M.S.; Zaldivar, A.; Tomlinson, B. (2010). "Who are the Crowdworkers? Shifting Demographics in Mechanical Turk" (PDF). Chi 2010. Archived from the original (PDF) on 1 April 2011. Retrieved 28 February 2012.
  166. ^ Huff, Connor; Tingley, Dustin (1 July 2015). ""Who are these people?" Evaluating the demographic characteristics and political preferences of MTurk survey respondents". Research & Politics. 2 (3): 205316801560464. doi:10.1177/2053168015604648. ISSN 2053-1680. S2CID 7749084.
  167. ^ a b c d Moss, Aaron; Rosenzweig, Cheskie; Robinson, Jonathan; Jaffe, Shalom; Litman, Leib (2022). "Is it Ethical to Use Mechanical Turk for Behavioral Research? Relevant Data from a Representative Survey of MTurk Participants and Wages". psyarxiv.com. Retrieved 12 January 2023.
  168. ^ Levay, Kevin E.; Freese, Jeremy; Druckman, James N. (1 January 2016). "The Demographic and Political Composition of Mechanical Turk Samples". SAGE Open. 6 (1): 215824401663643. doi:10.1177/2158244016636433. ISSN 2158-2440. S2CID 147299692.
  169. ^ Hirth, M.; Hoßfeld, T.; Train-Gia, P. (2011), Human Cloud as Emerging Internet Application – Anatomy of the Microworkers Crowdsourcing Platform (PDF)
  170. ^ a b c Brabham, Daren C. (2008). . First Monday. 13 (6). doi:10.5210/fm.v13i6.2159. Archived from the original on 24 November 2012. Retrieved 27 June 2012.
  171. ^ a b c Lakhani; et al. (2007). "The Value of Openness in Scientific Problem Solving" (PDF). Retrieved 26 February 2012. {{cite journal}}: Cite journal requires |journal= (help)
  172. ^ Brabham, Daren C. (2012). "Managing Unexpected Publics Online: The Challenge of Targeting Specific Groups with the Wide-Reaching Tool of the Internet". International Journal of Communication. 6: 20.
  173. ^ a b Brabham, Daren C. (2010). "Moving the Crowd at Threadless: Motivations for Participation in a Crowdsourcing Application". Information, Communication & Society. 13 (8): 1122–1145. doi:10.1080/13691181003624090. S2CID 143402410.
  174. ^ a b Brabham, Daren C. (2012). "The Myth of Amateur Crowds: A Critical Discourse Analysis of Crowdsourcing Coverage". Information, Communication & Society. 15 (3): 394–410. doi:10.1080/1369118X.2011.641991. S2CID 145675154.
  175. ^ Saxton, Oh, & Kishore (2013). "Rules of Crowdsourcing: Models, Issues, and Systems of Control". Information Systems Management. 30: 2–20. CiteSeerX 10.1.1.300.8026. doi:10.1080/10580530.2013.739883. S2CID 16811686.
  176. ^ a b Aitamurto, Tanja (2015). "Motivation Factors in Crowdsourced Journalism: Social Impact, Social Change, and Peer Learning". International Journal of Communication. 9: 3523–3543.
  177. ^ a b Kaufmann, N.; Schulze, T.; Viet, D. (2011). (PDF). Proceedings of the Seventeenth Americas Conference on Information Systems. Archived from the original (PDF) on 27 February 2012.
  178. ^ Brabham, Daren C. (2012). "Motivations for Participation in a Crowdsourcing Application to Improve Public Engagement in Transit Planning". Journal of Applied Communication Research. 40 (3): 307–328. doi:10.1080/00909882.2012.693940. S2CID 144807388.
  179. ^ Lietsala, Katri; Joutsen, Atte (2007). "Hang-a-rounds and True Believers: A Case Analysis of the Roles and Motivational Factors of the Star Wreck Fans". MindTrek 2007 Conference Proceedings.
  180. ^ (PDF). Unv.org. Archived from the original (PDF) on 2 December 2014. Retrieved 1 July 2015.
  181. ^ Chandler, D.; Kapelner, A. (2010). "Breaking Monotony with Meaning: Motivation in Crowdsourcing Markets" (PDF). Journal of Economic Behavior & Organization. 90: 123–133. arXiv:1210.0962. doi:10.1016/j.jebo.2013.03.003. S2CID 8563262.
  182. ^ Aparicio, M.; Costa, C.; Braga, A. (2012). Proposing a system to support crowdsourcing (PDF). OSDOC '12 Proceedings of the Workshop on Open Source and Design of Communication. pp. 13–17. doi:10.1145/2316936.2316940. ISBN 9781450315258. S2CID 16494503.
  183. ^ Aitamurto, Landemore, Galli (2016). "Unmasking the Crowd: Participants' Motivation Factors, Expectations, and Profile in a Crowdsourced Law Reform". Information, Communication & Society.{{cite journal}}: CS1 maint: multiple names: authors list (link)
  184. ^ Ipeirotis, Panagiotis G. (10 March 2010). "Demographics of Mechanical Turk". {{cite journal}}: Cite journal requires |journal= (help)
  185. ^ Ross, Joel; Irani, Lilly; Silberman, M. Six; Zaldivar, Andrew; Tomlinson, Bill (10 April 2010). "Who are the crowdworkers? shifting demographics in mechanical turk". CHI '10 Extended Abstracts on Human Factors in Computing Systems. CHI EA '10. New York, USA: Association for Computing Machinery: 2863–2872. doi:10.1145/1753846.1753873. ISBN 978-1-60558-930-5. S2CID 11386257.
  186. ^ Quinn, Alexander J.; Bederson, Benjamin B. (2011). "Human Computation: A Survey and Taxonomy of a Growing Field, CHI 2011 [Computer Human Interaction conference], May 7–12, 2011, Vancouver, BC, Canada" (PDF). Retrieved 30 June 2015.
  187. ^ a b Hauser, David J.; Moss, Aaron J.; Rosenzweig, Cheskie; Jaffe, Shalom N.; Robinson, Jonathan; Litman, Leib (3 November 2022). "Evaluating CloudResearch's Approved Group as a solution for problematic data quality on MTurk". Behavior Research Methods. doi:10.3758/s13428-022-01999-x. ISSN 1554-3528. PMID 36326997.
  188. ^ Prpić, J; Shukla, P.; Roth, Y.; Lemoine, J.F. (2015). "A Geography of Participation in IT-Mediated Crowds". Proceedings of the Hawaii International Conference on Systems Sciences 2015. SSRN 2494537.
  189. ^ a b Borst, Irma. . Archived from the original on 12 September 2015. Retrieved 9 February 2015.
  190. ^ Ipeirotis; Provost; Wang (2010). (PDF). Archived from the original (PDF) on 9 August 2012. Retrieved 28 February 2012. {{cite journal}}: Cite journal requires |journal= (help)
  191. ^ Lukyanenko, Roman; Parsons, Jeffrey; Wiersma, Yolanda (2014). "The IQ of the Crowd: Understanding and Improving Information Quality in Structured User-Generated Content". Information Systems Research. 25 (4): 669–689. doi:10.1287/isre.2014.0537.
  192. ^ Hauser, David; Paolacci, Gabriele; Chandler, Jesse, "Evidence and Solutions", Handbook of Research Methods in Consumer Psychology, doi:10.4324/9781351137713-17, S2CID 150882624, retrieved 12 January 2023
  193. ^ a b Moss, Aaron J; Rosenzweig, Cheskie; Jaffe, Shalom Noach; Gautam, Richa; Robinson, Jonathan; Litman, Leib (11 June 2021). "Bots or inattentive humans? Identifying sources of low-quality data in online platforms". doi:10.31234/osf.io/wr8ds. S2CID 236288817. {{cite journal}}: Cite journal requires |journal= (help)
  194. ^ Goerzen, Thomas; Kundisch, Dennis (11 August 2016). "Can the Crowd Substitute Experts in Evaluation of Creative Ideas? An Experimental Study Using Business Models". AMCIS 2016 Proceedings.
  195. ^ Burnap, Alex; Ren, Alex J.; Papazoglou, Giannis; Gerth, Richard; Gonzalez, Richard; Papalambros, Panos. (PDF). Archived from the original (PDF) on 29 October 2015. Retrieved 19 May 2015. {{cite journal}}: Cite journal requires |journal= (help)
  196. ^ Kurve, Aditya; Miller, David J.; Kesidis, George (30 May 2014). "Multicategory Crowdsourcing Accounting for Variable Task Difficulty, Worker Skill, and Worker Intention". IEEE Kde (99).
  197. ^ Hirth; Hoßfeld; Tran-Gia (2011), Human Cloud as Emerging Internet Application – Anatomy of the Microworkers Crowdsourcing Platform (PDF)
  198. ^ PhD, Aaron Moss (18 September 2018). "After the Bot Scare: Understanding What's Been Happening With Data Collection on MTurk and How to Stop It". CloudResearch. Retrieved 12 January 2023.
  199. ^ Ipeirotis, Panagiotis G. (2010). "Analyzing the Amazon Mechanical Turk Marketplace" (PDF). XRDS: Crossroads, the ACM Magazine for Students. 17 (2): 16–21. doi:10.1145/1869086.1869094. S2CID 6472586. SSRN 1688194. Retrieved 2 October 2018.
  200. ^ a b Hosaka, Tomoko A. (April 2008). "Facebook asks users to translate for free". NBC News.
  201. ^ Britt, Darice. . Archived from the original on 1 July 2014. Retrieved 4 December 2012.
  202. ^ Woods, Dan (28 September 2009). "The Myth of Crowdsourcing". Forbes. Retrieved 4 December 2012.
  203. ^ a b Aitamurto, Tanja; Leiponen, Aija. "The Promise of Idea Crowdsourcing: Benefits, Contexts, Limitations". Ideasproject.com. Retrieved 2 July 2015.
  204. ^ . Latin American Herald Tribune. Archived from the original on 11 March 2021. Retrieved 23 November 2016.
  205. ^ Kleeman, Frank (2008). "Un(der)paid Innovators: The Commercial Utilization of Consumer Work through Crowdsourcing". Sti-studies.de. Retrieved 2 July 2015.
  206. ^ Jason (2011). . Crowdsourcing.org. Archived from the original on 3 July 2015. Retrieved 2 July 2015.
  207. ^ Dupree, Steven (2014). "Crowdfunding 101: Pros and Cons". Gsb.stanford.edu. Retrieved 2 July 2015.
  208. ^ "Fair Labor Standards Act Advisor". Retrieved 28 February 2012.
  209. ^ Hara, Kotaro; Adams, Abigail; Milland, Kristy; Savage, Saiph; Callison-Burch, Chris; Bigham, Jeffrey P. (21 April 2018). "A Data-Driven Analysis of Workers' Earnings on Amazon Mechanical Turk". Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. New York, USA: ACM: 1–14. doi:10.1145/3173574.3174023. ISBN 9781450356206. S2CID 5040507.
  210. ^ Greg Norcie, 2011, "Ethical and practical considerations for compensation of crowdsourced research participants", CHI WS on Ethics Logs and VideoTape: Ethics in Large Scale Trials & User Generated Content, [1] 2012-06-30 at the Wayback Machine, accessed 30 June 2015.
  211. ^ Busarovs, Aleksejs (2013). "Ethical Aspects of Crowdsourcing, or is it a Modern Form of Exploitation" (PDF). International Journal of Economics & Business Administration. 1 (1): 3–14. doi:10.35808/ijeba/1. Retrieved 26 November 2014.
  212. ^ Paolacci, G; Chandler, J; Ipeirotis, P.G. (2010). "Running experiments on Amazon Mechanical Turk". Judgment and Decision Making. 5 (5): 411–419. doi:10.1017/S1930297500002205. hdl:1765/31983. S2CID 14476283.
  213. ^ Graham, Mark; Hjorth, Isis; Lehdonvirta, Vili (1 May 2017). "Digital labour and development: impacts of global digital labour platforms and the gig economy on worker livelihoods". Transfer: European Review of Labour and Research. 23 (2): 135–162. doi:10.1177/1024258916687250. ISSN 1024-2589. PMC 5518998. PMID 28781494.
  214. ^ The Crowdsourcing Scam (Dec. 2014), The Baffler, No. 26
  215. ^ Salehi; et al. (2015). "We Are Dynamo: Overcoming Stalling and Friction in Collective Action for Crowd Workers" (PDF). Retrieved 16 June 2015. {{cite journal}}: Cite journal requires |journal= (help)
  216. ^ Irani, Lilly C.; Silberman, M. Six (27 April 2013). "Turkopticon". Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. New York, USA: ACM: 611–620. doi:10.1145/2470654.2470742. ISBN 9781450318990. S2CID 207203679.

External links

  •   Crowdsourcing at Wikibooks
  •   Media related to Crowdsourcing at Wikimedia Commons

crowdsourcing, crowd, work, redirects, here, performing, arts, term, audience, participation, this, article, written, like, personal, reflection, personal, essay, argumentative, essay, that, states, wikipedia, editor, personal, feelings, presents, original, ar. Crowd work redirects here For the performing arts term see audience participation This article is written like a personal reflection personal essay or argumentative essay that states a Wikipedia editor s personal feelings or presents an original argument about a topic Please help improve it by rewriting it in an encyclopedic style September 2022 Learn how and when to remove this template message Crowdsourcing involves a large group of dispersed participants contributing or producing goods or services including ideas votes micro tasks and finances for payment or as volunteers Contemporary crowdsourcing often involves digital platforms to attract and divide work between participants to achieve a cumulative result Crowdsourcing is not limited to online activity however and there are various historical examples of crowdsourcing The word crowdsourcing is a portmanteau of crowd and outsourcing 1 2 3 In contrast to outsourcing crowdsourcing usually involves less specific and more public groups of participants 4 5 6 This graphic symbolizes the use of ideas from a wide range of individuals as used in crowdsourcing Advantages of using crowdsourcing include lowered costs improved speed improved quality increased flexibility and or increased scalability of the work as well as promoting diversity 7 8 Crowdsourcing methods include competitions virtual labor markets open online collaboration and data donation 8 9 10 11 Some forms of crowdsourcing such as in idea competitions or innovation contests provide ways for organizations to learn beyond the base of minds provided by their employees e g LEGO Ideas 12 13 promotion Commercial platforms such as Amazon Mechanical Turk match microtasks submitted by requesters to workers who perform them Crowdsourcing is also used by nonprofit organizations to develop common goods such as Wikipedia 14 Contents 1 Definitions 2 Historical examples 2 1 Timeline of crowdsourcing examples 2 2 Early competitions 3 Applications 3 1 In science 3 1 1 Astronomy 3 1 2 Energy system research 3 1 3 Genealogy research 3 1 4 Ornithology 3 1 5 Seismology 3 2 In journalism 3 2 1 Data donation 3 3 In public policy 3 4 Language related data 3 5 In product design 3 6 In business 3 7 In market research 3 8 Other examples 4 Methods 4 1 Crowdvoting 4 2 Crowdfunding 4 3 Inducement prize contests 4 4 Implicit crowdsourcing 4 5 Other types 5 Demographics of the crowd 5 1 Motivations 5 1 1 Contributors 6 Limitations and controversies 6 1 Impact of crowdsourcing on product quality 6 2 Entrepreneurs contribute less capital themselves 6 3 Increased number of funded ideas 6 4 Concerns 7 See also 8 References 9 External linksDefinitions EditThe term crowdsourcing was coined in 2006 by two editors at Wired Jeff Howe and Mark Robinson to describe how businesses were using the Internet to outsource work to the crowd which quickly led to the portmanteau crowdsourcing 15 Howe published a definition for the term in a blog post in June 2006 16 Simply defined crowdsourcing represents the act of a company or institution taking a function once performed by employees and outsourcing it to an undefined and generally large network of people in the form of an open call This can take the form of peer production when the job is performed collaboratively but is 
also often undertaken by sole individuals The crucial prerequisite is the use of the open call format and the large network of potential laborers Daren C Brabham defined crowdsourcing as an online distributed problem solving and production model 17 Kristen L Guth and Brabham found that the performance of ideas offered in crowdsourcing platforms are affected not only by their quality but also by the communication among users about the ideas and presentation in the platform itself 18 After studying more than 40 definitions of crowdsourcing in the scientific and popular literature Enrique Estelles Arolas and Fernando Gonzalez Ladron de Guevara researchers at the Technical University of Valencia developed a new integrating definition 3 Crowdsourcing can either take an explicit or an implicit route Explicit crowdsourcing lets users work together to evaluate share and build different specific tasks while implicit crowdsourcing means that users solve a problem as a side effect of something else they are doing With explicit crowdsourcing users can evaluate particular items like books or webpages or share by posting products or items Users can also build artifacts by providing information and editing other people s work Implicit crowdsourcing can take two forms standalone and piggyback Standalone allows people to solve problems as a side effect of the task they are doing whereas piggyback takes users information from a third party website to gather information Despite the multiplicity of definitions for crowdsourcing one constant has been the broadcasting of problems to the public and an open call for contributions to help solve the problem original research Members of the public submit solutions that are then owned by the entity who originally broadcast the problem In some cases the contributor of the solution is compensated monetarily with prizes or public recognition In other cases the only rewards may be praise or intellectual satisfaction Crowdsourcing may produce solutions from amateurs or volunteers working in their spare time from experts or from small businesses 15 Historical examples EditThis section has multiple issues Please help improve it or discuss these issues on the talk page Learn how and when to remove these template messages This section needs additional citations for verification Please help improve this article by adding citations to reliable sources Unsourced material may be challenged and removed September 2022 Learn how and when to remove this template message This section contains embedded lists that may be poorly defined unverified or indiscriminate Please help to clean it up to meet Wikipedia s quality standards Where appropriate incorporate items into the main body of the article September 2022 Learn how and when to remove this template message While the term crowdsourcing was popularized online to describe Internet based activities 17 some examples of projects in retrospect can be described as crowdsourcing Timeline of crowdsourcing examples Edit 618 907 Tang dynasty introduced the Joint Stock Company the earliest form of crowdfunding 1567 King Philip II of Spain offered a cash prize for calculating the longitude of a vessel whilst at sea 1714 The longitude rewards When the British government was trying to find a way to measure a ship s longitudinal position they offered the public a monetary prize to whoever came up with the best solution 19 1783 King Louis XVI offered an award to the person who could make the alkali by decomposing sea salt by the simplest and most 
- 1848: Matthew Fontaine Maury distributed 5,000 copies of his Wind and Current Charts free of charge, on the condition that sailors returned a standardized log of their voyage to the U.S. Naval Observatory. By 1861, he had distributed 200,000 copies free of charge, on the same conditions.[20]
- 1849: A network of some 150 volunteer weather observers all over the USA was set up as a part of the Smithsonian Institution's Meteorological Project, started by the Smithsonian's first Secretary, Joseph Henry, who used the telegraph to gather volunteers' data and create a large weather map, making new information available to the public daily. For instance, volunteers tracked a tornado passing through Wisconsin and sent the findings via telegraph to the Smithsonian. Henry's project is considered the origin of what later became the National Weather Service. Within a decade, the project had more than 600 volunteer observers and had spread to Canada, Mexico, Latin America, and the Caribbean.[21]
- 1884: Publication of the Oxford English Dictionary: 800 volunteers catalogued words to create the first fascicle of the OED.[19]
- 1916: Planters Peanuts contest: the Mr. Peanut logo was designed by a 14-year-old boy who won the Planters Peanuts logo contest.[19]
- 1957: Jørn Utzon was selected as winner of the design competition for the Sydney Opera House.[19]
- 1970: The French amateur photo contest "C'était Paris en 1970" ("This Was Paris in 1970") was sponsored by the city of Paris, France Inter radio, and the Fnac: 14,000 photographers produced 70,000 black-and-white prints and 30,000 color slides of the French capital to document the architectural changes of Paris. Photographs were donated to the Bibliothèque historique de la ville de Paris.[22]
- 1979: Robert Axelrod invited academics online to submit FORTRAN algorithms to play the repeated Prisoner's Dilemma; a tit-for-tat algorithm ended up in first place.[23]
- 1991: Linus Torvalds began work on the Linux operating system and invited programmers around the world to contribute code.[dubious – discuss]
- 1996: The Hollywood Stock Exchange was founded. It allowed buying and selling of shares.[19]
- 1997: British rock band Marillion raised $60,000 from their fans to help finance their U.S. tour.[19]
- 1999: SETI@home was launched by the University of California, Berkeley. Volunteers can contribute to searching for signals that might come from extraterrestrial intelligence by installing a program that uses idle computer time for analyzing chunks of data recorded by radio telescopes involved in the SERENDIP program.
- 2000: JustGiving was established. This online platform allows the public to help raise money for charities.[19]
- 2000: The UNV Online Volunteering service launched, connecting people who commit their time and skills over the Internet to help organizations address development challenges.[24]
- 2000: iStockPhoto was founded. The free stock imagery website allows the public to contribute to, and receive commission for, their contributions.[25]
- 2001: Launch of Wikipedia, the free-access, free-content Internet encyclopedia.[26]
- 2001: Foundation of Topcoder, a crowdsourcing software development company.[27][28]
- 2004: OpenStreetMap, a collaborative project to create a free, editable map of the world, was launched.[29]
- 2004: Toyota's first "Dream Car Art" contest: children were asked globally to draw their dream car of the future.[30]
- 2005: Kodak's "Go for the Gold" contest: Kodak asked anyone to submit a picture of a personal victory.[30]
- 2005: Amazon Mechanical Turk (MTurk) was launched publicly on November 2, 2005. It enables businesses to hire remotely located crowdworkers to perform discrete, on-demand tasks that computers are currently unable to do.[31]
- 2006: Waze (then named FreeMap Israel), a community-oriented GPS app, was created. It allows users to submit road information and route data based on location, such as reports of car accidents or traffic, and integrates that data into its routing algorithms for all users of the app.
- 2010: The 1947 Partition Archive, an oral history project that asked community members around the world to document oral histories from aging witnesses to a significant but under-documented historical event, the 1947 Partition of India, was founded.
- 2011: "Casting of Flavours" ("Do us a flavor" in the USA), a campaign launched by PepsiCo's Lay's in Spain. The campaign was to create a new flavor for the snack, with consumers directly involved in its formation.[32]

Early competitions

Crowdsourcing has often been used in the past as a competition to discover a solution. The French government proposed several of these competitions, often rewarded with Montyon Prizes.[33] These included the Leblanc process (or the Alkali prize), where a reward was provided for separating the salt from the alkali, and Fourneyron's turbine, when the first commercial hydraulic turbine was developed.[34] In response to a challenge from the French government, Nicolas Appert won a prize for inventing a new way of food preservation that involved sealing food in air-tight jars.[35] The British government provided a similar reward for finding an easy way to determine a ship's longitude in the Longitude Prize. During the Great Depression, out-of-work clerks tabulated higher mathematical functions in the Mathematical Tables Project as an outreach project.[36][unreliable source?] One of the largest crowdsourcing campaigns was a public design contest in 2010, hosted by the Indian government's finance ministry, to create a symbol for the Indian rupee. Thousands of people sent in entries before the government zeroed in on the final symbol, based on the Devanagari script using the letter "Ra".[37]

Applications

See also: List of crowdsourcing projects

A number of motivations exist for businesses to use crowdsourcing to accomplish their tasks. These include the ability to offload peak demand, access cheap labor and information, generate better results, access a wider array of talent than might be present in one organization, and undertake problems that would have been too difficult to solve internally.[38] Crowdsourcing allows businesses to submit problems on which contributors can work, on topics such as science, manufacturing, biotech, and medicine, optionally with monetary rewards for successful solutions. Although crowdsourcing complicated tasks can be difficult, simple work tasks can be crowdsourced cheaply and effectively.[39]

Crowdsourcing also has the potential to be a problem-solving mechanism for government and nonprofit use.[40] Urban and transit planning are prime areas for crowdsourcing. For example, from 2008 to 2009, a crowdsourcing project for transit planning in Salt Lake City was created to test the public participation process.[41] Another notable application of crowdsourcing for government problem solving is Peer-to-Patent, an initiative to improve patent quality in the United States by gathering public input in a structured, productive manner.[42]

Researchers have used crowdsourcing systems such as Amazon Mechanical Turk or CloudResearch to aid their research projects by crowdsourcing some aspects of the research process, such as data collection, parsing, and evaluation, to the public.
Notable examples include using the crowd to create speech and language databases,[43][44] to conduct user studies,[45] and to run behavioral science surveys and experiments.[46] Crowdsourcing systems provided researchers with the ability to gather large amounts of data, and helped researchers to collect data from populations and demographics they may not have had access to locally.[47][failed verification]

Artists have also used crowdsourcing systems. In a project called the Sheep Market, Aaron Koblin used Mechanical Turk to collect 10,000 drawings of sheep from contributors around the world.[48] Artist Sam Brown leveraged the crowd by asking visitors of his website explodingdog to send him sentences to use as inspirations for his paintings.[49] Art curator Andrea Grover argues that individuals tend to be more open in crowdsourced projects because they are not being physically judged or scrutinized.[50] As with other types of use, artists use crowdsourcing systems to generate and collect data. The crowd also can be used to provide inspiration and to collect financial support for an artist's work.[51]

In navigation systems, crowdsourced data from 100 million drivers was used by INRIX to collect users' driving times, in order to provide better GPS routing and real-time traffic updates.[52]

In science

Astronomy

Crowdsourcing in astronomy was used in the early 19th century by astronomer Denison Olmsted. After being awakened late one November night by a meteor shower, Olmsted noticed a pattern in the shooting stars and wrote a brief report of the meteor shower in the local newspaper. "As the cause of 'Falling Stars' is not understood by meteorologists, it is desirable to collect all the facts attending this phenomenon, stated with as much precision as possible," Olmsted wrote to readers, in a report subsequently picked up and pooled to newspapers nationwide. Responses came pouring in from many states, along with scientists' observations sent to the American Journal of Science and Arts.[53] These responses helped him make a series of scientific breakthroughs, including the observation that meteor showers are seen nationwide and fall from space under the influence of gravity. The responses also allowed him to approximate a velocity for the meteors.[citation needed] A more recent version of crowdsourcing in astronomy is NASA's photo-organizing project,[54] which asked internet users to browse photos taken from space and try to identify the location each picture documents.[55]

Behavioral science

In the field of behavioral science, crowdsourcing is often used to gather data and insights on human behavior and decision making. Researchers may create online surveys or experiments that are completed by a large number of participants, allowing them to collect a diverse and potentially large amount of data.[46] Crowdsourcing can also be used to gather real-time data on behavior, such as through the use of mobile apps that track and record users' activities and decision making.[56] The use of crowdsourcing in behavioral science has the potential to greatly increase the scope and efficiency of research, and has been used in studies on topics such as psychology research,[57] political attitudes,[58] and social media use.[59]

Energy system research

Energy system models require large and diverse datasets, increasingly so given the trend towards greater temporal and spatial resolution.[60] In response, there have been several initiatives to crowdsource this data. Launched in December 2009, OpenEI is a collaborative website, run by the US government, that provides open energy data.[61][62]
While much of its information is from US government sources, the platform also seeks crowdsourced input from around the world.[63] The semantic wiki and database Enipedia also publishes energy systems data using the concept of crowdsourced open information. Enipedia went live in March 2011.[64][65]:184–188

Genealogy research

Genealogical research used crowdsourcing techniques long before personal computers were common. Beginning in 1942, the Church of Jesus Christ of Latter-day Saints encouraged its members to submit information about their ancestors. The submitted information was gathered together into a single collection. In 1969, to encourage more participation, the church started the three-generation program, in which church members were asked to prepare documented family group record forms for the first three generations. The program was later expanded to encourage members to research at least four generations, and became known as the four-generation program.[66] Institutes that have records of interest to genealogical research have used crowds of volunteers to create catalogs and indices to records.[citation needed]

Genetic genealogy research

Genetic genealogy is a combination of traditional genealogy with genetics. The rise of personal DNA testing after the turn of the century, by companies such as Gene by Gene, FTDNA, GeneTree, 23andMe, and Ancestry.com, has led to public and semi-public databases of DNA testing built using crowdsourcing techniques. Citizen science projects have included the support, organization, and dissemination of personal DNA genetic testing. Similar to amateur astronomy, citizen scientists encouraged by volunteer organizations like the International Society of Genetic Genealogy[67] have provided valuable information and research to the professional scientific community.[68] The Genographic Project, which began in 2005, is a research project carried out by the National Geographic Society's scientific team to reveal patterns of human migration using crowdsourced DNA testing and reporting of results.[69]

Ornithology

Another early example of crowdsourcing occurred in the field of ornithology. On 25 December 1900, Frank Chapman, an early officer of the National Audubon Society, initiated a tradition dubbed the "Christmas Day Bird Census". The project called on birders from across North America to count and record the number of birds of each species they witnessed on Christmas Day. The project was successful, and the records from 27 different contributors were compiled into one bird census, which tallied around 90 species of birds.[70] This large-scale collection of data constituted an early form of citizen science, the premise upon which crowdsourcing is based. In the 2012 census, more than 70,000 individuals participated across 2,369 bird count circles.[71] Christmas 2014 marked the National Audubon Society's 115th annual Christmas Bird Count.

Seismology

The European-Mediterranean Seismological Centre (EMSC) has developed a seismic detection system by monitoring the traffic peaks on its website and analyzing keywords used on Twitter.[72]
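The EMSC approach amounts to anomaly detection on visitor counts: a sudden surge of traffic to its earthquake pages is treated as a signal that an earthquake was felt. The following is a minimal sketch of that general idea, not EMSC's actual system; the per-minute visit counts and the z-score threshold are hypothetical.

```python
from statistics import mean, stdev

def detect_spike(visits_per_minute, z_threshold=4.0):
    """Flag a traffic spike when the latest count deviates strongly
    from the recent baseline. `visits_per_minute` is a list of counts,
    oldest first; the last element is the minute under test."""
    baseline, latest = visits_per_minute[:-1], visits_per_minute[-1]
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:  # flat baseline: treat any increase as a spike
        return latest > mu
    return (latest - mu) / sigma > z_threshold

# Hypothetical counts: steady background traffic, then the kind of
# surge that might follow a widely felt earthquake.
traffic = [110, 95, 102, 98, 105, 101, 99, 103, 97, 880]
print(detect_spike(traffic))  # True
```

In practice such a detector would also need to separate felt-event surges from ordinary traffic bursts, which is where the keyword analysis mentioned above comes in.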
In journalism

See also: Collaborative journalism and Citizen journalism

Crowdsourcing is increasingly used in professional journalism. Journalists are able to organize crowdsourced information by fact-checking it and then using it in their articles as they see fit.[citation needed] A daily newspaper in Sweden successfully used crowdsourcing in investigating home loan interest rates in the country in 2013–2014, which resulted in over 50,000 submissions.[73] A daily newspaper in Finland crowdsourced an investigation into stock short selling in 2011–2012, and the crowdsourced information led to revelations of a tax evasion system at a Finnish bank. The bank executive was fired and policy changes followed.[74] TalkingPointsMemo in the United States asked its readers to examine 3,000 emails concerning the firing of federal prosecutors in 2008. The British newspaper The Guardian crowdsourced the examination of hundreds of thousands of documents in 2009.[75]

Data donation

Data donation is a crowdsourcing approach to gathering digital data. It is used by researchers and organizations to gain access to data from online platforms, websites, search engines, and apps and devices. Data donation projects usually rely on participants volunteering their authentic digital profile information. Examples include DataSkop, developed by AlgorithmWatch, a non-profit research organization in Germany, which accessed data on social media algorithms and automated decision-making systems.[76][77] Mozilla Rally, from the Mozilla Foundation, is a browser extension for adult participants in the US[78] to provide access to their data for research projects.[79] The Australian Search Experience and Ad Observatory projects, set up in 2021 by researchers at the ARC Centre of Excellence for Automated Decision-Making and Society (ADM+S) in Australia, used data donations to analyze how Google personalized search results and to examine how Facebook's algorithmic advertising model worked.[80][81] The Citizen Browser Project, developed by The Markup, was designed to measure how disinformation traveled across social media platforms over time.[82]

In public policy

Crowdsourcing public policy and the production of public services is also referred to as citizen sourcing. While some scholars argue that crowdsourcing in this context is a policy tool[83] or a definite means of co-production,[84] others question that and argue that crowdsourcing should be considered just a technological enabler that simply increases the speed and ease of participation.[85] Crowdsourcing can also play a role in democratization.[86]

The first conference focusing on crowdsourcing for politics and policy took place at Oxford University, under the auspices of the Oxford Internet Institute, in 2014. Research focusing on the use of crowdsourcing for policy purposes has emerged since 2012.[87][88][89] This includes experimentally investigating the use of virtual labor markets for policy assessment[90] and assessing the potential for citizen involvement in process innovation for public administration.[91]

Governments across the world are increasingly using crowdsourcing for knowledge discovery and civic engagement.[citation needed] Iceland crowdsourced its constitution reform process in 2011, and Finland has crowdsourced several law reform processes to address its off-road traffic laws. The Finnish government allowed citizens to go on an online forum to discuss problems and possible resolutions regarding some off-road traffic laws.[citation needed] The crowdsourced information and resolutions would then be passed on to legislators to refer to when making a decision, allowing citizens to contribute to public policy in a more direct manner.[92][93] Palo Alto has crowdsourced feedback for its Comprehensive City Plan update, in a process started in 2015.[94] The House of Representatives in Brazil has used crowdsourcing in policy reforms.[95] NASA used crowdsourcing to analyze large sets of images.
As part of the Open Government Initiative of the Obama Administration, the General Services Administration collected and amalgamated suggestions for improving federal websites.[95] For part of the Obama and Trump Administrations, the We the People system collected signatures on petitions, which were entitled to an official response from the White House once a certain number had been reached. Several U.S. federal agencies have run inducement prize contests, including NASA and the Environmental Protection Agency.[96][95]

Language-related data

Crowdsourcing has been used extensively for gathering language-related data. For dictionary work, crowdsourcing was applied over a hundred years ago by the Oxford English Dictionary editors, using paper and postage. It has also been used for collecting examples of proverbs on a specific topic (e.g., religious pluralism) for a printed journal.[97] Crowdsourcing language-related data online has proven very effective, and many dictionary compilation projects have used crowdsourcing. It is used particularly for specialist topics and languages that are not well documented, such as the Oromo language.[98] Software programs have been developed for crowdsourced dictionaries, such as WeSay.[99] A slightly different form of crowdsourcing for language data was the online creation of scientific and mathematical terminology for American Sign Language.[100] In linguistics, crowdsourcing strategies have been applied to estimate word knowledge, vocabulary size, and word origin.[101]

Implicit crowdsourcing on social media has also been used to approximate sociolinguistic data efficiently: mining publicly available social media conversations can map the geographic extent of speaker dialects. Reddit conversations in various location-based subreddits were analyzed for the presence of grammatical forms unique to a regional dialect, and these were then used to map the extent of the speaker population. The results could roughly approximate large-scale surveys on the subject without engaging in field interviews.[102]

Proverb collection is also being done via crowdsourcing on the Web, most notably for the Pashto language of Afghanistan and Pakistan.[103][104][105] Crowdsourcing has been extensively used to collect high-quality gold standards for creating automatic systems in natural language processing (e.g., named entity recognition and entity linking).[106]

In product design

LEGO allows users to work on new product designs while conducting requirements testing. Any user can provide a design for a product, and other users can vote on the product. Once the submitted product has received 10,000 votes, it is formally reviewed in stages and, provided no impediments (such as legal flaws) are identified, goes into production. The creator receives royalties from the net income.[107]

In business

Homeowners can use Airbnb to list their accommodation or unused rooms. Owners set their own nightly, weekly, and monthly rates and accommodations. The business, in turn, charges guests and hosts a fee. Guests usually end up spending between 9% and 15%:[108] they have to pay a booking fee every time they book a room, and the landlord in turn pays a service fee on the amount due. The company has 1,500,000 properties in 34,000 cities in more than 190 countries.[citation needed]

In market research

Crowdsourcing is frequently used in market research as a way to gather insights and opinions from a large number of consumers.[109] Companies may create online surveys or focus groups that are open to the general public, allowing them to gather a diverse range of perspectives on their products or services.
This can be especially useful for companies seeking to understand the needs and preferences of a particular market segment or to gather feedback on the effectiveness of their marketing efforts. The use of crowdsourcing in market research allows companies to quickly and efficiently gather a large amount of data and insights that can inform their business decisions.[110]

Other examples

Geography. Volunteered geographic information (VGI) is geographic information generated through crowdsourcing, as opposed to traditional methods of professional geographic information (PGI).[111] In describing the built environment, VGI has many advantages over PGI, primarily perceived currency,[112] accuracy,[113] and authority.[114] OpenStreetMap is an example of a crowdsourced mapping project.[29]

Engineering. Many companies are introducing crowdsourcing to grow their engineering capabilities and find solutions to unsolved technical challenges, and to address the need to adopt the newest technologies, such as 3D printing and the Internet of Things.[citation needed]

Libraries, museums, and archives. Newspaper text correction at the National Library of Australia was an early, influential example of work with text transcription for crowdsourcing in cultural heritage institutions.[115] The Steve Museum project provided a prototype for categorizing artworks.[116] Crowdsourcing is used in libraries for OCR corrections on digitized texts, for tagging, and for funding, especially in the absence of financial and human means. Volunteers can contribute explicitly, with conscious effort, or implicitly, without being aware of it, by turning the text on a raw newspaper image into human-corrected digital form.[117]

Agriculture. Crowdsourced research also applies to the field of agriculture. Crowdsourcing can be used to help farmers and experts identify different types of weeds[118] in the fields, and to provide assistance in removing them.

Cheating in bridge. Boye Brogeland initiated a crowdsourcing investigation of cheating by top-level bridge players that showed several players to be guilty, which led to their suspension.[119]

Open-source software. Open-source software and crowdsourced software development have been used extensively in the domain of software development.

Healthcare. Research has emerged that outlines the use of crowdsourcing techniques in the public health domain.[120][121][122] The collective intelligence outcomes from crowdsourcing are being generated in three broad categories of public health care: health promotion,[121] health research,[123] and health maintenance.[124] Crowdsourcing also enables researchers to move from small, homogeneous groups of participants to large, heterogeneous groups,[125] beyond convenience samples such as students or more highly educated people. The SESH group focuses on using crowdsourcing to improve health.

Methods

The Internet and digital technologies have massively expanded the opportunities for crowdsourcing. However, the effect of user communication and platform presentation can have a major bearing on the success of an online crowdsourcing project.[18] The crowdsourced problem can range from huge tasks (such as finding alien life or mapping earthquake zones) to very small ones (such as identifying images). Some examples of successful crowdsourcing themes are problems that bug people, things that make people feel good about themselves, projects that tap into the niche knowledge of proud experts, and subjects that people find sympathetic.[126]
As described above, crowdsourcing can take an explicit or an implicit route, and implicit crowdsourcing can be standalone or piggyback; piggyback crowdsourcing, which gathers users' information from third-party websites, is also known as data donation.[127]

In his 2013 book Crowdsourcing, Daren C. Brabham puts forth a problem-based typology of crowdsourcing approaches:[128]

- Knowledge discovery and management is used for information management problems where an organization mobilizes a crowd to find and assemble information. It is ideal for creating collective resources.
- Distributed human intelligence tasking (HIT) is used for information management problems where an organization has a set of information in hand and mobilizes a crowd to process or analyze it. It is ideal for processing large data sets that computers cannot easily handle. Amazon Mechanical Turk uses this approach.
- Broadcast search is used for ideation problems where an organization mobilizes a crowd to come up with a solution to a problem that has an objective, provable right answer. It is ideal for scientific problem solving.
- Peer-vetted creative production is used for ideation problems where an organization mobilizes a crowd to come up with a solution to a problem whose answer is subjective or dependent on public support. It is ideal for design, aesthetic, or policy problems.

Ivo Blohm identifies four types of crowdsourcing platforms: microtasking, information pooling, broadcast search, and open collaboration. They differ in the diversity and aggregation of the contributions that are created: the diversity of information collected can be either homogeneous or heterogeneous, and the aggregation of information can be either selective or integrative.[definition needed][129]

Some common categories of crowdsourcing that have been used effectively in the commercial world include crowdvoting, crowdsolving, crowdfunding, microwork, creative crowdsourcing, crowdsource workforce management, and inducement prize contests.[130]

Crowdvoting

Crowdvoting occurs when a website gathers a large group's opinions and judgments on a certain topic. Some crowdsourcing tools and platforms allow participants to rank each other's contributions, e.g., in answer to the question "What is one thing we can do to make Acme a great company?" One common method for ranking is "like" counting, where the contribution with the most "like" votes ranks first. This method is simple and easy to understand, but it privileges early contributions, which have more time to accumulate votes.[citation needed] In recent years, several crowdsourcing companies have begun to use pairwise comparisons backed by ranking algorithms. Ranking algorithms do not penalize late contributions;[citation needed] they also produce results more quickly. Ranking algorithms have proven to be at least 10 times faster than manual stack ranking.[131] One drawback, however, is that ranking algorithms are more difficult to understand than vote counting; a minimal example of ranking from pairwise votes is sketched below.
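As an illustration of the pairwise approach (a generic sketch, not any platform's proprietary algorithm): each vote compares two randomly chosen contributions, and items are then ordered by the share of comparisons they win, so a late submission is not penalized for having had fewer chances to be seen. The item names, quality scores, and voter model below are hypothetical.

```python
from collections import defaultdict
from itertools import combinations
import random

def rank_by_pairwise_votes(items, pick_winner, num_votes=1000, seed=0):
    """Rank items from pairwise votes. `pick_winner(a, b)` models one
    voter choosing between two contributions. Items are ordered by win
    rate rather than raw vote totals, so exposure counts cancel out."""
    rng = random.Random(seed)
    wins, shown = defaultdict(int), defaultdict(int)
    pairs = list(combinations(items, 2))
    for _ in range(num_votes):
        a, b = rng.choice(pairs)
        wins[pick_winner(a, b)] += 1
        shown[a] += 1
        shown[b] += 1
    return sorted(items, key=lambda x: wins[x] / max(shown[x], 1), reverse=True)

# Hypothetical demo: voters prefer the contribution with the higher
# (hidden) quality score 90% of the time.
quality = {"idea A": 0.9, "idea B": 0.6, "idea C": 0.3}

def voter(a, b):
    better = a if quality[a] >= quality[b] else b
    worse = b if better == a else a
    return better if random.random() < 0.9 else worse

print(rank_by_pairwise_votes(list(quality), voter))
```

Because every comparison is informative regardless of when an item was submitted, this design addresses the early-contribution bias of simple like counting.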
The Iowa Electronic Market is a prediction market that gathers crowds' views on politics and tries to ensure accuracy by having participants pay money to buy and sell contracts based on political outcomes.[132]

Some of the most famous examples have made use of social media channels: Domino's Pizza, Coca-Cola, Heineken, and Sam Adams have crowdsourced a new pizza, bottle design, beer, and song, respectively.[133] A website called Threadless selected the T-shirts it sold by having users provide designs and vote on the ones they like, which are then printed and made available for purchase.[17]

The California Report Card (CRC), a program jointly launched in January 2014 by the Center for Information Technology Research in the Interest of Society[134] and Lt. Governor Gavin Newsom, is an example of modern-day crowdvoting. Participants access the CRC online and vote on six timely issues. Through principal component analysis, the users are then placed into an online "café", in which they can present their own political opinions and grade the suggestions of other participants. This system aims to effectively involve the greater public in relevant political discussions and to highlight the specific topics with which people are most concerned.

Crowdvoting's value in the movie industry was shown when, in 2009, a crowd accurately predicted the success or failure of a movie based on its trailer,[135][136] a feat that was replicated in 2013 by Google.[137] On Reddit, users collectively rate web content, discussions, and comments, as well as questions posed to persons of interest in "AMA" and AskScience online interviews.

In 2017, Project Fanchise purchased a team in the Indoor Football League and created the Salt Lake Screaming Eagles, a fan-run team. Using a mobile app, the fans voted on the day-to-day operations of the team, the mascot name, the signing of players, and even the offensive play calling during games.[138]

Crowdfunding

Main article: Crowdfunding

Crowdfunding is the process of funding projects by a multitude of people contributing a small amount to attain a certain monetary goal, typically via the Internet.[139] Crowdfunding has been used for both commercial and charitable purposes.[140] The crowdfunding model that has been around the longest is rewards-based crowdfunding. In this model, people can pre-purchase products, buy experiences, or simply donate. While this funding may in some cases go towards helping a business, funders are not allowed to invest and become shareholders via rewards-based crowdfunding.[141]

Individuals, businesses, and entrepreneurs can showcase their businesses and projects by creating a profile, which typically includes a short video introducing their project, a list of rewards per donation, and illustrations through images.[citation needed] Funders make monetary contributions for numerous reasons: they connect to the greater purpose of the campaign, such as being part of an entrepreneurial community and supporting an innovative idea or product;[142] they connect to a physical aspect of the campaign, like rewards and gains from investment;[142] they connect to the creative display of the campaign's presentation; or they want to see new products before the public does.[142]

The dilemma for equity crowdfunding in the US as of 2012 was that the regulations of the Securities and Exchange Commission, which had until 1 January 2013 to tweak the fundraising methods, were still being refined. The regulators were overwhelmed trying to regulate Dodd-Frank and all the other rules and regulations involving public companies and the way they traded. Advocates of regulation claimed that crowdfunding would open up the floodgates for fraud, called it the "wild west" of fundraising, and compared it to the 1980s days of penny stock "cold-call cowboys".
The process allowed for up to $1 million to be raised without some of the regulations being involved. Companies under the then-current proposal would have exemptions available and be able to raise capital from a larger pool of persons, which could include lower thresholds for investor criteria, whereas the old rules required that the person be an "accredited" investor. These people are often recruited from social networks, where the funds can be acquired from an equity purchase, loan, donation, or ordering. The amounts collected have become quite high, with requests of over a million dollars, as for software such as Trampoline Systems, which used crowdfunding to finance the commercialization of its new software.[citation needed]

Inducement prize contests

Web-based idea competitions, or inducement prize contests, often consist of generic ideas, cash prizes, and an Internet-based platform to facilitate easy idea generation and discussion. An example of these competitions is IBM's 2006 "Innovation Jam", which was attended by over 140,000 international participants and yielded around 46,000 ideas.[143][144] Another example is the Netflix Prize in 2009: people were asked to come up with a recommendation algorithm more accurate than Netflix's existing one. It had a grand prize of US$1,000,000, which was given to a team that designed an algorithm beating Netflix's own algorithm for predicting ratings by 10.06%.[citation needed]

Another example of competition-based crowdsourcing is the 2009 DARPA balloon experiment, in which DARPA placed 10 balloon markers across the United States and challenged teams to compete to be the first to report the location of all the balloons. A collaboration of efforts was required to complete the challenge quickly, and in addition to the competitive motivation of the contest as a whole, the winning team (MIT, in less than nine hours) established its own "collaborapetitive" environment to generate participation in the team.[145] A similar challenge was the Tag Challenge, funded by the US State Department, which required locating and photographing individuals in five cities in the US and Europe within 12 hours based only on a single photograph. The winning team managed to locate three suspects by mobilizing volunteers worldwide, using an incentive scheme similar to the one used in the balloon challenge.[146]
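The incentive scheme behind the winning MIT balloon-challenge entry was widely reported as a recursive one: roughly $2,000 per balloon to the finder, $1,000 to whoever recruited the finder, $500 to that person's recruiter, and so on, halving up the referral chain. The sketch below illustrates that halving payout structure; the participant names are hypothetical, and the figures are the reported ones, not an official specification.

```python
def recursive_payouts(chain, finder_reward=2000.0):
    """Compute payouts up a referral chain under a halving scheme.
    `chain` lists participants from the finder up through successive
    recruiters, e.g. ["finder", "recruiter1", "recruiter2"]."""
    payouts, reward = {}, finder_reward
    for person in chain:
        payouts[person] = reward
        reward /= 2
    return payouts

# One balloon found via a three-level referral chain (hypothetical names).
print(recursive_payouts(["Dana", "Bo", "Ana"]))
# {'Dana': 2000.0, 'Bo': 1000.0, 'Ana': 500.0}
```

Because the payments halve at each level, the total paid out per balloon is bounded by twice the finder's reward (a geometric series), which is what makes an open-ended referral chain affordable while still rewarding recruitment.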
Using open innovation platforms is an effective way to crowdsource people's thoughts and ideas for research and development. The company InnoCentive is a crowdsourcing platform for corporate research and development, where difficult scientific problems are posted for crowds of solvers to discover the answer and win a cash prize that ranges from $10,000 to $100,000 per challenge.[17] InnoCentive, of Waltham, Massachusetts, and London, England, provides access to millions of scientific and technical experts from around the world. The company claims a success rate of 50% in providing successful solutions to previously unsolved scientific and technical problems. The X Prize Foundation creates and runs incentive competitions, offering between $1 million and $30 million for solving challenges. Local Motors is another example of crowdsourcing: a community of 20,000 automotive engineers, designers, and enthusiasts that compete to build off-road rally trucks.[147]

Implicit crowdsourcing

Implicit crowdsourcing is less obvious because users do not necessarily know they are contributing, yet it can still be very effective in completing certain tasks.[citation needed] Rather than users actively participating in solving a problem or providing information, implicit crowdsourcing involves users doing another task entirely, with a third party gaining information on another topic based on the users' actions.[17] A good example of implicit crowdsourcing is the ESP game, where users find words to describe Google images, which are then used as metadata for the images; its matching rule is sketched below. Another popular use of implicit crowdsourcing is through reCAPTCHA, which asks people to solve CAPTCHAs to prove they are human, and then provides CAPTCHAs from old books that cannot be deciphered by computers, in order to digitize them for the web. Like many tasks solved using the Mechanical Turk, CAPTCHAs are simple for humans but often very difficult for computers.[127]
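In the ESP game as publicly described, two randomly paired players type words for the same image, and a word becomes a label only when both players enter it independently, which filters out idiosyncratic or spam labels. A minimal sketch of that agreement rule follows, with hypothetical guesses; the real game adds timing, scoring, and "taboo" words.

```python
def match_labels(player_a_guesses, player_b_guesses):
    """Return the first label both players independently typed,
    in player A's order of entry; None if there is no agreement."""
    b_guesses = {g.lower() for g in player_b_guesses}
    for guess in player_a_guesses:
        if guess.lower() in b_guesses:
            return guess.lower()
    return None

# Hypothetical round: two players describe the same photo.
print(match_labels(["dog", "grass", "frisbee"],
                   ["park", "FRISBEE", "dog"]))  # 'dog'
```

The same independent-agreement principle underlies reCAPTCHA's digitization step: an unknown word is accepted as transcribed once enough users independently type the same thing.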
Piggyback crowdsourcing can be seen most frequently in websites such as Google that data-mine a user's search history and websites to discover keywords for ads, spelling corrections, and synonyms. In this way, users are unintentionally helping to modify existing systems, such as Google Ads.[45]

Other types

Creative crowdsourcing involves sourcing people for creative projects such as graphic design, crowdsourced architecture, product design,[12] apparel design, movies,[148] writing, company naming,[149] illustration, etc.[150][151] While crowdsourcing competitions have been used for decades in some creative fields, such as architecture, creative crowdsourcing has proliferated with the recent development of web-based platforms on which clients can solicit a wide variety of creative work at lower cost than by traditional means.[citation needed]

Crowdshipping (crowd-shipping) is a peer-to-peer shipping service, usually conducted via an online platform or marketplace.[152] Several methods have been categorized as crowd-shipping: travelers heading in the direction of the buyer who are willing to bring the package as part of their luggage for a reward;[153] truck drivers whose route lies along the buyer's location and who are willing to take extra items in their truck;[154] and community-based platforms that connect international buyers and local forwarders by allowing buyers to use a forwarder's address as the purchase destination, after which the forwarder ships the items onward to the buyer.[155]

Crowdsolving is a collaborative and holistic way of solving a problem using many people, communities, groups, or resources. It is a type of crowdsourcing with a focus on complex and intellectually demanding problems that require considerable effort, and on the quality or uniqueness of the contributions.[156]

Problem-idea chains are a form of idea crowdsourcing and crowdsolving in which individuals are asked to submit ideas to solve problems, and then problems that can be solved with those ideas. The aim is to encourage individuals to find practical solutions to problems that are well thought through.[157]

Macrowork tasks typically have these characteristics: they can be done independently, they take a fixed amount of time, and they require special skills. Macrotasks can be part of specialized projects or part of a large, visible project where workers pitch in wherever they have the required skills. The key distinguishing factors are that macrowork requires specialized skills and typically takes longer, while microwork requires no specialized skills.

Microwork is a form of crowdsourcing in which users do small tasks, for which computers lack aptitude, for low amounts of money. Amazon's Mechanical Turk has created many different projects for users to participate in, where each task requires very little time and offers a very small amount in payment.[15] When choosing tasks, since only certain users "win", users learn to submit later and to pick less popular tasks to increase the likelihood of getting their work chosen.[158] An example of a Mechanical Turk project is one in which users searched satellite images for a boat in order to find the missing computer scientist Jim Gray.[127]

Mobile crowdsourcing involves activities that take place on smartphones or mobile platforms and that are frequently characterized by GPS technology.[159] This allows for real-time data gathering and gives projects greater reach and accessibility. However, mobile crowdsourcing can lead to an urban bias and can raise safety and privacy concerns.[160][161][162]

Simple projects are those that require a larger amount of time and skill compared to micro- and macrowork. While an example of macrowork would be writing survey feedback, simple projects instead include activities like writing a basic line of code or programming a database, which both require a larger time commitment and a higher skill level. These projects are usually not found on sites like Amazon Mechanical Turk and are instead posted on platforms like Upwork that call for specific expertise.[163]

Complex projects generally take the most time, have higher stakes, and call for people with very specific skills. These are generally one-off projects that are difficult to accomplish, and they can include projects such as designing a new product that a company hopes to patent. Such projects are considered complex because design is a meticulous process that requires a large amount of time to perfect, and the people completing them must have specialized training in design to complete them effectively. These projects usually pay the most, yet they are rarely offered.[164]

Demographics of the crowd

"The crowd" is an umbrella term for the people who contribute to crowdsourcing efforts. Though it is sometimes difficult to gather data about the demographics of the crowd as a whole, several studies have examined various specific online platforms. Amazon Mechanical Turk has received a great deal of attention in particular. A study in 2008 by Ipeirotis found that users at that time were primarily American, young, female, and well-educated, with 40% earning more than $40,000 per year. In November 2009, Ross found a very different Mechanical Turk population, 36% of which was Indian. Two-thirds of Indian workers were male, and 66% had at least a bachelor's degree. Two-thirds had annual incomes of less than $10,000, with 27% sometimes or always depending on income from Mechanical Turk to make ends meet.[165]

More recent studies have found that U.S. Mechanical Turk workers are approximately 58% female, and nearly 67% of workers are in their 20s and 30s.[46][166][167][168] Close to 80% are White and 9% are Black. MTurk workers are less likely to be married or have children compared to the general population: in the US population over 18, 45% are unmarried, while the proportion of unmarried workers on MTurk is around 57%. Additionally, about 55% of MTurk workers do not have any children, which is significantly higher than in the general population. Approximately 68% of U.S. MTurk workers are employed, compared to 60% in the general population. MTurk workers in the U.S. are also more likely to have a four-year college degree (35%) than the general population (27%). Politics within the U.S. sample of MTurk skew liberal, with 46% Democrats, 28% Republicans, and 26% other. MTurk workers are also less religious than the U.S. population, with 41% religious, 20% spiritual, 21% agnostic, and 16% atheist.
The demographics of Microworkers.com differ from those of Mechanical Turk in that the US and India together account for only 25% of workers; 197 countries are represented among users, with Indonesia (18%) and Bangladesh (17%) contributing the largest share. However, 28% of employers are from the US.[169]

Another study of the demographics of the crowd at iStockphoto found a crowd that was largely white, middle- to upper-class, higher educated, worked in a so-called "white-collar job", and had a high-speed Internet connection at home.[170] In a 30-day crowdsourcing diary study in Europe, the participants were predominantly higher-educated women.[125]

Studies have also found that crowds are not simply collections of amateurs or hobbyists. Rather, crowds are often professionally trained in a discipline relevant to a given crowdsourcing task and sometimes hold advanced degrees and many years of experience in the profession.[170][171][172][173] Claiming that crowds are amateurs rather than professionals is both factually untrue and may lead to the marginalization of crowd labor rights.[174]

Gregory Saxton et al. studied the role of community users, among other elements, in a content analysis of 103 crowdsourcing organizations. They developed a taxonomy of nine crowdsourcing models (intermediary model, citizen media production, collaborative software development, digital goods sales, product design, peer-to-peer social financing, consumer report model, knowledge base building model, and collaborative science project model) in which to categorize the roles of community users (such as researcher, engineer, programmer, journalist, graphic designer, etc.) and the products and services developed.[175]

Motivations

Further information: Online participation § Motivations

Contributors

Many researchers suggest that both intrinsic and extrinsic motivations cause people to contribute to crowdsourced tasks, and that these factors influence different types of contributors.[93][170][171][173][176][177][178][179] For example, people employed in a full-time position rate human capital advancement as less important than part-time workers do, while women rate social contact as more important than men do.[177]

Intrinsic motivations are broken down into two categories: enjoyment-based and community-based motivations. Enjoyment-based motivations refer to motivations related to the fun and enjoyment that contributors experience through their participation. These motivations include skill variety, task identity, task autonomy, direct feedback from the job, and taking the job as a pastime.[citation needed] Community-based motivations refer to motivations related to community participation, and include community identification and social contact. In crowdsourced journalism, the motivation factors are intrinsic: the crowd is driven by the possibility to make social impact, contribute to social change, and help their peers.[176]

Extrinsic motivations are broken down into three categories: immediate payoffs, delayed payoffs, and social motivations. Immediate payoffs, through monetary payment, are the immediately received compensations given to those who complete tasks. Delayed payoffs are benefits that can be used to generate future advantages, such as training skills and being noticed by potential employers. Social motivations are the rewards of behaving pro-socially,[180] such as the altruistic motivations of online volunteers. Chandler and Kapelner found that US users of Amazon Mechanical Turk were more likely to complete a task when told they were going to "help researchers identify tumor cells" than when they were not told the purpose of the task.
However, among those who completed the task, the quality of output did not depend on the framing.[181] Motivation in crowdsourcing is often a mix of intrinsic and extrinsic factors.[182] In a crowdsourced law-making project, the crowd was motivated by both intrinsic and extrinsic factors. Intrinsic motivations included fulfilling civic duty, affecting the law for sociotropic reasons, and deliberating with and learning from peers. Extrinsic motivations included changing the law for financial gain or other benefits. Participation in crowdsourced policy-making was an act of grassroots advocacy, whether to pursue one's own interest or more altruistic goals, such as protecting nature.[183] Participants in online research studies report their motivation as both intrinsic enjoyment and monetary gain.[184][185][167]

Another form of social motivation is prestige or status. The International Children's Digital Library recruited volunteers to translate and review books. Because all translators receive public acknowledgment for their contributions, Kaufman and Schulz cite this as a reputation-based strategy to motivate individuals who want to be associated with institutions that have prestige. Mechanical Turk uses reputation as a motivator in a different sense, as a form of quality control. Crowdworkers who frequently complete tasks in ways judged to be inadequate can be denied access to future tasks, whereas workers who pay close attention may be rewarded by gaining access to higher-paying tasks or by being placed on an "approved list" of workers. This system may incentivize higher-quality work.[186] However, this system only works when requesters reject bad work, which many do not.[187]

Despite the potential global reach of IT applications online, recent research illustrates that differences in location affect participation outcomes in IT-mediated crowds.[188]

Limitations and controversies

At least six major topics cover the limitations and controversies about crowdsourcing:

- Impact of crowdsourcing on product quality
- Entrepreneurs contribute less capital themselves
- Increased number of funded ideas
- The value and impact of the work received from the crowd
- The ethical implications of low wages paid to workers
- Trustworthiness and informed decision making

Impact of crowdsourcing on product quality

Crowdsourcing allows anyone to participate, allowing for many unqualified participants and resulting in large quantities of unusable contributions. Companies, or additional crowdworkers, then have to sort through the low-quality contributions. The task of sorting through crowdworkers' contributions, along with the necessary job of managing the crowd, requires companies to hire actual employees, thereby increasing management overhead.[189] For example, susceptibility to faulty results can be caused by targeted, malicious work efforts. Since crowdworkers completing microtasks are paid per task, a financial incentive often causes workers to complete tasks quickly rather than well.[46] Verifying responses is time-consuming, so employers often depend on having multiple workers complete the same task to correct errors; however, having each task completed multiple times increases time and monetary costs.[190] Some companies, like CloudResearch, control data quality by repeatedly vetting crowdworkers to ensure they are paying attention and providing high-quality work.[187] A simple redundancy-based scheme is sketched below.
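The redundancy strategy mentioned above can be as simple as majority voting with an agreement threshold. A minimal sketch with hypothetical labels; raising the threshold buys confidence at the cost of more discarded (and re-purchased) answers:

```python
from collections import Counter

def aggregate(labels, min_agreement=0.6):
    """Majority-vote over redundant labels for one item. Returns the
    winning label, or None if agreement is too weak to trust."""
    top_label, count = Counter(labels).most_common(1)[0]
    return top_label if count / len(labels) >= min_agreement else None

# Hypothetical: five workers label the same image.
print(aggregate(["cat", "cat", "dog", "cat", "cat"]))   # 'cat'
print(aggregate(["cat", "dog", "bird", "dog", "cat"]))  # None (no consensus)
```

Items that fail the threshold are typically re-issued to more workers, which is exactly the time and monetary cost the passage above describes.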
Crowdsourcing quality is also affected by task design. Lukyanenko et al.[191] argue that the prevailing practice of modeling crowdsourcing data collection tasks in terms of fixed classes (options) unnecessarily restricts quality. Their results demonstrate that information accuracy depends on the classes used to model domains, with participants providing more accurate information when classifying phenomena at a more general level, which is typically less useful to sponsor organizations and hence less common.[clarification needed] Further, greater overall accuracy is expected when participants can provide free-form data, compared to tasks in which they select from constrained choices. In behavioral science research, it is often recommended to include open-ended responses, in addition to other forms of attention checks, to assess data quality.[192][193]

Just as limiting, oftentimes there is not enough skill or expertise in the crowd to successfully accomplish the desired task. While this scenario does not affect simple tasks such as image labeling, it is particularly problematic for more complex tasks, such as engineering design or product validation. A comparison between the evaluation of business models by experts and by an anonymous online crowd showed that an anonymous online crowd cannot evaluate business models to the same level as experts.[194] In these cases, it may be difficult or even impossible to find the qualified people in the crowd, as their responses represent only a small fraction of the workers compared to consistent, but incorrect, crowd members.[195] However, if the task is intermediate in its difficulty, estimating crowdworkers' skills and intentions and leveraging them to infer true responses works well,[196] albeit with an additional computation cost;[citation needed] a sketch of this reliability-weighted inference follows.
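Methods of this kind (for example, estimators in the spirit of Dawid and Skene's classic model; the sketch below is only an illustration, not the algorithm evaluated in the cited study) typically alternate between two steps: infer each item's answer from reliability-weighted votes, then re-estimate each worker's reliability by how often they agree with those inferred answers. Worker names and labels are hypothetical.

```python
from collections import defaultdict

def infer_answers(votes, rounds=10):
    """votes: {item: {worker: label}}. Iteratively estimate worker
    reliability and reliability-weighted consensus answers."""
    workers = {w for v in votes.values() for w in v}
    reliability = {w: 1.0 for w in workers}  # start by trusting everyone
    answers = {}
    for _ in range(rounds):
        # Step 1: reliability-weighted vote per item.
        for item, worker_labels in votes.items():
            scores = defaultdict(float)
            for worker, label in worker_labels.items():
                scores[label] += reliability[worker]
            answers[item] = max(scores, key=scores.get)
        # Step 2: reliability = share of items where the worker
        # agrees with the current consensus.
        for w in workers:
            graded = [votes[i][w] == answers[i] for i in votes if w in votes[i]]
            reliability[w] = sum(graded) / len(graded)
    return answers, reliability

# Hypothetical: worker "w3" answers erratically; weighting discounts them.
votes = {
    "q1": {"w1": "A", "w2": "A", "w3": "B"},
    "q2": {"w1": "C", "w2": "C", "w3": "C"},
    "q3": {"w1": "B", "w2": "B", "w3": "A"},
}
answers, reliability = infer_answers(votes)
print(answers)      # {'q1': 'A', 'q2': 'C', 'q3': 'B'}
print(reliability)  # w1 and w2 near 1.0; w3 lower
```

The extra iteration is the "additional computation cost" noted above: instead of one pass of counting, the estimator must revisit all votes several times until the reliabilities stabilize.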
Crowdworkers are a nonrandom sample of the population. Many researchers use crowdsourcing to quickly and cheaply conduct studies with larger sample sizes than would otherwise be achievable. However, due to limited access to the Internet, participation in less developed countries is relatively low. Participation in highly developed countries is similarly low, largely because the low amount of pay is not a strong motivation for most users in these countries. These factors lead to a bias in the population pool towards users in medium-developed countries, as ranked by the Human Development Index.[197] Participants in these countries sometimes masquerade as U.S. participants to gain access to certain tasks. This led to the "bot scare" on Amazon Mechanical Turk in 2018, when researchers thought bots were completing research surveys, due to the lower quality of responses originating from medium-developed countries.[193][198]

The likelihood that a crowdsourced project will fail due to lack of monetary motivation or too few participants increases over the course of the project. Tasks that are not completed quickly may be forgotten, buried by filters and search procedures. This results in a long-tail, power-law distribution of completion times.[199] Additionally, low-paying research studies online have higher rates of attrition, with participants not completing the study once started.[47]

Even when tasks are completed, crowdsourcing does not always produce quality results. When Facebook began its localization program in 2008, it encountered some criticism for the low quality of its crowdsourced translations.[200] One of the problems with crowdsourcing products is the lack of interaction between the crowd and the client. Usually, little information is known about the final product, and workers rarely interact with the final client in the process. This can decrease the quality of the product, because client interaction is considered to be a vital part of the design process.[201]

An additional cause of the decrease in product quality that can result from crowdsourcing is the lack of collaboration tools. In a typical workplace, coworkers are organized in such a way that they can work together and build upon each other's knowledge and ideas. Furthermore, the company often provides employees with the necessary information, procedures, and tools to fulfill their responsibilities. However, in crowdsourcing, crowdworkers are left to depend on their own knowledge and means to complete tasks.[189]

A crowdsourced project is usually expected to be unbiased by virtue of incorporating a large population of participants with diverse backgrounds. However, most crowdsourced work is done by people who are paid or who directly benefit from the outcome (e.g., most open source projects working on Linux). In many other cases, the end product is the outcome of a single person's endeavor, who creates the majority of the product, while the crowd only participates in minor details.[202]

Entrepreneurs contribute less capital themselves

To turn an idea into reality, the first component needed is capital. Depending on the scope and complexity of the crowdsourced project, the amount of necessary capital can range from a few thousand dollars to hundreds of thousands, if not more. The capital-raising process can take from days to months, depending on different variables, including the entrepreneur's network and the amount of initial self-generated capital.[citation needed]

The crowdsourcing process allows entrepreneurs to access a wide range of investors who can take different stakes in the project.[203] In effect, crowdsourcing simplifies the capital-raising process and allows entrepreneurs to spend more time on the project itself and on reaching milestones, rather than dedicating time to getting it started. Overall, the simplified access to capital can save the time needed to start projects and potentially increase the efficiency of projects.[citation needed]

Others argue that easier access to capital through a large number of smaller investors can hurt the project and its creators. With a simplified capital-raising process involving more investors with smaller stakes, investors are more risk-seeking, because they can take on an investment size with which they are comfortable.[203] This leads to entrepreneurs losing possible experience convincing investors who are wary of potential risks in investing, because they do not depend on one single investor for the survival of their project. Instead of the entrepreneur being forced to assess risks and convince large institutional investors of why their project can be successful, wary investors can be replaced by others who are willing to take on the risk.

Some translation companies and translation tool consumers pretend to use crowdsourcing as a means of drastically cutting costs, instead of hiring professional translators. This situation has been systematically denounced by IAPTI and other translator organizations.[204]

Increased number of funded ideas

The raw number of ideas that get funded, and the quality of those ideas, is a large controversy over the issue of crowdsourcing. Proponents argue that crowdsourcing is beneficial because it allows the formation of startups with niche ideas that would not survive venture capitalist or angel funding, which are oftentimes the primary investors in startups. Many ideas are scrapped in their infancy due to insufficient support and lack of capital, but crowdsourcing allows these ideas to be started if an entrepreneur can find a community to take interest in the project.[205] Crowdsourcing allows those who would benefit from the project to fund it and become a part of it, which is one way for small niche ideas to get started.[206]
However, as the number of projects grows, the number of failures also increases. Crowdsourcing assists the development of niche and high-risk projects due to a perceived need from a select few who seek the product. With high risk and small target markets, the pool of crowdsourced projects faces a greater possible loss of capital, lower returns, and lower levels of success.[207]

Concerns

Because crowdworkers are considered independent contractors rather than employees, they are not guaranteed minimum wage. In practice, workers using Amazon Mechanical Turk generally earn less than minimum wage. In 2009, it was reported that United States Turk users earned an average of $2.30 per hour for tasks, while users in India earned an average of $1.58 per hour, which is below minimum wage in the United States (but not in India).[165][208] In 2018, a survey of 2,676 Amazon Mechanical Turk workers doing 3.8 million tasks found that the median hourly wage was approximately $2 per hour, and only 4% of workers earned more than the federal minimum wage of $7.25 per hour.[209] Some researchers who have considered using Mechanical Turk to get participants for research studies have argued that the wage conditions might be unethical.[47][210] However, according to other research, workers on Amazon Mechanical Turk do not feel they are exploited, and are ready to participate in crowdsourcing activities in the future.[211] A more recent study, using stratified random sampling to access a representative sample of Mechanical Turk workers, found that the U.S. MTurk population is financially similar to the general population.[167] Workers tend to participate in tasks as a form of paid leisure and to supplement their primary income, and only 7% view it as a full-time job. Overall, workers rated MTurk as less stressful than other jobs. Workers also earn more than previously reported, about $6.50 per hour. They see MTurk as part of the solution to their financial situation and report rare upsetting experiences. They also perceive requesters on MTurk as fairer and more honest than employers outside of the platform.[167]

When Facebook began its localization program in 2008, it received criticism for using free labor in crowdsourcing the translation of site guidelines.[200]

Typically, no written contracts, nondisclosure agreements, or employee agreements are made with crowdworkers. For users of Amazon Mechanical Turk, this means that employers decide whether users' work is acceptable, and they reserve the right to withhold pay if it does not meet their standards.[212] Critics say that crowdsourcing arrangements exploit individuals in the crowd, and a call has been made for crowds to organize for their labor rights.[213][174][214]

Collaboration between crowd members can also be difficult or even discouraged, especially in the context of competitive crowdsourcing. The crowdsourcing site InnoCentive allows organizations to solicit solutions to scientific and technological problems; only 10.6% of respondents reported working in a team on their submission.[171] Amazon Mechanical Turk workers collaborated with academics to create a platform, WeAreDynamo.org, that allowed them to organize and create campaigns to better their work situation, but the site is no longer running.[215] Another platform run by Amazon Mechanical Turk workers and academics, Turkopticon, continues to operate and provides worker reviews of Amazon Mechanical Turk employers.[216]
See also

Citizen science – Scientific research conducted in whole or in part by amateur or nonprofessional scientists
Clickworkers – Citizen science project by NASA
Collaborative innovation network
Collaborative mapping – Aggregation of web mapping and user content
Collective consciousness – Shared beliefs and ideas in society
Collective intelligence – Group intelligence that emerges from collective efforts
Collective problem solving – Approaches to problem solving
Commons-based peer production – Method of producing value
Crowd computing
Crowdcasting – Intersection of broadcasting and crowdsourcing
Crowdfixing
Crowdsourcing software development
Distributed thinking – Computer science technique
Distributed Proofreaders – Web-based proofreading project
Flash mob – Form of assembling humans
Folksonomy
Gamification – Using game design elements in non-game contexts
Government crowdsourcing
List of crowdsourcing projects
Models of collaborative tagging
Microcredit – Small loans to impoverished borrowers
Participatory democracy – Model of democracy
Participatory monitoring
Smart mob – Digitally coordinated group
Social collaboration
Stone Soup – European folk story
Truecaller – Mobile phone application
Virtual collective consciousness
Virtual volunteering – Volunteering conducted at least partially via the internet
Wisdom of the crowd – Collective perception of a group of people
Wiki survey – Survey method for crowdsourcing opinions
Crowdsource (app) – Crowdsourcing platform developed by Google

References

1. Schenk, Eric; Guittard, Claude (1 January 2009). "Crowdsourcing: What can be Outsourced to the Crowd, and Why?". Center for Direct Scientific Communication. Retrieved 1 October 2018 – via HAL.
2. Hirth, Matthias; Hossfeld, Tobias; Tran-Gia, Phuoc (2011). "Anatomy of a Crowdsourcing Platform Using the Example of Microworkers.com" (PDF). 2011 Fifth International Conference on Innovative Mobile and Internet Services in Ubiquitous Computing. pp. 322–329. doi:10.1109/IMIS.2011.89. ISBN 978-1-61284-733-7. S2CID 12955095. Archived from the original (PDF) on 22 November 2015; retrieved 5 September 2015.
3. Estellés-Arolas, Enrique; González-Ladrón-de-Guevara, Fernando (2012). "Towards an Integrated Crowdsourcing Definition" (PDF). Journal of Information Science. 38 (2): 189–200. doi:10.1177/0165551512437638. hdl:10251/56904. S2CID 18535678.
4. Brabham, D. C. (2013). Crowdsourcing. Cambridge, Massachusetts; London, England: The MIT Press.
5. Brabham, D. C. (2008). "Crowdsourcing as a Model for Problem Solving: An Introduction and Cases". Convergence: The International Journal of Research into New Media Technologies. 14 (1): 75–90. CiteSeerX 10.1.1.175.1623. doi:10.1177/1354856507084420. S2CID 145310730.
6. Prpic, J.; Shukla, P. (2016). "Crowd Science: Measurements, Models and Methods". Proceedings of the 49th Annual Hawaii International Conference on System Sciences. Kauai, Hawaii: IEEE Computer Society.
7. Buettner, Ricardo (2015). "A Systematic Literature Review of Crowdsourcing Research from a Human Resource Management Perspective". 48th Annual Hawaii International Conference on System Sciences. Kauai, Hawaii: IEEE. pp. 4609–4618. doi:10.13140/2.1.2061.1845. ISBN 978-1-4799-7367-5.
8. Prpic, John; Taeihagh, Araz; Melton, James (September 2015). "The Fundamentals of Policy Crowdsourcing". Policy & Internet. 7 (3): 340–361. arXiv:1802.04143. doi:10.1002/poi3.102. S2CID 3626608.
9. Afuah, A.; Tucci, C. L. (2012). "Crowdsourcing as a Solution to Distant Search". Academy of Management Review. 37 (3): 355–375. doi:10.5465/amr.2010.0146.
10. de Vreede, T.; Nguyen, C.; de Vreede, G. J.; Boughzala, I.; Oh, O.; Reiter-Palmon, R. (2013). "A Theoretical Model of User Engagement in Crowdsourcing". Collaboration and Technology. Springer Berlin Heidelberg. pp. 94–109.
11. Sarin, Supheakmungkol; Pipatsrisawat, Knot; Pham, Khiem; Batra, Anurag; Valente, Luis (2019). "Crowdsource by Google: A Platform for Collecting Inclusive and Representative Machine Learning Data" (PDF). AAAI HCOMP 2019.
12. Liu, Wei; Moultrie, James; Ye, Songhe (4 May 2019). "The Customer-Dominated Innovation Process: Involving Customers as Designers and Decision-Makers in Developing New Product". The Design Journal. 22 (3): 299–324. doi:10.1080/14606925.2019.1592324. ISSN 1460-6925. S2CID 145931864.
13. Schlagwein, Daniel; Bjørn-Andersen, Niels (2014). "Organizational Learning with Crowdsourcing: The Revelatory Case of LEGO" (PDF). Journal of the Association for Information Systems. 15 (11): 754–778. doi:10.17705/1jais.00380.
14. Taeihagh, Araz (19 June 2017). "Crowdsourcing, Sharing Economies and Development". Journal of Developing Societies. 33 (2). arXiv:1707.06603. doi:10.1177/0169796x17710072. S2CID 32008949.
15. Howe, Jeff (2006). "The Rise of Crowdsourcing". Wired.
16. Howe, Jeff (2 June 2006). "Crowdsourcing: A Definition". Crowdsourcing Blog. Retrieved 2 January 2013.
17. Brabham, Daren (2008). "Crowdsourcing as a Model for Problem Solving: An Introduction and Cases" (PDF). Convergence: The International Journal of Research into New Media Technologies. 14 (1): 75–90. CiteSeerX 10.1.1.175.1623. doi:10.1177/1354856507084420. S2CID 145310730. Archived from the original (PDF) on 2 August 2012.
18. Guth, Kristen L.; Brabham, Daren C. (4 August 2017). "Finding the diamond in the rough: Exploring communication and platform in crowdsourcing performance". Communication Monographs. 84 (4): 510–533. doi:10.1080/03637751.2017.1359748. ISSN 0363-7751. S2CID 54045924.
19. "A Brief History of Crowdsourcing [Infographic]". Crowdsourcing.org. 18 March 2012. Archived from the original on 3 July 2015; retrieved 2 July 2015.
20. Hern, Chester G. (2002). Tracks in the Sea. McGraw Hill. pp. 123 & 246. ISBN 0-07-136826-4.
21. "Smithsonian Crowdsourcing Since 1849". Smithsonian Institution Archives. 14 April 2011. Retrieved 24 August 2018.
22. Clark, Catherine E. (25 April 1970). "'C'était Paris en 1970'". Études Photographiques (31). Retrieved 2 July 2015.
23. Axelrod, R. (1980). "Effective choice in the Prisoner's Dilemma". Journal of Conflict Resolution. 24 (1): 3–25. doi:10.1177/002200278002400101. S2CID 143112198.
24. "UNV Online Volunteering Service: History". Onlinevolunteering.org. Archived from the original on 2 July 2015; retrieved 2 July 2015.
25. "Wired 14.06: The Rise of Crowdsourcing". Archive.wired.com. 4 January 2009. Retrieved 2 July 2015.
26. Lih, Andrew (2009). The Wikipedia Revolution: How a Bunch of Nobodies Created the World's Greatest Encyclopedia (1st ed.). New York: Hyperion. ISBN 978-1401303716.
27. Lakhani, K. R.; Garvin, D. A.; Lonstein, E. (January 2010). "TopCoder (A): Developing Software through Crowdsourcing". Harvard Business School Case 610-032.
28. Phadnis, Shilpa (21 October 2016). "Appirio's TopCoder too is a big catch for Wipro". The Times of India. Retrieved 30 April 2018.
29. "For The Love Of Open Mapping Data". TechCrunch. 9 August 2014. Retrieved 23 July 2019.
30. "Crowdsourcing: Back Up Timeline, Early Stories". Archived from the original on 29 November 2014.
31. "Amazon Mechanical Turk". www.mturk.com. Retrieved 25 November 2022.
32. Garrigos-Simon, Fernando J.; Gil-Pechuán, Ignacio; Estelles-Miguel, Sofia (2015). Advances in Crowdsourcing. Springer. ISBN 9783319183411.
33. "Antoine Jean-Baptiste Robert Auget, Baron de Montyon". New Advent. Retrieved 25 February 2012.
34. "It Was All About Alkali". Chemistry Chronicles. Retrieved 25 February 2012.
35. "Nicolas Appert". John Blamire. Retrieved 25 February 2012.
36. "9 Examples of Crowdsourcing Before 'Crowdsourcing' Existed". MemeBurn. 15 September 2011. Retrieved 25 February 2012.
37. Pande, Shamni (25 May 2013). "The People Know Best". Business Today. India: Living Media India Limited.
38. Noveck, Beth Simone (2009). Wiki Government: How Technology Can Make Government Better, Democracy Stronger, and Citizens More Powerful. Brookings Institution Press.
39. Sarasua, Cristina; Simperl, Elena; Noy, Natalya F. (2012). "Crowdsourcing Ontology Alignment with Microtasks" (PDF). Institute AIFB, Karlsruhe Institute of Technology. p. 2.
40. Hollow, Matthew (20 April 2013). "Crowdfunding and Civic Society in Europe: A Profitable Partnership?". Open Citizenship. Retrieved 29 April 2013.
41. "Federal Transit Administration Public Transportation Participation Pilot Program". U.S. Department of Transportation. Archived from the original on 7 January 2009.
42. "Peer-to-Patent Community Patent Review Project". Peer-to-Patent Community Patent Review Project.
43. Callison-Burch, C.; Dredze, M. (2010). "Creating Speech and Language Data With Amazon's Mechanical Turk" (PDF). Human Language Technologies Conference: 1–12. Archived from the original (PDF) on 2 August 2012; retrieved 28 February 2012.
44. McGraw, I.; Seneff, S. (2011). "Growing a Spoken Language Interface on Amazon Mechanical Turk" (PDF). Interspeech: 3057–3060. doi:10.21437/Interspeech.2011-765.
45. Kittur, A.; Chi, E. H.; Suh, B. (2008). "Crowdsourcing user studies with Mechanical Turk" (PDF). CHI 2008.
46. Litman, Leib; Robinson, Jonathan (2020). Conducting Online Research on Amazon Mechanical Turk and Beyond. SAGE Publications. ISBN 978-1506391137.
47. Mason, W.; Suri, S. (2010). "Conducting Behavioral Research on Amazon's Mechanical Turk". Behavior Research Methods. SSRN 1691163.
48. Koblin, A. (2008). "The sheep market". Creativity and Cognition: 451–452. doi:10.1145/1640233.1640348. ISBN 9781605588650. S2CID 20609292.
49. "explodingdog" (2015). Explodingdog.com. Retrieved 2 July 2015.
50. DeVun, Leah (19 November 2009). "Looking at how crowds produce and present art". Wired News. Archived from the original on 24 October 2012; retrieved 26 February 2012.
51. Linver, D. (2010). "Crowdsourcing and the Evolving Relationship between Art and Artist". Archived from the original on 14 July 2014; retrieved 28 February 2012.
52. "Why". INRIX.com. 13 September 2014. Archived from the original on 12 October 2014; retrieved 2 July 2015.
53. Vergano, Dan (30 August 2014). "1833 Meteor Storm Started Citizen Science". National Geographic: StarStruck. Retrieved 18 September 2014.
54. "Gateway to Astronaut Photography of Earth". NASA.
55. McLaughlin, Elliot. "Image Overload: Help us sort it all out, NASA requests". CNN.com. Retrieved 18 September 2014.
56. Liu, Huiying; Xie, Qian Wen; Lou, Vivian W. Q. (1 April 2019). "Everyday social interactions and intra-individual variability in affect: A systematic review and meta-analysis of ecological momentary assessment studies". Motivation and Emotion. 43 (2): 339–353. doi:10.1007/s11031-018-9735-x. ISSN 1573-6644. S2CID 254827087.
57. Luong, Raymond; Lomanowska, Anna M. (2021). "Evaluating Reddit as a Crowdsourcing Platform for Psychology Research Projects". Teaching of Psychology. 49 (4): 329–337. doi:10.1177/00986283211020739. ISSN 0098-6283. S2CID 236414676.
58. Brown, Joshua K.; Hohman, Zachary P. (2022). "Extreme party animals: Effects of political identification and ideological extremity". Journal of Applied Social Psychology. 52 (5): 351–362. doi:10.1111/jasp.12863. ISSN 0021-9029. S2CID 247077069.
59. Vaterlaus, J. Mitchell; Patten, Emily V.; Spruance, Lori A. (26 May 2022). "Alonetogether: An Exploratory Study of Social Media Use at the Beginning of the COVID-19 Pandemic". The Journal of Social Media in Society. 11 (1): 27–45. ISSN 2325-503X.
60. Després, Jacques; Hadjsaid, Nouredine; Criqui, Patrick; Noirot, Isabelle (1 February 2015). "Modelling the impacts of variable renewable sources on the power sector: reconsidering the typology of energy modelling tools". Energy. 80: 486–495. doi:10.1016/j.energy.2014.12.005. ISSN 0360-5442.
61. "OpenEI – Energy Information, Data, and other Resources". OpenEI. Retrieved 26 September 2016.
62. Garvin, Peggy (12 December 2009). "New Gateway: Open Energy Info". SLA Government Information Division. Dayton, Ohio, USA. Retrieved 26 September 2016.
63. Brodt-Giles, Debbie (2012). "WREF 2012: OpenEI – an open energy data and information exchange for international audiences" (PDF). Golden, Colorado, USA: National Renewable Energy Laboratory (NREL). Archived from the original (PDF) on 9 October 2016; retrieved 24 September 2016.
64. Davis, Chris; Chmieliauskas, Alfredas; Dijkema, Gerard; Nikolic, Igor. "Enipedia". Delft, The Netherlands: Energy and Industry group, Faculty of Technology, Policy and Management, TU Delft. Archived from the original on 10 June 2014; retrieved 7 October 2016.
65. Davis, Chris (2012). Making Sense of Open Data: From Raw Data to Actionable Insight (PhD thesis). Delft, The Netherlands: Delft University of Technology. Retrieved 2 October 2018. Chapter 9 discusses the initial development of Enipedia in depth.
66. "What Is the Four-Generation Program?". The Church of Jesus Christ of Latter-day Saints. Retrieved 30 January 2012.
67. King, Turi E.; Jobling, Mark A. (2009). "What's in a name? Y chromosomes, surnames and the genetic genealogy revolution". Trends in Genetics. 25 (8): 351–60. doi:10.1016/j.tig.2009.06.003. hdl:2381/8106. PMID 19665817. "The International Society of Genetic Genealogy (www.isogg.org) advocates the use of genetics as a tool for genealogical research, and provides a support network for genetic genealogists. It hosts the ISOGG Y-haplogroup tree, which has the virtue of being regularly updated."
68. Mendez, Fernando; et al. (28 February 2013). "An African American Paternal Lineage Adds an Extremely Ancient Root to the Human Y Chromosome Phylogenetic Tree". The American Journal of Human Genetics. 92 (3): 454–459. doi:10.1016/j.ajhg.2013.02.002. PMC 3591855. PMID 23453668.
69. Wells, Spencer (2013). "The Genographic Project and the Rise of Citizen Science". Southern California Genealogical Society (SCGS). Archived from the original on 10 July 2013; retrieved 10 July 2013.
70. "History of the Christmas Bird Count". Audubon Birds. birds.audubon.org. 22 January 2015. Retrieved 2 July 2015.
71. "Thank you". Audubon. 5 October 2017. Archived from the original on 24 August 2014.
72. "Home – ISCRAM2015 – University of Agder" (PDF). iscram2015.uia.no. Archived from the original (PDF) on 17 October 2016; retrieved 14 October 2016.
73. Aitamurto, Tanja (2015). "Motivation Factors in Crowdsourced Journalism: Social Impact, Social Change, and Peer Learning". International Journal of Communication. 9: 3523–3543.
74. Aitamurto, Tanja (2016). "Crowdsourcing as a Knowledge-Search Method in Digital Journalism: Ruptured Ideals and Blended Responsibility". Digital Journalism. 4 (2): 280–297. doi:10.1080/21670811.2015.1034807. S2CID 156243124.
75. Aitamurto, Tanja (2013). "Balancing between open and closed: co-creation in magazine journalism". Digital Journalism. 1 (2): 229–251. doi:10.1080/21670811.2012.750150. S2CID 62882093.
76. "AlgorithmWatch". AlgorithmWatch. 2022. Retrieved 18 May 2022.
77. "Overview in English". DataSkop. 2022. Retrieved 18 May 2022.
78. "FAQs". Mozilla Rally. Retrieved 14 March 2023. "Mozilla Rally is currently available to US residents who are age 19 and older."
79. "It's your data. Use it for change". Mozilla Rally. Retrieved 14 March 2023.
80. Angus, Daniel (16 February 2022). "A data economy: the case for doing and knowing more about algorithms". Crikey. Retrieved 24 March 2022.
81. Burgess, Jean; Angus, Daniel; Carah, Nicholas; Andrejevic, Mark; Hawker, Kiah; Lewis, Kelly; Obeid, Abdul; Smith, Adam; Tan, Jane; Fordyce, Robbie; Trott, Verity (8 November 2021). "Critical simulation as hybrid digital method for exploring the data operations and vernacular cultures of visual social media platforms". SocArXiv. doi:10.31235/osf.io/2cwsu. S2CID 243837581.
82. The Markup (2022). "The Citizen Browser Project: Auditing the Algorithms of Disinformation". The Markup. Retrieved 18 May 2022.
83. Smith, Graham; Richards, Robert C.; Gastil, John (12 May 2015). "The Potential of Participedia as a Crowdsourcing Tool for Comparative Analysis of Democratic Innovations" (PDF). Policy & Internet. 7 (2): 243–262. doi:10.1002/poi3.93. ISSN 1944-2866.
84. Moon, M. Jae (2018). "Evolution of co-production in the information age: crowdsourcing as a model of web-based co-production in Korea". Policy and Society. 37 (3): 294–309. doi:10.1080/14494035.2017.1376475. ISSN 1449-4035. S2CID 158440300.
85. Taeihagh, Araz (8 November 2017). "Crowdsourcing: a new tool for policy-making?". Policy Sciences. 50 (4): 629–647. arXiv:1802.03113. doi:10.1007/s11077-017-9303-3. ISSN 0032-2687. S2CID 27696037.
86. Diamond, Larry; Whittington, Zak (2009). "Social Media". In Welzel, Christian; Haerpfer, Christian W.; Bernhagen, Patrick; Inglehart, Ronald F. (eds.). Democratization (2nd ed.). Oxford: Oxford University Press (published 2018). p. 256. ISBN 9780198732280. Retrieved 4 March 2021. "Another way that social media can contribute to democratization is by crowdsourcing information. This elicits the knowledge and wisdom of the crowd."
87. Aitamurto, Tanja (2012). Crowdsourcing for Democracy: New Era in Policy-Making. Committee for the Future, Parliament of Finland. pp. 10–30. ISBN 978-951-53-3459-6.
88. Prpic, J.; Taeihagh, A.; Melton, J. (2014). "Crowdsourcing the Policy Cycle" (PDF). Collective Intelligence 2014, MIT Center for Collective Intelligence. Humancomputation.com. Archived from the original (PDF) on 24 June 2015; retrieved 2 July 2015.
89. Prpic, J.; Taeihagh, A.; Melton, J. (2014). "A Framework for Policy Crowdsourcing" (PDF). Oxford Internet Institute, University of Oxford – IPP 2014: Crowdsourcing for Politics and Policy. Ipp.oii.ox.ac.uk. Retrieved 2 October 2018.
90. Prpic, J.; Taeihagh, A.; Melton, J. (2014). "Experiments on Crowdsourcing Policy Assessment" (PDF). Oxford Internet Institute, University of Oxford – IPP 2014: Crowdsourcing for Politics and Policy. Ipp.oii.ox.ac.uk. Retrieved 2 July 2015.
91. Thapa, B.; Niehaves, B.; Seidel, C.; Plattfaut, R. (2015). "Citizen involvement in public sector innovation: Government and citizen perspectives". Information Polity. 20 (1): 3–17. doi:10.3233/IP-150351.
92. Aitamurto; Landemore (4 February 2015). "Five design principles for crowdsourced policymaking: Assessing the case of crowdsourced off-road traffic law reform in Finland". Journal of Social Media for Organizations. 1 (1): 1–19.
93. Aitamurto; Landemore; Galli (2016). "Unmasking the Crowd: Participants' Motivation Factors, Profile, and Expectations for Participation in Crowdsourced Policymaking". Information, Communication & Society. 20 (8): 1239–1260. doi:10.1080/1369118x.2016.1228993. S2CID 151989757 – via Routledge.
94. Aitamurto; Chen; Cherif; Galli; Santana (2016). "Making Sense of Crowdsourced Civic Data with Big Data Tools". Academic Mindtrek. doi:10.1145/2994310.2994366. S2CID 16855773 – via ACM Digital Archive.
95. Aitamurto, Tanja (31 January 2015). Crowdsourcing for Democracy: New Era in Policymaking. Committee for the Future, Parliament of Finland. ISBN 978-951-53-3459-6.
96. "Home". challenge.gov.
97. Nussbaum, Stan (2003). "Proverbial perspectives on pluralism". Connections: The Journal of the WEA Missions Committee. October: pp. 30–31.
98. "Oromo dictionary project". OromoDictionary.com. Retrieved 3 February 2014.
99. Albright, Eric; Hatton, John (2007). "Chapter 10: WeSay, a Tool for Engaging Native Speakers in Dictionary Building". hdl:10125/1368. ISBN 978-0-8248-3309-1.
100. "Developing ASL vocabulary for science and math". Washington.edu. 7 December 2012. Retrieved 3 February 2014.
101. Keuleers; et al. (February 2015). "Word knowledge in the crowd: Measuring vocabulary size and word prevalence in a massive online experiment". Quarterly Journal of Experimental Psychology. 68 (8): 1665–1692. doi:10.1080/17470218.2015.1022560. PMID 25715025. S2CID 4894686.
102. Bill, Jeremiah; Gong, He; Hamilton, Brooke; Hawthorn, Henry; et al. "The extension of positive anymore". Google Docs. Retrieved 27 September 2020.
103. "Pashto Proverb Collection project". AfghanProverbs.com. Archived from the original on 4 February 2014; retrieved 3 February 2014.
104. "Comparing methods of collecting proverbs" (PDF). gial.edu.
105. Zellem, Edward (2014). Mataluna: 151 Afghan Pashto Proverbs. Tampa, Florida: Culture Direct.
106. Zhai, Haijun; Lingren, Todd; Deleger, Louise; Li, Qi; Kaiser, Megan; Stoutenborough, Laura; Solti, Imre (2013). "Web 2.0-based crowdsourcing for high-quality gold standard development in clinical Natural Language Processing". Journal of Medical Internet Research. 15 (4): e73. doi:10.2196/jmir.2426. PMC 3636329. PMID 23548263.
107. Martin, Fred; Resnick, Mitchel (1993). "LEGO/Logo and Electronic Bricks: Creating a Scienceland for Children". Advanced Educational Technologies for Mathematics and Science. Berlin, Heidelberg: Springer. pp. 61–89. doi:10.1007/978-3-662-02938-1_2. ISBN 978-3-642-08152-1.
108. Reinhold, Stephan; Dolnicar, Sara (December 2017). "How Airbnb Creates Value". Peer-to-Peer Accommodation Networks. Goodfellow Publishers. doi:10.23912/9781911396512-3602. ISBN 9781911396512.
109. "Prime Panels by CloudResearch: Online Research Panel Recruitment". CloudResearch. Retrieved 12 January 2023.
110. Nunan, Daniel; Birks, David F.; Malhotra, Naresh K. (2020). Marketing Research: Applied Insight (6th ed.). Harlow, United Kingdom. ISBN 978-1-292-30872-2. OCLC 1128061550.
111. Parker, Christopher J.; May, Andrew; Mitchell, Val (November 2013). "The role of VGI and PGI in supporting outdoor activities". Applied Ergonomics. 44 (6): 886–894. doi:10.1016/j.apergo.2012.04.013. ISSN 0003-6870. PMID 22795180. S2CID 12918341.
112. Parker, Christopher J.; May, Andrew; Mitchell, Val (15 May 2014). "User-centred design of neogeography: the impact of volunteered geographic information on users' perceptions of online map 'mashups'". Ergonomics. 57 (7): 987–997. doi:10.1080/00140139.2014.909950. ISSN 0014-0139. PMID 24827070. S2CID 13458260.
113. Brown, Michael; Sharples, Sarah; Harding, Jenny; Parker, Christopher J. (November 2013). "Usability of Geographic Information: Current challenges and future directions" (PDF). Applied Ergonomics. 44 (6): 855–865. doi:10.1016/j.apergo.2012.10.013. PMID 23177775. S2CID 26412254. Archived from the original (PDF) on 19 July 2018; retrieved 20 August 2019.
114. Parker, Christopher J.; May, Andrew; Mitchell, Val (August 2012). "Understanding Design with VGI using an Information Relevance Framework". Transactions in GIS. 16 (4): 545–560. doi:10.1111/j.1467-9671.2012.01302.x. ISSN 1361-1682. S2CID 20100267.
115. Holley, Rose (March 2010). "Crowdsourcing: How and Why Should Libraries Do It?". D-Lib Magazine. 16 (3/4). doi:10.1045/march2010-holley. Retrieved 21 May 2021.
116. Trant, Jennifer (2009). "Tagging, Folksonomy and Art Museums: Results of steve.museum's research" (PDF). Archives & Museum Informatics. Archived from the original (PDF) on 10 February 2010; retrieved 21 May 2021.
117. Andro, M. (2018). Digital Libraries and Crowdsourcing. Wiley / ISTE. ISBN 9781786301611.
118. Rahman, Mahbubur; Blackwell, Brenna; Banerjee, Nilanjan; Saraswat, Dharmendra (2015). "Smartphone-based hierarchical crowdsourcing for weed identification". Computers and Electronics in Agriculture. 113: 14–23. doi:10.1016/j.compag.2014.12.012. Retrieved 12 August 2015.
119. Primarily on the Bridge Winners website.
120. Tang, Weiming; Han, Larry; Best, John; Zhang, Ye; Mollan, Katie; Kim, Julie; Liu, Fengying; Hudgens, Michael; Bayus, Barry (1 June 2016). "Crowdsourcing HIV Test Promotion Videos: A Noninferiority Randomized Controlled Trial in China". Clinical Infectious Diseases. 62 (11): 1436–1442. doi:10.1093/cid/ciw171. ISSN 1537-6591. PMC 4872295. PMID 27129465.
121. Zhang, Ye; Kim, Julie A.; Liu, Fengying; Tso, Lai Sze; Tang, Weiming; Wei, Chongyi; Bayus, Barry L.; Tucker, Joseph D. (November 2015). "Creative Contributory Contests to Spur Innovation in Sexual Health: 2 Cases and a Guide for Implementation". Sexually Transmitted Diseases. 42 (11): 625–628. doi:10.1097/OLQ.0000000000000349. ISSN 1537-4521. PMC 4610177. PMID 26462186.
122. Créquit, Perrine (2018). "Mapping of Crowdsourcing in Health: Systematic Review". Journal of Medical Internet Research. 20 (5): e187. doi:10.2196/jmir.9330. PMC 5974463. PMID 29764795.
123. van der Krieke; et al. (2015). "HowNutsAreTheDutch (HoeGekIsNL): A crowdsourcing study of mental symptoms and strengths" (PDF). International Journal of Methods in Psychiatric Research. 25 (2): 123–144. doi:10.1002/mpr.1495. PMC 6877205. PMID 26395198.
124. Prpic, J. (2015). "Health Care Crowds: Collective Intelligence in Public Health". Collective Intelligence 2015, Center for the Study of Complex Systems, University of Michigan. Papers.ssrn.com. SSRN 2570593.
125. van der Krieke, L.; Blaauw, F. J.; Emerencia, A. C.; Schenk, H. M.; Slaets, J. P.; Bos, E. H.; de Jonge, P.; Jeronimus, B. F. (2016). "Temporal Dynamics of Health and Well-Being: A Crowdsourcing Approach to Momentary Assessments and Automated Generation of Personalized Feedback" (PDF). Psychosomatic Medicine. 79 (2): 213–223. doi:10.1097/PSY.0000000000000378. PMID 27551988. S2CID 10955232.
126. van Ess, Henk (2010). "Crowdsourcing: how to find a crowd". ARD ZDF Akademie, Berlin. p. 99.
127. Doan, A.; Ramakrishnan, R.; Halevy, A. (2011). "Crowdsourcing Systems on the World Wide Web" (PDF). Communications of the ACM. 54 (4): 86–96. doi:10.1145/1924421.1924442. S2CID 207184672.
128. Brabham, Daren C. (2013). Crowdsourcing. MIT Press. p. 45.
129. Blohm, Ivo; Zogaj, Shkodran; Bretschneider, Ulrich; Leimeister, Jan Marco (2018). "How to Manage Crowdsourcing Platforms Effectively" (PDF). California Management Review. 60 (2): 122–149. doi:10.1177/0008125617738255. S2CID 73551209.
130. Howe, Jeff (2008). "Crowdsourcing: Why the Power of the Crowd is Driving the Future of Business" (PDF). The International Achievement Institute. Archived from the original (PDF) on 23 September 2015; retrieved 9 April 2012.
131. "Crowdvoting: How Elo Limits Disruption". thevisionlab.com. 25 May 2017.
132. Robson, John (24 February 2012). "IEM Demonstrates the Political Wisdom of Crowds". Canoe.ca. Archived from the original on 7 April 2012; retrieved 31 March 2012.
133. "4 Great Examples of Crowdsourcing through Social Media". digitalagencymarketing.com. 2012. Archived from the original on 1 April 2012; retrieved 29 March 2012.
134. Goldberg, Ken; Newsom, Gavin (12 June 2014). "Let's amplify California's collective intelligence". Citris-uc.org. Retrieved 14 June 2014.
135. Escoffier, N.; McKelvey, B. (2014). "Using 'Crowd-Wisdom Strategy' to Co-Create Market Value: Proof-of-Concept from the Movie Industry". In Wikstrom, P.; DeFillippi, R. (eds.). International Perspective on Business Innovation and Disruption in the Creative Industries: Film, Video, Photography. UK: Edward Elgar Publishing Ltd. Chap. 11.
136. Block, A. B. (21 April 2010). "How boxoffice trading could flop". The Hollywood Reporter.
137. Chen, A.; Panaligan, R. (2013). "Quantifying movie magic with Google search". Google White Paper, Industry Perspectives + User Insights.
138. Williams, Jack (17 February 2017). "An Indoor Football Team Has Its Fans Call the Plays". The New York Times. ISSN 0362-4331. Retrieved 7 February 2018.
139. Prive, Tanya. "What Is Crowdfunding And How Does It Benefit The Economy". Forbes.com. Retrieved 2 July 2015.
140. Choy, Katherine; Schlagwein, Daniel (2016). "Crowdsourcing for a better world: On the relation between IT affordances and donor motivations in charitable crowdfunding". Information Technology & People. 29 (1): 221–247. doi:10.1108/ITP-09-2014-0215.
141. Barnett, Chance. "Crowdfunding Sites In 2014". Forbes.com. Retrieved 2 July 2015.
142. Agrawal, Ajay; Catalini, Christian; Goldfarb, Avi (2014). "Some Simple Economics of Crowdfunding". National Bureau of Economic Research: 63–97.
143. Leimeister, J. M.; Huber, M.; Bretschneider, U.; Krcmar, H. (2009). "Leveraging Crowdsourcing: Activation-Supporting Components for IT-Based Ideas Competition". Journal of Management Information Systems. 26 (1): 197–224. doi:10.2753/mis0742-1222260108. S2CID 17485373.
144. Ebner, W.; Leimeister, J.; Krcmar, H. (2009). "Community Engineering for Innovations: The Ideas Competition as a method to nurture a Virtual Community for Innovations". R&D Management. 39 (4): 342–356. doi:10.1111/j.1467-9310.2009.00564.x. S2CID 16316321.
145. "DARPA Network Challenge". DARPA Network Challenge. Archived from the original on 11 August 2011; retrieved 28 November 2011.
146. "Social media web snares criminals". New Scientist. Retrieved 4 April 2012.
147. "Beyond XPrize: The 10 Best Crowdsourcing Tools and Technologies". 20 February 2012. Retrieved 30 March 2012.
148. Cunard, C. (19 July 2010). "The Movie Research Experience gets audiences involved in filmmaking". The Daily Bruin.
149. MacArthur, Kate. "Squadhelp wants your company to crowdsource better names (and avoid Boaty McBoatface)". chicagotribune.com. Retrieved 28 August 2017.
150. "Compete To Create Your Dream Home". FastCoexist.com. 4 June 2013. Retrieved 3 February 2014.
151. "Designers, clients forge ties on web". Boston Herald. 11 June 2012. Retrieved 3 February 2014.
152. Dolan, Shelagh. "Crowdsourced delivery explained: making same day shipping cheaper through local couriers". Business Insider. Archived from the original on 22 May 2018; retrieved 21 May 2018.
153. Murison, Malek (19 April 2018). "LivingPackets uses IoT, crowdshipping to transform deliveries". Internet of Business. Retrieved 19 April 2018.
154. Biller, David; Sciaudone, Christina (19 June 2018). "Goldman Sachs, Soros Bet on the Uber of Brazilian Trucking". Bloomberg. Retrieved 11 March 2019.
155. Tyrsina, Radu. "Parcl Uses Trusted Forwarders to Bring you Products that don't Ship to your Country". Technology Personalised. Archived from the original on 3 October 2015; retrieved 1 October 2015.
156. Geiger, D.; Rosemann, M.; Fielt, E. (2011). "Crowdsourcing information systems: a systems theory perspective". Proceedings of the 22nd Australasian Conference on Information Systems (ACIS 2011).
157. Powell, D. (2015). "A new tool for crowdsourcing". MIR (Modernizaciya, Innovacii, Razvitie). 6 (2–2): 22. ISSN 2079-4665.
158. Yang, J.; Adamic, L.; Ackerman, M. (2008). "Crowdsourcing and Knowledge Sharing: Strategic User Behavior on Taskcn" (PDF). Proceedings of the 9th ACM Conference on Electronic Commerce. doi:10.1145/1386790.1386829. S2CID 15553154. Archived from the original (PDF) on 29 July 2020; retrieved 28 February 2012.
159. "Mobile Crowdsourcing". Clickworker. Retrieved 10 December 2014.
160. Thebault-Spieker; Terveen; Hecht. "Avoiding the South Side and the Suburbs: The Geography of Mobile Crowdsourcing Markets".
161. Chatzimiloudis; Konstantinidis; Laoudias; Zeinalipour-Yazti. "Crowdsourcing with smartphones" (PDF).
162. Arkian, Hamid Reza; Diyanat, Abolfazl; Pourkhalili, Atefe (2017). "MIST: Fog-based data analytics scheme with cost-efficient resource provisioning for IoT crowdsensing applications". Journal of Network and Computer Applications. 82: 152–165. doi:10.1016/j.jnca.2017.01.012.
163. Felstiner, Alek (August 2011). "Working the Crowd: Employment and Labor Law in the Crowdsourcing Industry" (PDF). Berkeley Journal of Employment & Labor Law. 32: 150–151.
164. "View of Crowdsourcing: Libertarian Panacea or Regulatory Nightmare?". online.shc.com. Retrieved 26 May 2017.
165. Ross, J.; Irani, L.; Silberman, M. S.; Zaldivar, A.; Tomlinson, B. (2010). "Who are the Crowdworkers? Shifting Demographics in Mechanical Turk" (PDF). CHI 2010. Archived from the original (PDF) on 1 April 2011; retrieved 28 February 2012.
166. Huff, Connor; Tingley, Dustin (1 July 2015). "'Who are these people?' Evaluating the demographic characteristics and political preferences of MTurk survey respondents". Research & Politics. 2 (3). doi:10.1177/2053168015604648. ISSN 2053-1680. S2CID 7749084.
167. Moss, Aaron; Rosenzweig, Cheskie; Robinson, Jonathan; Jaffe, Shalom; Litman, Leib (2022). "Is it Ethical to Use Mechanical Turk for Behavioral Research? Relevant Data from a Representative Survey of MTurk Participants and Wages". psyarxiv.com. Retrieved 12 January 2023.
168. Levay, Kevin E.; Freese, Jeremy; Druckman, James N. (1 January 2016). "The Demographic and Political Composition of Mechanical Turk Samples". SAGE Open. 6 (1). doi:10.1177/2158244016636433. ISSN 2158-2440. S2CID 147299692.
169. Hirth, M.; Hossfeld, T.; Tran-Gia, P. (2011). "Human Cloud as Emerging Internet Application: Anatomy of the Microworkers Crowdsourcing Platform" (PDF).
170. Brabham, Daren C. (2008). "Moving the Crowd at iStockphoto: The Composition of the Crowd and Motivations for Participation in a Crowdsourcing Application". First Monday. 13 (6). doi:10.5210/fm.v13i6.2159. Archived from the original on 24 November 2012; retrieved 27 June 2012.
171. Lakhani; et al. (2007). "The Value of Openness in Scientific Problem Solving" (PDF). Retrieved 26 February 2012.
172. Brabham, Daren C. (2012). "Managing Unexpected Publics Online: The Challenge of Targeting Specific Groups with the Wide-Reaching Tool of the Internet". International Journal of Communication. 6: 20.
173. Brabham, Daren C. (2010). "Moving the Crowd at Threadless: Motivations for Participation in a Crowdsourcing Application". Information, Communication & Society. 13 (8): 1122–1145. doi:10.1080/13691181003624090. S2CID 143402410.
174. Brabham, Daren C. (2012). "The Myth of Amateur Crowds: A Critical Discourse Analysis of Crowdsourcing Coverage". Information, Communication & Society. 15 (3): 394–410. doi:10.1080/1369118X.2011.641991. S2CID 145675154.
175. Saxton; Oh; Kishore (2013). "Rules of Crowdsourcing: Models, Issues, and Systems of Control". Information Systems Management. 30 (2): 20. CiteSeerX 10.1.1.300.8026. doi:10.1080/10580530.2013.739883. S2CID 16811686.
176. Aitamurto, Tanja (2015). "Motivation Factors in Crowdsourced Journalism: Social Impact, Social Change, and Peer Learning". International Journal of Communication. 9: 3523–3543.
177. Kaufmann, N.; Schulze, T.; Veit, D. (2011). "More than fun and money: Worker Motivation in Crowdsourcing – A Study on Mechanical Turk" (PDF). Proceedings of the Seventeenth Americas Conference on Information Systems. Archived from the original (PDF) on 27 February 2012.
178. Brabham, Daren C. (2012). "Motivations for Participation in a Crowdsourcing Application to Improve Public Engagement in Transit Planning". Journal of Applied Communication Research. 40 (3): 307–328. doi:10.1080/00909882.2012.693940. S2CID 144807388.
179. Lietsala, Katri; Joutsen, Atte (2007). "Hang-a-rounds and True Believers: A Case Analysis of the Roles and Motivational Factors of the Star Wreck Fans". MindTrek 2007 Conference Proceedings.
180. "State of the World's Volunteerism Report 2011" (PDF). Unv.org. Archived from the original (PDF) on 2 December 2014; retrieved 1 July 2015.
181. Chandler, D.; Kapelner, A. (2010). "Breaking Monotony with Meaning: Motivation in Crowdsourcing Markets" (PDF). Journal of Economic Behavior & Organization. 90: 123–133. arXiv:1210.0962. doi:10.1016/j.jebo.2013.03.003. S2CID 8563262.
182. Aparicio, M.; Costa, C.; Braga, A. (2012). "Proposing a system to support crowdsourcing" (PDF). OSDOC '12: Proceedings of the Workshop on Open Source and Design of Communication. pp. 13–17. doi:10.1145/2316936.2316940. ISBN 9781450315258. S2CID 16494503.
183. Aitamurto; Landemore; Galli (2016). "Unmasking the Crowd: Participants' Motivation Factors, Expectations, and Profile in a Crowdsourced Law Reform". Information, Communication & Society.
184. Ipeirotis, Panagiotis G. (10 March 2010). "Demographics of Mechanical Turk".
185. Ross, Joel; Irani, Lilly; Silberman, M. Six; Zaldivar, Andrew; Tomlinson, Bill (10 April 2010). "Who are the crowdworkers? Shifting demographics in Mechanical Turk". CHI '10 Extended Abstracts on Human Factors in Computing Systems. New York, USA: Association for Computing Machinery. pp. 2863–2872. doi:10.1145/1753846.1753873. ISBN 978-1-60558-930-5. S2CID 11386257.
186. Quinn, Alexander J.; Bederson, Benjamin B. (2011). "Human Computation: A Survey and Taxonomy of a Growing Field" (PDF). CHI 2011, 7–12 May 2011, Vancouver, BC, Canada. Retrieved 30 June 2015.
187. Hauser, David J.; Moss, Aaron J.; Rosenzweig, Cheskie; Jaffe, Shalom N.; Robinson, Jonathan; Litman, Leib (3 November 2022). "Evaluating CloudResearch's Approved Group as a solution for problematic data quality on MTurk". Behavior Research Methods. doi:10.3758/s13428-022-01999-x. ISSN 1554-3528. PMID 36326997.
188. Prpic, J.; Shukla, P.; Roth, Y.; Lemoine, J. F. (2015). "A Geography of Participation in IT-Mediated Crowds". Proceedings of the Hawaii International Conference on Systems Sciences 2015. SSRN 2494537.
189. Borst, Irma. "The Case For and Against Crowdsourcing: Part 2". Archived from the original on 12 September 2015; retrieved 9 February 2015.
190. Ipeirotis; Provost; Wang (2010). "Quality Management on Amazon Mechanical Turk" (PDF). Archived from the original (PDF) on 9 August 2012; retrieved 28 February 2012.
191. Lukyanenko, Roman; Parsons, Jeffrey; Wiersma, Yolanda (2014). "The IQ of the Crowd: Understanding and Improving Information Quality in Structured User-Generated Content". Information Systems Research. 25 (4): 669–689. doi:10.1287/isre.2014.0537.
192. Hauser, David; Paolacci, Gabriele; Chandler, Jesse. "Evidence and Solutions". Handbook of Research Methods in Consumer Psychology. doi:10.4324/9781351137713-17. S2CID 150882624. Retrieved 12 January 2023.
193. Moss, Aaron J.; Rosenzweig, Cheskie; Jaffe, Shalom Noach; Gautam, Richa; Robinson, Jonathan; Litman, Leib (11 June 2021). "Bots or inattentive humans? Identifying sources of low-quality data in online platforms". doi:10.31234/osf.io/wr8ds. S2CID 236288817.
194. Goerzen, Thomas; Kundisch, Dennis (11 August 2016). "Can the Crowd Substitute Experts in Evaluation of Creative Ideas? An Experimental Study Using Business Models". AMCIS 2016 Proceedings.
195. Burnap, Alex; Ren, Alex J.; Papazoglou, Giannis; Gerth, Richard; Gonzalez, Richard; Papalambros, Panos. "When Crowdsourcing Fails: A Study of Expertise on Crowdsourced Design Evaluation" (PDF). Archived from the original (PDF) on 29 October 2015; retrieved 19 May 2015.
196. Kurve, Aditya; Miller, David J.; Kesidis, George (30 May 2014). "Multicategory Crowdsourcing Accounting for Variable Task Difficulty, Worker Skill, and Worker Intention". IEEE Transactions on Knowledge and Data Engineering: 99.
197. Hirth; Hossfeld; Tran-Gia (2011). "Human Cloud as Emerging Internet Application: Anatomy of the Microworkers Crowdsourcing Platform" (PDF).
198. Moss, Aaron (18 September 2018). "After the Bot Scare: Understanding What's Been Happening With Data Collection on MTurk and How to Stop It". CloudResearch. Retrieved 12 January 2023.
199. Ipeirotis, Panagiotis G. (2010). "Analyzing the Amazon Mechanical Turk Marketplace" (PDF). XRDS: Crossroads, The ACM Magazine for Students. 17 (2): 16–21. doi:10.1145/1869086.1869094. S2CID 6472586. SSRN 1688194. Retrieved 2 October 2018.
200. Hosaka, Tomoko A. (April 2008). "Facebook asks users to translate for free". NBC News.
201. Britt, Darice. "Crowdsourcing: The Debate Roars On". Archived from the original on 1 July 2014; retrieved 4 December 2012.
202. Woods, Dan (28 September 2009). "The Myth of Crowdsourcing". Forbes. Retrieved 4 December 2012.
203. Aitamurto, Tanja; Leiponen, Aija. "The Promise of Idea Crowdsourcing: Benefits, Contexts, Limitations". Ideasproject.com. Retrieved 2 July 2015.
204. "International Translators Association Launched in Argentina". Latin American Herald Tribune. Archived from the original on 11 March 2021; retrieved 23 November 2016.
205. Kleeman, Frank (2008). "Un(der)paid Innovators: The Commercial Utilization of Consumer Work through Crowdsourcing". Sti-studies.de. Retrieved 2 July 2015.
206. Jason (2011). "Crowdsourcing: A Million Heads is Better Than One". Crowdsourcing.org. Archived from the original on 3 July 2015; retrieved 2 July 2015.
207. Dupree, Steven (2014). "Crowdfunding 101: Pros and Cons". Gsb.stanford.edu. Retrieved 2 July 2015.
208. "Fair Labor Standards Act Advisor". Retrieved 28 February 2012.
209. Hara, Kotaro; Adams, Abigail; Milland, Kristy; Savage, Saiph; Callison-Burch, Chris; Bigham, Jeffrey P. (21 April 2018). "A Data-Driven Analysis of Workers' Earnings on Amazon Mechanical Turk". Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. New York, USA: ACM. pp. 1–14. doi:10.1145/3173574.3174023. ISBN 9781450356206. S2CID 5040507.
210. Norcie, Greg (2011). "Ethical and practical considerations for compensation of crowdsourced research participants". CHI WS on Ethics Logs and VideoTape: Ethics in Large Scale Trials & User Generated Content. p. 1. Archived 30 June 2012 at the Wayback Machine; accessed 30 June 2015.
211. Busarovs, Aleksejs (2013). "Ethical Aspects of Crowdsourcing, or is it a Modern Form of Exploitation" (PDF). International Journal of Economics & Business Administration. 1 (1): 3–14. doi:10.35808/ijeba/1. Retrieved 26 November 2014.
212. Paolacci, G.; Chandler, J.; Ipeirotis, P. G. (2010). "Running experiments on Amazon Mechanical Turk". Judgment and Decision Making. 5 (5): 411–419. doi:10.1017/S1930297500002205. hdl:1765/31983. S2CID 14476283.
213. Graham, Mark; Hjorth, Isis; Lehdonvirta, Vili (1 May 2017). "Digital labour and development: impacts of global digital labour platforms and the gig economy on worker livelihoods". Transfer: European Review of Labour and Research. 23 (2): 135–162. doi:10.1177/1024258916687250. ISSN 1024-2589. PMC 5518998. PMID 28781494.
214. "The Crowdsourcing Scam" (December 2014). The Baffler. No. 26.
215. Salehi; et al. (2015). "We Are Dynamo: Overcoming Stalling and Friction in Collective Action for Crowd Workers" (PDF). Retrieved 16 June 2015.
216. Irani, Lilly C.; Silberman, M. Six (27 April 2013). "Turkopticon". Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. New York, USA: ACM. pp. 611–620. doi:10.1145/2470654.2470742. ISBN 9781450318990. S2CID 207203679.
217. Reinhold, S.; Dolnicar, S. (2018). "How Airbnb creates value". In Dolnicar, S. (ed.). Peer-to-Peer Accommodation Networks. pp. 39–53.

External links

Crowdsourcing at Wikibooks
Media related to Crowdsourcing at Wikimedia Commons
