Crowdsourcing: Difference between revisions - Wikipedia



Alexsukh ([[User talk:Alexsukh|talk]] | [[Special:Contributions/Alexsukh|contribs]])

Line 43:

* 2000 – [[iStockPhoto]] was founded: The stock imagery website allows members of the public to contribute images and receive commissions for their contributions.<ref name="archive.wired.com">{{cite news|url=http://archive.wired.com/wired/archive/14.06/crowds.html |title=Wired 14.06: The Rise of Crowdsourcing |publisher=Archive.wired.com |date=2009-01-04 |accessdate=2015-07-02}}</ref>

* 2001 – Launch of [[Wikipedia]]: “Free-access, free content Internet encyclopedia”<ref>{{cite book|last1=Lih|first1=Andrew|title=The Wikipedia revolution : how a bunch of nobodies created the world's greatest encyclopedia|date=2009|publisher=Hyperion|location=New York|isbn=1401303714|edition=1st}}</ref>

* 2004 – [[Toyota]]’s first "Dream car art" contest: Children around the world were asked to draw their ‘dream car of the future.’<ref name="tiki-toki.com">[http://www.tiki-toki.com/timeline/entry/323158/Crowdsourcing-Back-Up-Timeline-Early-Stories/] {{webarchive |url=https://web.archive.org/web/20141129054631/http://www.tiki-toki.com/timeline/entry/323158/Crowdsourcing-Back-Up-Timeline-Early-Stories/ |date=November 29, 2014 }}</ref>

* 2005 – [[Kodak]]’s "Go for the Gold" contest: Kodak asked anyone to submit a picture of a personal victory.<ref name="tiki-toki.com"/>

* 2006 – Jeff Howe coined the term "crowdsourcing" in ''Wired''.<ref name="archive.wired.com"/>

Line 145:

===In ornithology===

Another early example of crowdsourcing occurred in the field of [[ornithology]]. On December 25, 1900, Frank Chapman, an early officer of the [[National Audubon Society]], initiated a tradition dubbed the [[Christmas Bird Count|"Christmas Day Bird Census"]]. The project called on birders from across North America to count and record the number of birds of each species they observed on Christmas Day. The project was successful, and the records from 27 different contributors were compiled into one bird census, which tallied around 90 species of birds.<ref>{{cite web|url=http://birds.audubon.org/history-christmas-bird-count |title=History of the Christmas Bird Count &#124; Audubon |publisher=Birds.audubon.org |date= |accessdate=2015-07-02}}</ref> This large-scale collection of data constituted an early form of citizen science, the premise upon which crowdsourcing is based. In the 2012 census, more than 70,000 individuals participated across 2,369 bird count circles.<ref>[http://www.audubon.org/thank-you-0] {{webarchive |url=https://web.archive.org/web/20140824051327/http://www.audubon.org/thank-you-0 |date=August 24, 2014 }}</ref> Christmas 2014 marked the National Audubon Society's 115th annual [[Christmas Bird Count]].

===In public policy===

Line 337:

==Limitations and controversies==

At least six major topics cover the limitations and controversies about crowdsourcing:

# Impact of crowdsourcing on product quality

# Entrepreneurs contribute less capital themselves

Line 343:

# The value and impact of the work received from the crowd

# The ethical implications of low wages paid to crowdworkers

# Trustworthiness and informed decision making

===Impact of crowdsourcing on product quality===

Line 378 ⟶ 379:

===Concerns===

Because crowdworkers are considered independent contractors rather than employees, they are not guaranteed [[minimum wage]]. In practice, workers using Amazon Mechanical Turk generally earn less than the minimum wage, with US users earning an average of $2.30 per hour for tasks in 2009 and users in India earning an average of $1.58 per hour, which is below minimum wage in the United States (but not in India).<ref name = ross /><ref>{{cite web |title = Fair Labor Standards Act Advisor |url= http://www.dol.gov/elaws/faq/esa/flsa/001.htm |accessdate=28 February 2012}}</ref> Some researchers who have considered using Mechanical Turk to recruit participants for research studies have argued that the wage conditions might be unethical.<ref name = mason /><ref name="Norcie">Greg Norcie, 2011, "Ethical and practical considerations for compensation of crowdsourced research participants," ''CHI WS on Ethics Logs and VideoTape: Ethics in Large Scale Trials & User Generated Content,'' [http://www.crowdsourcing.org/document/ethical-and-practical-considerations-for-compensation-of-crowdsourced-research-participants/3650], accessed 30 June 2015.</ref> However, according to other research, workers on Amazon Mechanical Turk do not feel that they are exploited and are ready to participate in crowdsourcing activities in the future.<ref>{{cite journal|last1=Busarovs|first1=Aleksejs|title=Ethical Aspects of Crowdsourcing, or is it a Modern Form of Exploitation| journal=International Journal of Economics & Business Administration | date=2013 | volume=1 | issue=1 |pages=3–14 |url=http://www.ijeba.com/documents/papers/2013_1_p1.pdf|accessdate=26 November 2014}}</ref> When Facebook began its localization program in 2008, it received criticism for using free labor in crowdsourcing the translation of site guidelines.<ref name= tomoko />

Typically, no written contracts, nondisclosure agreements, or employee agreements are made with crowdworkers. For users of Amazon Mechanical Turk, this means that requesters decide whether users' work is acceptable and reserve the right to withhold pay if it does not meet their standards.<ref name="Paolacci" /> Critics say that crowdsourcing arrangements exploit individuals in the crowd, and a call has been made for crowds to organize for their labor rights.<ref>{{Cite journal|last=Graham|first=Mark|last2=Hjorth|first2=Isis|last3=Lehdonvirta|first3=Vili|date=2017-05-01|title=Digital labour and development: impacts of global digital labour platforms and the gig economy on worker livelihoods|url=https://doi.org/10.1177/1024258916687250|journal=Transfer: European Review of Labour and Research|language=en|volume=23|issue=2|pages=135–162|doi=10.1177/1024258916687250|issn=1024-2589}}</ref><ref name = ics /><ref>[http://www.thebaffler.com/salvos/crowdsourcing-scam The Crowdsourcing Scam] (Dec. 2014), ''[[The Baffler]],'' No. 26</ref>

Line 388 ⟶ 389:

The popular forum website [[reddit]] came under the spotlight in the days after the [[Boston Marathon bombing]], demonstrating how powerful social media and crowdsourcing could be. Reddit users helped many victims of the bombing by sending relief, and some even opened their homes, all of which was coordinated efficiently on the site.

However, Reddit soon came under fire after its users began crowdsourcing information on the possible perpetrators of the bombing. While the FBI received thousands of photos from ordinary citizens, the site's users also launched a crowdsourced investigation of their own. Reddit members eventually claimed to have identified four bombers, all of whom were innocent, including a college student who had committed suicide a few days before the bombing. The problem was exacerbated when the media also started to rely on Reddit as a source of information,{{citation needed|date=June 2017}} allowing the misinformation to spread almost nationwide. The FBI has since warned the media to be more careful about where they get their information, but Reddit's investigation and its false accusations raised questions about what should be crowdsourced and about the unintended consequences of irresponsible crowdsourcing.

=== Trustworthiness and informed decision making ===

Concerns have also been raised about the crowdsourcing of idea evaluation. Since early ideas are often not fully defined objects,<ref>{{Cite journal|last=Hatchuel and Weil|first=|date=2009|title=C-K design theory: an advanced formulation|url=https://doi.org/10.1007/s00163-008-0043-4|journal=Res Eng Design|volume=19:181|pages=|via=}}</ref> the general public can easily misinterpret their intended meaning, which has been noted to decrease the perceived quality of the ideas.<ref>{{Cite journal|last=Sukhov|first=Alexandre|title=The role of perceived comprehension in idea evaluation|url=https://doi.org/10.1111/caim.12262|journal=Creativity and Innovation Management|language=en|doi=10.1111/caim.12262|issn=1467-8691}}</ref> Research has also highlighted cognitive biases in idea evaluation, such as the underappreciation of highly original ideas<ref>{{Cite journal|last=LICUANAN|first=BRIAN F.|last2=DAILEY|first2=LESLEY R.|last3=MUMFORD|first3=MICHAEL D.|date=2007-03-01|title=Idea evaluation: Error in evaluating highly original ideas|url=https://doi.org/10.1002/j.2162-6057.2007.tb01279.x|journal=The Journal of Creative Behavior|language=en|volume=41|issue=1|pages=1–27|doi=10.1002/j.2162-6057.2007.tb01279.x|issn=2162-6057}}</ref> and the fluency of information processing ("if it's hard to read, it's hard to do").<ref>{{Cite journal|last=Song and Schwarz|first=|date=2009|title=If It's Difficult to Pronounce, It Must Be Risky: Fluency, Familiarity, and Risk Perception|url=https://doi.org/10.1111/j.1467-9280.2009.02267.x|journal=Psychological Science|volume=20:2|pages=135–138|via=}}</ref>

==See also==