The viability of crowdsourcing for survey research

Tara S. Behrend, David J. Sharek, Adam W. Meade & Eric N. Wiebe

2011

Scholarcy highlights

  • Online contract labor portals have recently emerged as attractive alternatives to university participant pools for collecting survey data in behavioral research
  • We provide an overview of crowdsourcing, examine the demographic makeup of a crowdsourced sample, and systematically investigate the viability of crowdsourcing for providing quality data for survey research
  • Two samples were collected; the first was from a traditional psychology participant pool, and the second was from Mechanical Turk
  • Research question 1 concerned the demographic makeup of the crowdsourced sample, as compared with that of a university sample
  • We sought to identify whether crowdsourcing is a viable alternative to the use of university subject pools, which are often criticized for their homogeneous makeup
  • We administered a survey to samples drawn from both crowdsourcing and university participant pools to examine the quality of data gathered from each source, as well as to understand more about the crowdsourcing participants