Peer reviewed
ERIC Number: EJ1015389
Record Type: Journal
Publication Date: 2013-Sep
Pages: 13
Abstractor: As Provided
Reference Count: 28
ISBN: N/A
ISSN: 1098-2140
Finding a Comparison Group: Is Online Crowdsourcing a Viable Option?
Azzam, Tarek; Jacobson, Miriam R.
American Journal of Evaluation, v34 n3 p372-384 Sep 2013
This article explores the viability of online crowdsourcing for creating matched-comparison groups. This exploratory study compares survey results from a randomized control group with survey results from a matched-comparison group recruited through Amazon.com's MTurk crowdsourcing service to determine their comparability. Study findings indicate that online crowdsourcing, a process that provides access to many participants who complete specific tasks, is a potentially viable resource for evaluation designs where access to comparison groups, large budgets, and/or time is limited. The article highlights the strengths and limitations of the online crowdsourcing approach and describes ways it could potentially be used in evaluation practice. (Contains 3 tables and 1 note.)
SAGE Publications. 2455 Teller Road, Thousand Oaks, CA 91320. Tel: 800-818-7243; Tel: 805-499-9774; Fax: 800-583-2665; e-mail: journals@sagepub.com; Web site: http://sagepub.com
Publication Type: Reports - Research; Journal Articles
Education Level: Higher Education; Postsecondary Education
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A