Peer reviewed
ERIC Number: EJ1234082
Record Type: Journal
Publication Date: 2019-Dec
Pages: 15
Abstractor: As Provided
ISSN: ISSN-1098-2140
Crowdsourcing Qualitative Thematic Analysis
Hilton, Lara G.; Azzam, Tarek
American Journal of Evaluation, v40 n4 p575-589 Dec 2019
Evaluations that include stakeholders aim to understand their perspectives and to ensure that their views are represented. This article offers a new approach to gaining stakeholder perspectives through crowdsourcing. We recruited a sample of individuals with chronic low back pain through a crowdsourcing site. This sample coded textual data describing pain, generated themes, and gave feedback on constructs and procedures. The results generated by the crowdsourced participants were compared with results generated by experts. We found that crowdsourcing the development of codes and the coding of textual responses was feasible, rapid, and inexpensive, offering the potential to enhance patient stakeholder engagement in evaluation. Crowdsourcing has broad implications for evaluation science beyond the health sector.
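The comparison the abstract describes, crowdsourced coders versus expert coders assigning theme codes to the same text excerpts, is typically quantified with a chance-corrected agreement statistic such as Cohen's kappa. A minimal sketch (the theme labels and excerpt counts below are hypothetical, not taken from the study):

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed proportion of excerpts where the two raters agree
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected agreement if each rater assigned codes independently
    # according to their own marginal frequencies
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical theme codes assigned to the same ten pain-description excerpts
crowd  = ["pain", "coping", "pain", "work", "pain",
          "coping", "work", "pain", "coping", "pain"]
expert = ["pain", "coping", "pain", "work", "coping",
          "coping", "work", "pain", "pain", "pain"]
print(round(cohens_kappa(crowd, expert), 2))  # moderate-to-substantial agreement
```

Kappa above roughly 0.6 is conventionally read as substantial agreement; in a feasibility study like this one, it gives a single number for how closely the crowd's coding tracks the experts'.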
SAGE Publications. 2455 Teller Road, Thousand Oaks, CA 91320. Tel: 800-818-7243; Tel: 805-499-9774; Fax: 800-583-2665.
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: National Center for Complementary and Integrative Health (NCCIH) (DHHS/NIH)
Authoring Institution: N/A
Grant or Contract Numbers: 1R21AT00912401