ERIC Number: EJ1187305
Record Type: Journal
Publication Date: 2018-Sep
Pages: 17
Abstractor: As Provided
ISBN: N/A
ISSN: ISSN-1098-2140
EISSN: N/A
Available Date: N/A
Using Crowdsourcing to Code Open-Ended Responses: A Mixed Methods Approach
Jacobson, Miriam R.; Whyte, Cristina E.; Azzam, Tarek
American Journal of Evaluation, v39 n3 p413-429 Sep 2018
Evaluators often work with brief units of text-based data, such as open-ended survey responses, text messages, and social media postings. Online crowdsourcing is a promising method for quantifying large amounts of text-based data by engaging hundreds of people to categorize the data. To further develop and test this method, individuals were recruited through online crowdsourcing to code open-ended survey responses using a predetermined list of thematic codes derived from those responses. The study compared the coding results obtained from online crowdsourcing with those obtained from researcher coders. Additionally, the study examined feedback from the crowdsourced coders about their experiences with the task. The results suggested that online crowdsourcing can produce results comparable to researcher coding, but that comparability may differ across codes. This method may increase the efficiency of quantifying text-based data and provide evaluators with valuable feedback on their coding schemes.
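The comparison the abstract describes can be made concrete with a small, hypothetical sketch (not code or data from the article): each response's crowd-worker codes are aggregated by majority vote, then compared against a researcher's codes using Cohen's kappa, a standard chance-corrected agreement statistic. All code labels and data below are invented for illustration.

    from collections import Counter

    def majority_vote(labels):
        # Most frequent code among the workers who coded this response.
        return Counter(labels).most_common(1)[0][0]

    def cohen_kappa(a, b):
        # Chance-corrected agreement between two equal-length label lists.
        n = len(a)
        observed = sum(x == y for x, y in zip(a, b)) / n
        counts_a, counts_b = Counter(a), Counter(b)
        expected = sum(counts_a[k] * counts_b[k] for k in set(a) | set(b)) / n ** 2
        return (observed - expected) / (1 - expected)

    # Hypothetical codes: three crowd workers per response vs. one researcher.
    crowd_codes = [
        ["barrier", "barrier", "benefit"],
        ["benefit", "benefit", "benefit"],
        ["other", "barrier", "barrier"],
    ]
    researcher_codes = ["barrier", "benefit", "other"]

    aggregated = [majority_vote(votes) for votes in crowd_codes]
    print(cohen_kappa(aggregated, researcher_codes))  # 0.5 for this toy data

Majority voting is only one way to aggregate crowd labels; per-code agreement (as the study's results suggest) can vary, so agreement statistics are usefully reported code by code as well as overall.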
Descriptors: Mixed Methods Research, Evaluation Methods, Comparative Analysis, Feedback (Response), Coding, Questioning Techniques, Collaborative Writing, Electronic Publishing, Data Collection, Online Surveys, Evaluators
SAGE Publications. 2455 Teller Road, Thousand Oaks, CA 91320. Tel: 800-818-7243; Tel: 805-499-9774; Fax: 800-583-2665; e-mail: journals@sagepub.com; Web site: http://sagepub.com
Publication Type: Journal Articles; Reports - Research
Education Level: N/A
Audience: N/A
Language: English
Sponsor: N/A
Authoring Institution: N/A
Grant or Contract Numbers: N/A
Author Affiliations: N/A