Showing all 3 results
Peer reviewed
Jacobson, Miriam R.; Whyte, Cristina E.; Azzam, Tarek – American Journal of Evaluation, 2018
Evaluators can work with brief units of text-based data, such as open-ended survey responses, text messages, and social media postings. Online crowdsourcing is a promising method for quantifying large amounts of text-based data by engaging hundreds of people to categorize the data. To further develop and test this method, individuals were…
Descriptors: Mixed Methods Research, Evaluation Methods, Comparative Analysis, Feedback (Response)
Peer reviewed
PDF full text available on ERIC
Littlejohn, Allison; Hood, Nina – Information Research: An International Electronic Journal, 2018
Introduction: We report on the experiences of a group of people as they become Wikipedia editors. We test Benkler's (2002) theory that commons-based production processes accelerate the creation of capital, asking which knowledge production processes people engage in as they become editors. The analysis positions the development of editing…
Descriptors: Encyclopedias, Collaborative Writing, Electronic Publishing, Editing
Zhu, Shaojian – ProQuest LLC, 2014
Crowdsourcing is an emerging research area that has experienced rapid growth in the past few years. Although crowdsourcing has demonstrated its potential in numerous domains, several key challenges continue to hinder its application. One of the major challenges is quality control. How can crowdsourcing requesters effectively control the quality…
Descriptors: Electronic Publishing, Collaborative Writing, Quality Control, Models