A workload-dependent task assignment policy for crowdsourcing
Ilio Catallo, Stefano Coniglio, Piero Fraternali, Davide Martinenghi
2017-01-01
Abstract
Crowdsourcing marketplaces have emerged as an effective tool for high-speed, low-cost labeling of massive data sets. Since labeling accuracy can vary greatly from worker to worker, we face the problem of assigning labeling tasks to workers so as to maximize the accuracy of their answers. In this work, we study the problem of assigning workers to tasks under the assumption that a worker's reliability can change with their workload, as a result of, e.g., fatigue or learning. We offer empirical evidence of the existence of workload-dependent accuracy variation among workers, and propose solution procedures for our Crowdsourced Labeling Task Assignment Problem, which we validate on both synthetic and real data sets.
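To make the problem concrete, the sketch below shows one simple way such an assignment could be computed; it is not the paper's actual procedure, only a greedy heuristic under assumed worker accuracy curves. All identifiers (`accuracy_curve`, `capacity`, the example curves) are hypothetical.

```python
# A minimal sketch (not the paper's algorithm) of workload-dependent task
# assignment: each worker's labeling accuracy is a function of how many tasks
# they have already received (e.g., decaying with fatigue or rising with
# learning). Each task goes to whichever worker offers the highest accuracy
# at their next workload level. All names here are illustrative assumptions.

from typing import Callable, Dict, List

def assign_tasks(
    num_tasks: int,
    workers: List[str],
    accuracy_curve: Dict[str, Callable[[int], float]],  # worker -> f(workload)
    capacity: Dict[str, int],                            # max tasks per worker
) -> Dict[str, List[int]]:
    """Greedily assign each task to the worker whose accuracy on their
    next task (i.e., at workload + 1) is currently the highest."""
    workload = {w: 0 for w in workers}
    assignment: Dict[str, List[int]] = {w: [] for w in workers}
    for task in range(num_tasks):
        candidates = [w for w in workers if workload[w] < capacity[w]]
        if not candidates:
            raise ValueError("total worker capacity exceeded")
        best = max(candidates, key=lambda w: accuracy_curve[w](workload[w] + 1))
        assignment[best].append(task)
        workload[best] += 1
    return assignment

if __name__ == "__main__":
    # Worker A fatigues (accuracy decays with workload); worker B learns.
    curves = {
        "A": lambda k: 0.95 * (0.97 ** k),
        "B": lambda k: min(0.9, 0.6 + 0.05 * k),
    }
    print(assign_tasks(10, ["A", "B"], curves, {"A": 6, "B": 6}))
```

A greedy rule like this is myopic: because accuracies shift as workloads grow, an assignment that is locally best per task need not maximize overall accuracy, which is why the problem calls for dedicated solution procedures.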
File | Access | Description | Size | Format
---|---|---|---|---
WWWJ2017-CatalloConiglioFraternaliMartinenghi.pdf | Restricted access | Publisher's version | 2.02 MB | Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.