University of Exeter

How clumpy is my image? Scoring in crowdsourced annotation tasks

Journal contribution
Posted on 2025-07-31, 23:03, authored by H Hutt, R Everson, M Grant, J Love, G Littlejohn
The use of citizen science to obtain annotations from multiple annotators has been shown to be an effective method for annotating datasets for which computational methods alone are not feasible. The way in which the annotations are obtained is an important consideration that affects the quality of the resulting consensus annotation. In this paper, we examine three separate approaches to obtaining consensus scores for instances, rather than merely binary classifications. To obtain a consensus score, annotators were asked to make annotations in one of three paradigms: classification, scoring and ranking. A web-based citizen science experiment is described that implements the three approaches as crowdsourced annotation tasks. The tasks are evaluated in terms of accuracy and agreement among the participants, using both simulated and real-world data from the experiment. The results show a clear difference in performance between the three tasks, with the ranking task obtaining the highest accuracy and agreement among the participants. We show how a simple evolutionary optimiser may be used to improve performance by reweighting the importance of annotators.
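The annotator-reweighting idea mentioned in the final sentence of the abstract can be sketched in a few lines of code. The Python snippet below is an illustrative sketch only, not the authors' implementation: it assumes a weighted-mean consensus over per-annotator scores, a small gold-standard subset as the fitness signal, and a simple (1+1) evolution strategy over the weight vector; all of these choices are assumptions made for illustration.

```python
import numpy as np

def consensus(scores, weights):
    """Weighted-mean consensus score per instance.
    scores: (n_annotators, n_instances) array; weights: (n_annotators,)."""
    w = weights / weights.sum()
    return w @ scores

def fitness(weights, scores, gold):
    """Negative mean absolute error against a gold-standard subset
    (an assumed fitness signal, chosen here purely for illustration)."""
    return -np.mean(np.abs(consensus(scores, weights) - gold))

def evolve_weights(scores, gold, generations=500, sigma=0.05, seed=None):
    """(1+1) evolution strategy: perturb the weight vector with Gaussian
    noise and keep the mutant only if its fitness improves."""
    rng = np.random.default_rng(seed)
    weights = np.ones(scores.shape[0])        # start with equal importance
    best = fitness(weights, scores, gold)
    for _ in range(generations):
        candidate = np.clip(weights + rng.normal(0.0, sigma, weights.shape), 0.0, None)
        if candidate.sum() == 0.0:            # degenerate mutant, skip it
            continue
        f = fitness(candidate, scores, gold)
        if f > best:
            weights, best = candidate, f
    return weights / weights.sum()

# Toy usage: four reliable annotators plus one purely random one score 20 instances.
rng = np.random.default_rng(0)
true_scores = rng.uniform(0.0, 1.0, 20)
scores = np.vstack([true_scores + rng.normal(0.0, 0.05, 20) for _ in range(4)]
                   + [rng.uniform(0.0, 1.0, 20)])
weights = evolve_weights(scores[:, :10], true_scores[:10], seed=1)
print(weights)  # the random annotator should receive a much lower weight
```

In this toy setup the deliberately random fifth annotator should end up with a noticeably lower weight than the four reliable ones, so the reweighted consensus tracks the underlying scores more closely than a plain average.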

Rights

© 2014, Springer-Verlag Berlin Heidelberg.

Notes

There is another record in ORE for this publication: http://hdl.handle.net/10871/15617. This is the author accepted manuscript. The final version is available from Springer via the DOI in this record.

Journal

Soft Computing

Publisher

Springer

Version

  • Accepted Manuscript

Language

en

First Compliant Deposit (FCD) date

2018-12-04T15:02:56Z

First Open Access (FOA) date

2018-12-04T15:05:42Z

Citation

Vol. 19 (6), pp. 1541-1552

Department

  • Computer Science
  • Archive
