Assessing Crowdsourcing Quality through Objective Tasks

  • Citation key:
    DBLP:conf/lrec/AkerEAK12
  • Title:
    Assessing Crowdsourcing Quality through Objective Tasks
  • Author(s):
    Ahmet Aker
    Mahmoud El-Haj
    M-Dyaa Albakour
    Udo Kruschwitz
  • Publisher:
    European Language Resources Association (ELRA)
  • In:
    Proceedings of the Eighth International Conference on Language Resources and Evaluation, LREC 2012, Istanbul, Turkey, May 23-25, 2012
  • Page(s):
    1456–1461
  • Year:
    2012