Proceedings of the Institute for System Programming of the RAS

Proceedings of the Institute for System Programming of the RAS, 2018, Volume 30, Issue 2, Pages 215–250
DOI: https://doi.org/10.15514/ISPRAS-2018-30(2)-11
(Mi tisp316)
 

This article is cited in 4 scientific papers.

Active learning and crowdsourcing: a survey of annotation optimization methods

R. A. Gilyazev (a, b), D. U. Turdakov (c, b, d)

a Moscow Institute of Physics and Technology (State University)
b Ivannikov Institute for System Programming of the RAS
c Lomonosov Moscow State University
d National Research University Higher School of Economics (HSE)
Abstract: High-quality labeled corpora play a key role in building machine learning systems. Creating such corpora generally requires human effort, so the annotation process is expensive and time-consuming. Two approaches that optimize annotation are active learning and crowdsourcing. Active learning methods aim to find the most informative examples for the classifier: at each iteration, an algorithm selects one example from the unlabeled set, presents it to an oracle (expert) for labeling, and the classifier is retrained on the updated set of training examples. Crowdsourcing is widely used for problems that cannot be automated and require human effort. To get the most out of crowdsourcing platforms, three problems must be solved. The first is quality: algorithms are needed that best infer the true labels from those supplied by annotators. The second is cost: solving the quality problem simply by increasing the number of annotators per example is not always reasonable. The third is time: when a labeled corpus is needed quickly, the delays incurred while workers complete tasks must be minimized. This paper surveys existing methods based on these approaches, as well as techniques for combining them, and describes systems that help reduce the cost of annotation.
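The pool-based active-learning loop described in the abstract can be sketched as follows. This is a minimal illustration, not code from the paper: the 1-D logistic-regression classifier, the synthetic pool, and the `oracle` stand-in for the human expert are all assumptions made for the example; the query strategy shown is uncertainty sampling (pick the example whose predicted probability is closest to 0.5).

```python
# Minimal sketch of pool-based active learning with uncertainty sampling.
# All data and model choices here are illustrative, not from the survey.
import math
import random

def train(examples):
    """Fit a 1-D logistic regression (w, b) by plain gradient descent."""
    w, b = 0.0, 0.0
    for _ in range(500):
        gw = gb = 0.0
        for x, y in examples:
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            gw += (p - y) * x
            gb += (p - y)
        w -= 0.1 * gw / len(examples)
        b -= 0.1 * gb / len(examples)
    return w, b

def predict_proba(model, x):
    w, b = model
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

def oracle(x):
    # Stand-in for the human expert: the true label is 1 iff x >= 0.
    return 1 if x >= 0 else 0

random.seed(0)
pool = [random.uniform(-3, 3) for _ in range(100)]  # unlabeled pool
labeled = [(-2.0, 0), (2.0, 1)]                     # small seed set

for _ in range(10):  # annotation budget of 10 oracle queries
    model = train(labeled)
    # Query the example the classifier is least certain about,
    # i.e. the one with predicted probability closest to 0.5.
    x = min(pool, key=lambda x: abs(predict_proba(model, x) - 0.5))
    pool.remove(x)
    labeled.append((x, oracle(x)))  # label it; retrain on the next pass

model = train(labeled)
accuracy = sum((predict_proba(model, x) >= 0.5) == (oracle(x) == 1)
               for x in pool) / len(pool)
```

Because the queried examples cluster near the decision boundary, the classifier reaches high accuracy on the remaining pool with only a handful of labels, which is the cost saving the survey's active-learning methods target.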
Keywords: active learning, crowdsourcing, learning from crowds, annotation, ground truth inference.
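The simplest ground-truth inference baseline implied by the keywords, majority voting over redundant annotations, can be sketched as follows. The example data and function name are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of ground-truth inference by majority voting.
# The annotation data below is illustrative, not from the survey.
from collections import Counter

def majority_vote(annotations):
    """annotations: dict mapping example id -> list of annotator labels.
    Returns the most frequent label for each example."""
    return {ex: Counter(labels).most_common(1)[0][0]
            for ex, labels in annotations.items()}

votes = {
    "doc1": ["spam", "spam", "ham"],
    "doc2": ["ham", "ham", "ham"],
    "doc3": ["spam", "ham", "spam"],
}
inferred = majority_vote(votes)
```

Majority voting treats all annotators as equally reliable; the methods the survey covers (e.g. Dawid–Skene-style EM estimation) instead weight votes by an estimated per-annotator reliability, which matters when some workers are careless or adversarial.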
Document Type: Article
Language: Russian
Citation: R. A. Gilyazev, D. U. Turdakov, “Active learning and crowdsourcing: a survey of annotation optimization methods”, Proceedings of ISP RAS, 30:2 (2018), 215–250
Citation in format AMSBIB
\Bibitem{GilTur18}
\by R.~A.~Gilyazev, D.~U.~Turdakov
\paper Active learning and crowdsourcing: a survey of annotation optimization methods
\jour Proceedings of ISP RAS
\yr 2018
\vol 30
\issue 2
\pages 215--250
\mathnet{http://mi.mathnet.ru/tisp316}
\crossref{https://doi.org/10.15514/ISPRAS-2018-30(2)-11}
\elib{https://elibrary.ru/item.asp?id=32663716}
Linking options:
  • https://www.mathnet.ru/eng/tisp316
  • https://www.mathnet.ru/eng/tisp/v30/i2/p215