Crowd vs. Expert: What can relevance judgment rationales teach us about assessor disagreement?
(ACM, 2018, Conference Paper)
© 2018 ACM. While crowdsourcing offers a low-cost, scalable way to collect relevance judgments, lack of transparency with remote crowd work has limited understanding about the quality of collected judgments. In prior work, ...
Par-eXpress: A tool for analysis of sequencing experiments with ambiguous assignment of fragments in parallel
(Institute of Electrical and Electronics Engineers Inc., 2017, Conference Paper)
With new high-throughput and low-cost sequencing technologies, an increasing amount of genetic data is becoming available to researchers. While the analysis of this vast amount of data has great potential for future ...
ArabicWeb16: A new crawl for today's Arabic Web
(Association for Computing Machinery, Inc, 2016, Conference Paper)
Web crawls provide valuable snapshots of the Web which enable a wide variety of research, be it distributional analysis to characterize Web properties or use of language, content analysis in social science, or Information ...
Overview of the CLEF–2023 CheckThat! Lab on Checkworthiness, Subjectivity, Political Bias, Factuality, and Authority of News Articles and Their Source
(Springer Science and Business Media Deutschland GmbH, 2023, Conference Paper)
We describe the sixth edition of the CheckThat! lab, part of the 2023 Conference and Labs of the Evaluation Forum (CLEF). The five previous editions of CheckThat! focused on the main tasks of the information verification ...
Overview of the CLEF–2021 CheckThat! Lab on Detecting Check-Worthy Claims, Previously Fact-Checked Claims, and Fake News
(Springer Science and Business Media Deutschland GmbH, 2021, Conference Paper)
We describe the fourth edition of the CheckThat! Lab, part of the 2021 Conference and Labs of the Evaluation Forum (CLEF). The lab evaluates technology supporting tasks related to factuality, and covers Arabic, Bulgarian, ...
Mix and match: Collaborative expert-crowd judging for building test collections accurately and affordably
(CEUR-WS, 2018, Conference Paper)
Crowdsourcing offers an affordable and scalable means to collect relevance judgments for information retrieval test collections. However, crowd assessors may show higher variance in judgment quality than trusted assessors. ...