Overview of the CLEF–2023 CheckThat! Lab on Checkworthiness, Subjectivity, Political Bias, Factuality, and Authority of News Articles and Their Source
Date
2023-09-11
Author
Barrón-Cedeño, Alberto
Alam, Firoj
Galassi, Andrea
Da San Martino, Giovanni
Nakov, Preslav
Elsayed, Tamer
Azizov, Dilshod
Caselli, Tommaso
Cheema, Gullal S.
Haouari, Fatima
Hasanain, Maram
Kutlu, Mucahid
Li, Chengkai
Ruggeri, Federico
Struß, Julia Maria
Zaghouani, Wajdi
Abstract
We describe the sixth edition of the CheckThat! lab, part of the 2023 Conference and Labs of the Evaluation Forum (CLEF). The five previous editions of CheckThat! focused on the main tasks of the information verification pipeline: check-worthiness, verifying whether a claim was fact-checked before, supporting evidence retrieval, and claim verification. In this sixth edition, we zoom in on some new problems and, for the first time, offer five tasks in seven languages: Arabic, Dutch, English, German, Italian, Spanish, and Turkish. Task 1 asks to determine whether an item (text, or text plus an image) is check-worthy. Task 2 aims to predict whether a sentence from a news article is subjective or not. Task 3 asks to assess the political bias of news at the article and at the media outlet level. Task 4 focuses on the factuality of reporting of news media. Finally, Task 5 looks at identifying authorities on Twitter that could help verify a given target claim. For the second year in a row, CheckThat! was the most popular lab at CLEF-2023 in terms of team registrations: 127 teams registered, and about one-third of them (37 in total) actually participated.
Collections
- Computer Science & Engineering [2402 items]