
Author: Malhas, Rana
Author: Torki, Marwan
Author: Ali, Rahma
Author: Yulianti, Evi
Author: Elsayed, Tamer
Available date: 2024-11-05T06:05:21Z
Publication Date: 2016
Publication Name: 25th Text REtrieval Conference, TREC 2016 - Proceedings
Resource: Scopus
URI: http://hdl.handle.net/10576/60895
Abstract: Resorting to community question answering (CQA) websites for finding answers has gained momentum in recent years with the explosive rate at which social media has been proliferating. With many questions left unanswered on those websites, automatic question answering (QA) systems have emerged. A main objective of those systems is to harness the plethora of existing answered questions, thus transforming the problem into finding good answers to newly-posed questions from similar previously-answered ones, or composing a new concise answer from those potential answers. In this paper, we describe the real-time question answering system we developed to participate in the TREC 2016 LiveQA track. Our QA system is composed of three phases: answer retrieval from three different Web sources (Yahoo! Answers, Google Search, and Bing Search), answer ranking using learning-to-rank models, and summarization of the top-ranked answers. Official track results of our three submitted runs show that our runs significantly outperformed the average scores of all participating runs across the entire spectrum of official evaluation measures deployed by the track organizers this year.
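The three-phase pipeline described in the abstract (retrieval, learning-to-rank ranking, summarization) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names are hypothetical, the Web-source retrieval is stubbed with a static list, the ranker is replaced by a simple word-overlap score, and the summarizer is a naive concatenate-and-truncate step.

```python
# Hypothetical sketch of a three-phase real-time QA pipeline:
# (1) retrieve candidate answers, (2) rank them, (3) summarize the top ranks.
# All names, features, and limits below are illustrative assumptions.

def retrieve_candidates(question):
    # In the paper, candidates come from Yahoo! Answers, Google Search,
    # and Bing Search; here we stub the retrieval with a static list.
    return [
        "Drink plenty of water and rest to recover from a cold.",
        "Rest helps. Fluids help too.",
        "Buy our miracle cure now!!!",
    ]

def score(question, answer_text):
    # Stand-in for a learning-to-rank model: a single word-overlap feature.
    q_words = set(question.lower().split())
    a_words = set(answer_text.lower().split())
    return len(q_words & a_words) / (len(a_words) or 1)

def answer(question, top_k=2, max_len=120):
    candidates = retrieve_candidates(question)
    ranked = sorted(candidates, key=lambda a: score(question, a), reverse=True)
    # Naive summarization: concatenate the top-ranked answers, then truncate
    # to a length cap (a real system would produce a concise summary).
    return " ".join(ranked[:top_k])[:max_len]

print(answer("How do I recover from a cold?"))
```

In a real system, each phase would be far richer: retrieval issues live queries to the three Web sources, the ranker combines many features (including word-embedding similarities, per the paper's title) learned from training data, and summarization selects or composes a concise answer within the track's length limit.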
Sponsor: This work was made possible by NPRP grant # NPRP 6-1377-1-257 from the Qatar National Research Fund (a member of Qatar Foundation). The statements made herein are solely the responsibility of the authors. We thank Mossaab Bagdouri for providing the crawled database of Yahoo! Answers questions and their corresponding available answers.
Language: en
Publisher: National Institute of Standards and Technology (NIST)
Subject: learning to rank; LiveQA; real-time question answering
Title: Real, Live, and Concise: Answering Open-Domain Questions with Word Embedding and Summarization
Type: Conference Paper
dc.accessType: Abstract Only

