TY - GEN
T1 - Open-Domain Conversational Search Assistant with Transformers
AU - Ferreira, Rafael
AU - Leite, Mariana
AU - Semedo, David
AU - Magalhães, João
N1 - Funding Information:
info:eu-repo/grantAgreement/FCT/6817 - DCRRNI ID/UID%2FCEC%2F04516%2F2013/PT#
info:eu-repo/grantAgreement/FCT/5665-PICT/CMUP-ERI%2FTIC%2F0046%2F2014/PT#
Acknowledgement: This work has been partially funded by the iFetch project, Ref. 45920.
Publisher Copyright:
© 2021, Springer Nature Switzerland AG.
PY - 2021
Y1 - 2021
N2 - Open-domain conversational search assistants aim to answer user questions about open topics in a conversational manner. In this paper, we show how the Transformer architecture [30] achieves state-of-the-art results in key IR tasks, enabling the creation of conversational assistants that engage in open-domain conversational search with single, yet informative, answers. In particular, we propose an open-domain abstractive conversational search agent pipeline to address two major challenges: first, conversation context-aware search, and second, abstractive search-answer generation. To address the first challenge, the conversation context is modeled with a query rewriting method that unfolds the context of the conversation up to a specific moment to search for the correct answers. These answers are then passed to a Transformer-based re-ranker to further improve retrieval performance. The second challenge is tackled with recent abstractive Transformer architectures that generate a digest of the top most relevant passages. Experiments show that Transformers deliver solid performance across all conversational search tasks, outperforming the best TREC CAsT 2019 baseline.
AB - Open-domain conversational search assistants aim to answer user questions about open topics in a conversational manner. In this paper, we show how the Transformer architecture [30] achieves state-of-the-art results in key IR tasks, enabling the creation of conversational assistants that engage in open-domain conversational search with single, yet informative, answers. In particular, we propose an open-domain abstractive conversational search agent pipeline to address two major challenges: first, conversation context-aware search, and second, abstractive search-answer generation. To address the first challenge, the conversation context is modeled with a query rewriting method that unfolds the context of the conversation up to a specific moment to search for the correct answers. These answers are then passed to a Transformer-based re-ranker to further improve retrieval performance. The second challenge is tackled with recent abstractive Transformer architectures that generate a digest of the top most relevant passages. Experiments show that Transformers deliver solid performance across all conversational search tasks, outperforming the best TREC CAsT 2019 baseline.
KW - Answer generation
KW - Conversational search
KW - Query rewriting
KW - Re-ranking
KW - Transformers
UR - http://www.scopus.com/inward/record.url?scp=85107385839&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-72113-8_9
DO - 10.1007/978-3-030-72113-8_9
M3 - Conference contribution
AN - SCOPUS:85107385839
SN - 978-3-030-72112-1
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 130
EP - 145
BT - Advances in Information Retrieval - 43rd European Conference on IR Research, ECIR 2021, Proceedings
A2 - Hiemstra, Djoerd
A2 - Moens, Marie-Francine
A2 - Mothe, Josiane
A2 - Perego, Raffaele
A2 - Potthast, Martin
A2 - Sebastiani, Fabrizio
PB - Springer
CY - Cham
T2 - 43rd European Conference on Information Retrieval Research, ECIR 2021
Y2 - 28 March 2021 through 1 April 2021
ER -