Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval | 2019

USEing Transfer Learning in Retrieval of Statistical Data

 
 
 

Abstract


DSSM-like models have shown good results in retrieving short documents that semantically match a query. However, these models require large collections of click-through data, which are not available in some domains. On the other hand, recent advances in NLP have demonstrated that language models, and models trained on one set of tasks, can be fine-tuned to achieve state-of-the-art results on a multitude of other tasks, or to obtain competitive results with much smaller training sets. Following this trend, we combined a DSSM-like architecture with the USE (Universal Sentence Encoder) and BERT (Bidirectional Encoder Representations from Transformers) models so that they can be fine-tuned on a small amount of click-through data and used for information retrieval. This approach allowed us to significantly improve our search engine for statistical data.
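The abstract's core idea — a DSSM-like two-tower model where both towers reuse a frozen pretrained sentence encoder, with only a small head fine-tuned on click-through data — can be sketched as follows. This is a minimal illustration, not the paper's implementation: `pretrained_encode` is a hypothetical stand-in for USE/BERT (a deterministic toy featurizer), and `TwoTowerScorer`, its projection `W`, and the softmax temperature `gamma` are assumptions for illustration.

```python
import numpy as np
import zlib

DIM = 16

def pretrained_encode(text, dim=DIM):
    # Hypothetical stand-in for a frozen USE/BERT encoder: maps a text to a
    # deterministic pseudo-random unit vector. The real system would call the
    # actual pretrained model here.
    rng = np.random.default_rng(zlib.crc32(text.encode("utf-8")))
    v = rng.standard_normal(dim)
    return v / np.linalg.norm(v)

class TwoTowerScorer:
    """DSSM-like scorer: a small trainable projection on top of the frozen
    pretrained encoder, with query/document relevance given by cosine
    similarity and a softmax over in-batch candidates."""

    def __init__(self, dim=DIM, seed=0):
        rng = np.random.default_rng(seed)
        # The only trainable parameters: a projection fine-tuned on the
        # small click-through collection (training loop omitted).
        self.W = rng.standard_normal((dim, dim)) * 0.1

    def embed(self, text):
        h = np.tanh(self.W @ pretrained_encode(text))
        return h / np.linalg.norm(h)

    def score(self, query, docs, gamma=10.0):
        # Cosine similarity between query and each candidate document,
        # turned into a distribution by a temperature-scaled softmax,
        # as in DSSM-style training objectives.
        q = self.embed(query)
        sims = np.array([q @ self.embed(d) for d in docs])
        exp = np.exp(gamma * (sims - sims.max()))
        return exp / exp.sum()
```

Because the encoder is frozen and only `W` is learned, the number of trainable parameters stays small, which is what makes fine-tuning on a limited click-through collection feasible.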

DOI 10.1145/3331184.3331427
Language English
