Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval | 2019

Developing Evaluation Metrics for Instant Search Using Mixed Methods


Abstract


Instant search has become a popular search paradigm in which users are shown a new result page in response to every keystroke. Over recent years, the paradigm has been widely adopted in several domains, including personal email search, e-commerce, and music search. However, the evaluation of such systems, and the metrics used to do so, have been less explored in the literature thus far. In this work, we describe a mixed methods approach to understanding user expectations and evaluating an instant search system in the context of music search. Our methodology involves conducting a set of user interviews to gain a qualitative understanding of users' behaviors and expectations. The hypotheses from user research are then extended and verified by a large-scale quantitative analysis of interaction logs. Using music search as a lens, we show that researchers and practitioners can interpret behavior logs more effectively when accompanied by insights from qualitative research. Further, we show that user research eliminates the guesswork involved in identifying user signals that estimate user satisfaction. Finally, we demonstrate that the metrics identified using our approach are more sensitive than the commonly used click-through rate metric for instant search.
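For context on the baseline mentioned above, the following is a minimal sketch of how a click-through rate over keystroke-triggered result pages might be computed from interaction logs. The log schema, the field names (prefix, results_shown, clicked), and the instant_search_ctr helper are illustrative assumptions; they are not taken from the authors' system, whose full text is not reproduced here.

def instant_search_ctr(log_events):
    """Click-through rate per keystroke-triggered result page.

    Each event is a dict recording the query prefix at that keystroke,
    whether a result page was shown, and whether a result on it was clicked.
    """
    shown, clicked = 0, 0
    for event in log_events:
        if event.get("results_shown"):
            shown += 1
            if event.get("clicked"):
                clicked += 1
    return clicked / shown if shown else 0.0

# Example: a three-keystroke session with a click on the final result page.
session = [
    {"prefix": "b", "results_shown": True, "clicked": False},
    {"prefix": "be", "results_shown": True, "clicked": False},
    {"prefix": "bey", "results_shown": True, "clicked": True},
]
print(instant_search_ctr(session))  # 1 click over 3 impressions = 0.33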

DOI 10.1145/3331184.3331293
Language English
Journal Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval

Full Text