IEEE Transactions on Cybernetics | 2021

Deep Reinforcement Learning Framework for Category-Based Item Recommendation.


Abstract


Deep reinforcement learning (DRL)-based recommender systems have recently attracted considerable attention due to their ability to optimize long-term user engagement. A significant challenge in DRL-based recommender systems is the large action space required to represent a wide variety of items. The large action space weakens sampling efficiency and thereby degrades recommendation accuracy. In this article, we propose a DRL-based method called the deep hierarchical category-based recommender system (DHCRS) to handle the large action space problem. In DHCRS, item categories are used to restructure the original flat action space into a two-level category-item hierarchy. DHCRS uses two deep Q-networks (DQNs): 1) a high-level DQN that selects a category and 2) a low-level DQN that chooses an item from the selected category to recommend. Hence, the action space of each DQN is significantly reduced. Furthermore, the categorization of items helps capture users' preferences more effectively. We also propose a bidirectional category selection (BCS) technique, which explicitly considers category-item relationships. Experiments show that DHCRS significantly outperforms state-of-the-art methods in terms of hit rate and normalized discounted cumulative gain for long-term recommendations.
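As a rough, non-authoritative sketch of the two-level decomposition described in the abstract, the following Python (PyTorch) code pairs a high-level Q-network over categories with per-category low-level Q-networks over items. This is not the authors' implementation: the network sizes, the state encoding, the epsilon-greedy selection, and all names (QNetwork, recommend, category_items) are illustrative assumptions, and the bidirectional category selection (BCS) technique is not reflected here.

# Minimal sketch (assumed design, not the paper's code) of a two-level
# category-then-item action selection with two DQN heads.
import torch
import torch.nn as nn

class QNetwork(nn.Module):
    """Simple MLP mapping a state vector to Q-values over a set of actions."""
    def __init__(self, state_dim: int, n_actions: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_actions),
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        return self.net(state)

def recommend(state, high_q, low_qs, category_items, epsilon=0.1):
    """Pick a category with the high-level DQN, then an item with that
    category's low-level DQN (epsilon-greedy exploration is an assumption)."""
    with torch.no_grad():
        if torch.rand(1).item() < epsilon:
            category = int(torch.randint(len(category_items), (1,)).item())
        else:
            category = int(high_q(state).argmax(dim=-1).item())
        item_scores = low_qs[category](state)
        local_idx = int(item_scores.argmax(dim=-1).item())
    return category, category_items[category][local_idx]

# Toy usage: an 8-dimensional user state, 3 categories with 4 items each,
# so each DQN scores at most 4 actions instead of 12 flat item actions.
state_dim, n_categories, items_per_cat = 8, 3, 4
category_items = [[c * items_per_cat + i for i in range(items_per_cat)]
                  for c in range(n_categories)]
high_q = QNetwork(state_dim, n_categories)
low_qs = [QNetwork(state_dim, items_per_cat) for _ in range(n_categories)]
state = torch.zeros(1, state_dim)
print(recommend(state, high_q, low_qs, category_items))

The point of the sketch is only the reduced per-network action space: each DQN scores at most max(|categories|, items per category) actions rather than the full item catalog. The BCS technique mentioned in the abstract, which explicitly ties category and item selection together, is not shown.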

Volume PP
DOI 10.1109/TCYB.2021.3089941
Language English
Journal IEEE Transactions on Cybernetics

Full Text