Explore, Exploit, Explain: Personalizing Explainable Recommendations with Bandits - Spotify Research
The multi-armed bandit is an important framework for balancing exploration with exploitation in recommendation. Exploitation recommends content (e.g., products, movies, music playlists) with the highest predicted user engagement and has traditionally been the focus of recommender systems. Exploration recommends content with uncertain predicted user engagement in order to gather more information. The importance of...
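To make the explore/exploit trade-off concrete, here is a minimal epsilon-greedy bandit sketch. This is an illustrative simulation, not the method from the paper: the arm engagement rates, epsilon, and step count are all hypothetical. With probability epsilon the agent explores a random arm; otherwise it exploits the arm with the highest empirical mean engagement.

```python
import random

def epsilon_greedy(true_rates, epsilon=0.1, steps=5000, seed=0):
    """Simulate an epsilon-greedy bandit over Bernoulli arms.

    true_rates: hypothetical per-arm engagement probabilities.
    Returns pull counts and empirical mean rewards per arm.
    """
    rng = random.Random(seed)
    counts = [0] * len(true_rates)    # number of pulls per arm
    values = [0.0] * len(true_rates)  # empirical mean reward per arm
    for _ in range(steps):
        if rng.random() < epsilon:
            # Explore: pick a uniformly random arm.
            arm = rng.randrange(len(true_rates))
        else:
            # Exploit: pick the arm with the best empirical mean so far.
            arm = max(range(len(true_rates)), key=lambda a: values[a])
        reward = 1.0 if rng.random() < true_rates[arm] else 0.0
        counts[arm] += 1
        # Incremental update of the running mean for this arm.
        values[arm] += (reward - values[arm]) / counts[arm]
    return counts, values

counts, values = epsilon_greedy([0.2, 0.5, 0.7])
```

After enough steps, the arm with the highest true engagement rate typically accumulates the most pulls, while the epsilon fraction of exploratory pulls keeps the estimates for the other arms from going stale.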
https://research.atspotify.com/publications/explore-exploit-explain-personalizing-explainable-recommendations-with-bandits/