Abstract: Artificial Intelligence (AI) has witnessed remarkable advancements in recent years, catalyzing
transformative changes across various domains. Within this landscape, recommendation
systems have emerged as a cornerstone of AI, revolutionizing the way machines
understand and predict user preferences.
This report introduces a novel approach to modeling recommendation systems using Deep
Reinforcement Learning (DRL). Traditional recommendation systems, such as collaborative
filtering and content-based filtering, suffer from well-known limitations: the cold-start
problem, sparse user-item interactions, and limited adaptability to evolving user preferences. To address
these challenges, this study leverages foundational principles of DRL to develop an
innovative User-Movie Embedding model, integrated into a reinforcement learning setup
using an Actor-Critic approach.
The report details the offline environment, agent architecture, and training process, showcasing
how the Actor-Critic algorithm, combined with the User-Movie Embedding model,
can significantly enhance recommendation performance. Through comprehensive experiments
and analysis, the study demonstrates the advantages of this approach in terms of
adaptability and long-term user satisfaction.
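To make the described setup concrete, the following is a minimal PyTorch sketch of how a user-movie embedding could supply states to an Actor-Critic agent. It is an illustrative assumption, not the report's actual implementation: the class names (`UserMovieEmbedding`, `Actor`, `Critic`), layer sizes, the mean-pooled recent-history state, and the similarity-based ranking step are all hypothetical choices made for this sketch.

```python
import torch
import torch.nn as nn

class UserMovieEmbedding(nn.Module):
    """Joint embedding of users and movies in a shared latent space (assumed design)."""
    def __init__(self, num_users, num_movies, dim=64):
        super().__init__()
        self.user_emb = nn.Embedding(num_users, dim)
        self.movie_emb = nn.Embedding(num_movies, dim)

    def state(self, user_ids, recent_movie_ids):
        # State = user vector concatenated with the mean of recently watched movies.
        u = self.user_emb(user_ids)
        m = self.movie_emb(recent_movie_ids).mean(dim=1)
        return torch.cat([u, m], dim=-1)

class Actor(nn.Module):
    """Maps a state to a proto-recommendation vector; nearby movies are recommended."""
    def __init__(self, state_dim, action_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 128), nn.ReLU(),
            nn.Linear(128, action_dim), nn.Tanh(),
        )

    def forward(self, state):
        return self.net(state)

class Critic(nn.Module):
    """Estimates the Q-value of a (state, action) pair."""
    def __init__(self, state_dim, action_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + action_dim, 128), nn.ReLU(),
            nn.Linear(128, 1),
        )

    def forward(self, state, action):
        return self.net(torch.cat([state, action], dim=-1))

# Illustrative usage with made-up sizes.
emb = UserMovieEmbedding(num_users=1000, num_movies=5000, dim=64)
actor = Actor(state_dim=128, action_dim=64)
critic = Critic(state_dim=128, action_dim=64)

user_ids = torch.tensor([0, 1])
recent = torch.tensor([[10, 42, 7], [3, 99, 5]])   # last three movies per user
state = emb.state(user_ids, recent)                # shape (2, 128)
action = actor(state)                              # proto-recommendation vector
q_value = critic(state, action)                    # critic's value estimate

# Recommend by scoring every movie against the action vector.
scores = action @ emb.movie_emb.weight.T           # shape (2, num_movies)
top_movies = scores.topk(k=10, dim=-1).indices
```

In such a setup, the actor would typically be trained to maximize the critic's Q-value while the critic regresses toward observed rewards (e.g., ratings or clicks); the exact losses and training loop used in the report are detailed in its later sections.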