Understanding radicalization pathways: a framework for assessing diversity in YouTube recommendation systems
Berjawi, O.; Cavaliere, D.; Fenza, G.; Loia, V.
2024-01-01
Abstract
Social media platforms, particularly YouTube, have emerged as pivotal arenas for the exchange of opinions and ideologies, presenting both opportunities and challenges. Among these challenges is the spread of extremist content facilitated by recommender systems. Understanding the role that recommendation systems play in radicalization paths on YouTube is essential for mitigating the spread of such content. This study addresses the challenge of identifying and understanding the level of radicalization in recommended videos by introducing an indicator that reflects various aspects of video attributes and user behaviours. It proposes a pipeline that combines explainable artificial intelligence techniques with prediction modelling to evaluate the potential of the introduced indicator for determining diversity within recommendation lists of up-next videos. Experimental results show the influence of video attributes on the selection of radicalized up-next videos, which often leads to the formation of radicalization paths in recommendation networks. Further experiments demonstrate the approach's potential in predicting diversity levels within recommendation lists by combining the identified factors with users' watching history. The results show that the proposed approach not only supports strategies for mitigating radicalization but also provides valuable indicators for detecting its presence within recommendation systems.
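The abstract refers to determining diversity within recommendation lists of up-next videos, but does not specify the measure used. The following is a minimal illustrative sketch, not the authors' method: it assumes each recommended video is represented by a feature vector derived from its attributes, and scores a list by its average pairwise cosine dissimilarity (intra-list diversity). All names and the feature encoding are hypothetical.

```python
# Illustrative sketch only: the paper's actual indicator and diversity
# measure are not given in the abstract. Here each up-next video is a
# feature vector, and a list's diversity is the mean pairwise
# (1 - cosine similarity) over its items.
from itertools import combinations
import numpy as np

def intra_list_diversity(feature_vectors: np.ndarray) -> float:
    """Average pairwise cosine dissimilarity over an up-next list."""
    if len(feature_vectors) < 2:
        return 0.0
    norms = np.linalg.norm(feature_vectors, axis=1, keepdims=True)
    unit = feature_vectors / np.clip(norms, 1e-12, None)
    dissims = [1.0 - float(unit[i] @ unit[j])
               for i, j in combinations(range(len(unit)), 2)]
    return float(np.mean(dissims))

# Example: a hypothetical list of five recommended videos, each encoded
# as a 4-dimensional attribute vector.
recommendations = np.array([
    [0.9, 0.1, 0.0, 0.2],
    [0.8, 0.2, 0.1, 0.1],
    [0.1, 0.9, 0.3, 0.0],
    [0.2, 0.7, 0.5, 0.1],
    [0.0, 0.1, 0.9, 0.8],
])
print(f"Intra-list diversity: {intra_list_diversity(recommendations):.3f}")
```

A low score under such a measure would indicate a homogeneous up-next list, which in the paper's framing is the kind of signal that could accompany a radicalization path in the recommendation network.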