Have you ever wondered how YouTube works? How does it play your favorite music without you having to put it on?
Numerous systems that use Artificial Intelligence (AI) shape most of the information that circulates on the web. Platforms observe user behavior and run algorithms designed to suggest content that might interest each user.
YouTube first builds a list of recommendations containing hundreds of videos related to the one the user is watching, and then refines that list using the user's clicks, tastes, and other interactions, with the aim of keeping them in front of the screen for longer.
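To make this two-stage idea concrete, here is a minimal sketch of such a pipeline. It is not YouTube's actual system; the similarity function, the history format, and the scoring weights are all invented for illustration.

```python
# Illustrative sketch only, not YouTube's actual system; the similarity
# function, history format, and scoring weights are all invented.

def generate_candidates(current_video, catalog, similarity, k=200):
    """Stage 1: shortlist the k videos most related to the one being watched."""
    scored = sorted(
        ((v, similarity(current_video, v)) for v in catalog if v != current_video),
        key=lambda pair: pair[1],
        reverse=True,
    )
    return [v for v, _ in scored[:k]]

def refine(candidates, topic_of, user_history):
    """Stage 2: reorder the shortlist using the user's past clicks and likes
    per topic, so every interaction keeps reshaping what is recommended."""
    def engagement(video):
        stats = user_history.get(topic_of[video], {"clicks": 0, "likes": 0})
        return stats["clicks"] + 2 * stats["likes"]  # invented weighting
    return sorted(candidates, key=engagement, reverse=True)
```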
In a study published in the Proceedings of the 13th ACM Conference on Recommender Systems, researchers gave more weight to videos clicked near the bottom of the list, on the understanding that if a user clicked one of those videos, it was because they had spent time scrolling down to find it. Thanks to this modification of the algorithm, they achieved substantial improvements on the platform in both engagement and satisfaction metrics.
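The intuition can be sketched as weighting a click more heavily the further down the list it occurred, since reaching it required deliberate scrolling. The logarithmic form below is an invented stand-in, not the method actually used in the study.

```python
import math

def click_weight(position):
    """Weight for a click at a given 0-based list position: clicks deeper in
    the list count for more, because the user scrolled to find that video.
    The 1 + log(1 + position) form is an invented stand-in, not the study's
    actual correction for position bias."""
    return 1.0 + math.log1p(position)

print(round(click_weight(0), 2))   # 1.0 -> a click at the very top is a weak signal
print(round(click_weight(19), 2))  # 4.0 -> a click at position 20 counts ~4x as much
```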
Echo chambers: YouTube's black hole
This brings us to one of the major problems generated by algorithms like the one described: since the system is optimized to keep users watching videos, it tends to offer recommendations that reinforce their existing tastes. This creates an experience that excludes other opinions and fuels what are called echo chambers, which can narrow a user's exposure to content and ultimately change their worldview.
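A toy simulation can make this feedback loop concrete. Everything below is invented (topics, weights, reinforcement step); the point is only that when a recommender serves content in proportion to current affinity, and watching raises affinity, the mix of what gets shown typically narrows over time.

```python
import random
from collections import Counter

random.seed(0)
topics = ["news", "music", "science", "gaming", "sports"]
affinity = {t: 1.0 for t in topics}  # the user starts with no preference

shown = []
for _ in range(300):
    # Serve topics in proportion to current affinity: a crude stand-in for
    # "optimize for whatever the user already watches".
    pick = random.choices(topics, weights=[affinity[t] for t in topics])[0]
    shown.append(pick)
    affinity[pick] += 1.0  # each watch reinforces the preference

print("first 20 recommendations:", Counter(shown[:20]))
print("last 20 recommendations: ", Counter(shown[-20:]))
# Typically the tail is dominated by one or two topics: an echo chamber.
```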
Pablo Castells, a tenured professor at the Higher Polytechnic School of the Autonomous University of Madrid, observes that the goals of users and companies are not always aligned: the company needs the user to be happy, but in a way that is profitable. That happens when the user stays connected longer, yet, according to Castells, the algorithm cannot tell when the user is happy and when they have slipped into a compulsive mode.
The algorithm has also been questioned when it comes to children's content. According to an arXiv study, "there is a 45% chance that a young child who follows YouTube recommendations will find an inappropriate video in less than 10 clicks." The problem, according to the study, is that some adult videos reuse material from children's videos and the algorithm does not tell them apart.
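The kind of measurement the study reports can be sketched as a random walk over a recommendation graph. The tiny graph and labels below are invented for illustration and are not the study's data.

```python
import random

def walk_hits_inappropriate(graph, inappropriate, start, max_clicks=10, rng=random):
    """Follow random recommendations from `start`; report whether an
    inappropriate video is reached within `max_clicks` clicks."""
    node = start
    for _ in range(max_clicks):
        node = rng.choice(graph[node])
        if node in inappropriate:
            return True
    return False

def estimate_probability(graph, inappropriate, start, trials=10_000):
    rng = random.Random(42)
    hits = sum(walk_hits_inappropriate(graph, inappropriate, start, rng=rng)
               for _ in range(trials))
    return hits / trials

# Invented toy graph: keys are videos, values are their recommendation lists.
graph = {
    "kids_song":   ["cartoon", "toy_review", "kids_song"],
    "cartoon":     ["kids_song", "weird_remix"],
    "toy_review":  ["cartoon", "kids_song"],
    "weird_remix": ["weird_remix", "cartoon"],
}
inappropriate = {"weird_remix"}
print(estimate_probability(graph, inappropriate, "kids_song"))
```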
Castells argues that this difficulty could be addressed by identifying the types of content and labeling some of it as inappropriate, which plunges us into the ethical debate in which the platforms are currently immersed.