According to a new report, YouTube’s recommendation algorithm is directing kids to videos of guns and school shootings.
In 2021, Cristos Goodrow, YouTube’s vice president of engineering, said the company had “made it a top priority to make responsible recommendations,” insisting its algorithms don’t direct viewers to extremist content. However, according to the Campaign for Accountability’s (CfA) Tech Transparency Project, that is not the case.
The team created four accounts, posing as two 9-year-old boys and two 14-year-old boys, and built playlists for each consisting solely of gaming videos. For the younger accounts, those were Roblox, Lego Star Wars, and Five Nights at Freddy’s; the older accounts watched videos from games like Grand Theft Auto, Halo, and Red Dead Redemption.
The researchers then recorded and analyzed the videos recommended by YouTube’s algorithm. One 9-year-old account and one 14-year-old account watched the recommended videos, while the other two did not. They found that YouTube served content about shootings and weapons to all four accounts, but at a much higher volume to the two accounts that clicked on YouTube’s recommendations.
“It’s bad enough that YouTube makes videos glorifying gun violence accessible to children. We’ve now discovered that it recommends these videos to young people,” said Michelle Kuppersmith, executive director of the Campaign for Accountability. “Unfortunately, this is only the latest example of Big Tech’s algorithms feeding some of the worst content to kids in an endless pursuit of engagement.”
YouTube said in a statement that it has robust monitoring processes in place and that researchers’ activity may not accurately reflect real-life user behavior.
“We offer a number of options for younger viewers, including the standalone YouTube Kids app and supervised experience tools, designed to provide a safer experience for preteens and teens whose parents have decided they are ready to use the main YouTube app,” said a YouTube spokesperson.
“We welcome research into our recommendations, and we are exploring more opportunities to bring in academic researchers to study our systems. But in reviewing this report’s methodology, it’s hard for us to draw strong conclusions. For example, the study doesn’t specify the total number of videos recommended to the test accounts, nor does it explain how the accounts were set up, including whether YouTube’s supervised experience tools were applied.”
The recommended videos included footage of school shootings and other mass shootings, graphic demonstrations of how much damage firearms can do to the human body, and how-to guides for converting a handgun into a fully automatic weapon. The researchers say YouTube also repeatedly recommended an R-rated movie about the early life of serial killer Jeffrey Dahmer.
These videos were pushed far more heavily to the accounts that clicked on recommended videos, in some cases more than 10 times as often. The videos don’t appear to carry any age restrictions. In November, YouTube served 382 firearms-related videos to the 9-year-old account that clicked on recommendations, averaging more than 12 a day, while the corresponding teen account received 1,325. By contrast, the 14-year-old account that didn’t click on recommended content received 172 weapons videos.
“Violent video games have long been blamed for driving mass shootings in the United States, although there is no real evidence to support the link,” Kuppersmith said. “Yet YouTube’s algorithms seem intent on glorifying real-world gun use to boys as young as 9, at a time when mass shooters are getting younger and younger.”
Article translated from the American magazine Forbes. Author: Emma Woollacott.