YouTube is still suggesting videos containing violent content and misinformation, including COVID-19 misinformation, according to a major new study published this month. Notably, this isn't just an issue with ...
For years YouTube’s video-recommending algorithm has stood accused of fuelling a grab bag of societal ills by feeding users an AI-amplified diet of hate speech, political extremism and/or conspiracy ...
YouTube users have reported potentially objectionable content in thousands of videos recommended to them by the platform’s algorithm, according to the nonprofit Mozilla Foundation. The findings, ...
YouTube’s algorithm recommends videos that violate the company’s own policies on inappropriate content, according to a crowdsourced study. Not-for-profit company ...
Researchers found that clicking on YouTube’s filters didn’t stop it from recommending disturbing videos of war footage, scary movies, or Tucker Carlson’s face. My YouTube ...
In September 2020 the extension launched, ...
YouTube's recommendation algorithm repeatedly shows false information and inappropriate videos to its users, according to a study by Mozilla.
"If you randomly follow the algorithm, you probably would consume less radical content using YouTube as you typically do!" So says Manoel Ribeiro, co-author of a new paper on YouTube's recommendation ...
YouTube demystifies the Shorts algorithm in a Q&A video that addresses several of the most common questions creators have about gaining visibility with short-form content. We know at least some ...
At the first VidCon since 2019, YouTube’s top earner in the U.S., MrBeast, got on stage to talk to YouTube’s director of discovery Todd Beaupré about one of the biggest questions on creators’ minds: ...