YouTube’s algorithm still amplifies violent videos, hateful content and misinformation despite the company’s efforts to limit the reach of such videos, according to a study published this week.
YouTube is still suggesting videos containing violent content and misinformation, including COVID-19 falsehoods, according to a major new study published this month. Notably, this isn't just an issue with ...
For years YouTube’s video-recommending algorithm has stood accused of fuelling a grab bag of societal ills by feeding users an AI-amplified diet of hate speech, political extremism and conspiracy ...
"If you randomly follow the algorithm, you probably would consume less radical content using YouTube as you typically do!" So says Manoel Ribeiro, co-author of a new paper on YouTube's recommendation ...
Researchers found that clicking on YouTube’s filters didn’t stop it from recommending disturbing videos of war footage, scary movies, or Tucker Carlson’s face.
YouTube's recommendation algorithm repeatedly shows false information and inappropriate videos to its users, according to a study by Mozilla.
Viewership of crypto content on YouTube has declined to its lowest level since January 2021 following a sharp retreat over the past three months. On Sunday, ITC Crypto founder Benjamin Cowen shared a ...
If Nielsen stats are to be believed, we collectively spend more time in front of YouTube than any other streaming service—including Disney+ and Netflix. That's a lot of watch hours, especially for an ...
If you’ve noticed less crypto-related content in your feed, you’re correct. Crypto YouTube engagement has recently fallen to ...