ContraPoints Host Says YouTube Algorithm Isn't Sophisticated Enough To Combat Extremist Content
Conspiracy ContraPoints YouTube The host of the prominent left-leaning series “ContraPoints” expressed doubt on Tuesday over whether the platform will ever be able to effectively address extremist content. YouTube has recently had to answer for its algorithm pushing some users toward right-wing extremism; now, a growing number of creators are making videos to help stop the radicalization process.
Chatting With r/ContraPoints YouTube YouTube faces backlash for promoting a pro-fascism video, raising alarms about its recommendation algorithms and the normalization of extremist ideologies online. In recent years, there has been a popular narrative in the media that videos from highly partisan, conspiracy-theory-driven channels radicalize young Americans and that YouTube's recommendation algorithm leads users down a path of increasingly radical content. Not only is YouTube's algorithm failing to combat extremist content, it is recommending more extremist content than Gab, an alt-right platform: YouTube's recommendation algorithm was the worst at recommending extremist content out of three popular websites.
How The YouTube Algorithm Works YouTube YouTube's algorithm is often accused of putting users in filter bubbles and generating rabbit holes of radicalization; however, evidence on these issues is inconclusive. YouTube's algorithm recommends right-wing, extremist videos to users — even if they haven't interacted with that content before — a recent study found. Guillaume Chaslot, a former YouTube engineer, reveals how YouTube's recommendation algorithm drives users toward more extreme and divisive content by optimizing for watch time, often at the expense of morality and truth.