YouTube Video Recommendations Lead to More Extremist Content for Right-Leaning Users, Researchers Suggest::New research suggests that a person’s ideological leaning may affect which videos YouTube’s algorithms recommend to them. For right-leaning users, video recommendations are more likely to come from channels that share political extremism, conspiracy theories, and other problematic content.
I’m not sure it’s just right-leaning users. I’m pretty far to the left and I keep getting anti-trans, anti-COVID right-wing talking points quite frequently. I keep pressing thumbs down, but they keep coming.
What YouTube sees:
“These videos keep eliciting reactions from users, which means that they prefer to engage with this content. This bodes well for our advertisers.”
Exactly. If you don’t want to see them, best thing is to ignore them.
I actually report every single one of them, usually the channel too, for hate speech. That seems to keep them out of my feed.
Thumbs down actually makes YouTube recommend more stuff to you, because you’ve engaged with the content. As others have said, the best way to avoid recommendations you don’t want is to just ignore the content and not watch it; if it’s being recommended to you, click the three dots on the video thumbnail and choose “not interested”/“don’t recommend channel”.
They’re supposed to enrage you so you use their platform longer, hate-share the videos so others use their platform, etc. They know what they’re doing.
Which is why I don’t share it, and I downvote it.
Clearly what I need to do next is immediately kill the tab or close the app.
I have definitely been hitting that “don’t recommend this channel to me” button noticeably more frequently lately…
I’d be really curious to know why this seems to be happening to so many people but not me. I’m a hardcore YouTube addict, but there’s zero politics in my feed. I even follow many right-wing gun tubers, watch plenty of police bodycam footage, and occasionally might even view one or two videos from people like Jordan Peterson, Joe Rogan, and Ben Shapiro, but even after that I might only get a few more recommendations for their videos, and once I ignore them they stop showing up. The only videos YouTube seems to be trying to force-feed me are game streamers I’ve never heard of, and judging by the view count on their videos, neither has anyone else.
You might be in a different test group. They always have a few different groups with different settings to check how well the algorithm is meeting their goals. That’s how they know they make less money if they don’t radicalize people.
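For what it’s worth, hash-based bucketing is the standard way platforms assign users to experiment groups. Here’s a toy Python sketch of the general technique; the experiment name and group labels are invented, and none of this is YouTube’s actual code:

```python
import hashlib

# Hypothetical experiment setup: the name and labels are invented for illustration.
EXPERIMENT = "recsys_ranking_v2"
BUCKETS = ["control", "variant_a", "variant_b"]

def assign_bucket(user_id: str) -> str:
    """Deterministically map a user to an experiment bucket.

    Hashing (experiment, user) together means the same user always lands
    in the same bucket for this experiment, while assignments stay
    independent across different experiments.
    """
    digest = hashlib.sha256(f"{EXPERIMENT}:{user_id}".encode()).hexdigest()
    return BUCKETS[int(digest, 16) % len(BUCKETS)]

print(assign_bucket("user123"))  # stable across calls, e.g. "variant_a"
```

The point is that you can’t tell which bucket you’re in from the outside, which would explain two heavy users seeing completely different feeds.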
Yeah, I’ve gotten similar results too. FWIW, I don’t think downvoting is a good way to change your results. It seems to key into any interaction at all, and also watch time. As soon as I see certain people, I just swipe away immediately.
As tempted as I am to reply “Well, duh,” I suppose it’s good that we’re getting research to back up what we already knew.
Yep. Every time I open YouTube, before signing in, much of the front page is just far-right conspiracies, blatant misinformation, and other sketchy content.
Mine’s all cycling, music videos, movie trailers, and lofi music. What are we doing differently?
Edit: oh, “before signing in”
Edit 2: I loaded it in an incognito window and it just says “youtube watch history is off” with no recommendations. I really think it’s you man
Might be something to do with Chrome, then. I’m on Firefox and get a filled-out home page when first loading YouTube (no history or cookies, mid-level fingerprint blocking). After logging in, I get the notification for disabled watch history, but not before.
Even before signing in, YouTube will still try to recommend what the overall household watches by matching it up to the IP address. It’s also why, when you are signed in, you sometimes get recommended what someone else in your family watches even if you don’t.
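If that’s right, the signed-out fallback probably looks something like this in the abstract. A speculative Python sketch; the profile stores and function names are hypothetical, it just illustrates what “keying recommendations on IP” would mean:

```python
# Hypothetical stores: in a real system these would be large backend services.
account_profiles = {"alice": ["cycling", "lofi"]}
household_profiles = {"203.0.113.7": ["cycling", "lofi", "bodycam"]}  # keyed by IP

def interests_for(request_ip: str, user_id: str | None) -> list[str]:
    """Prefer the signed-in account; fall back to the household (IP) profile."""
    if user_id and user_id in account_profiles:
        # Even signed-in profiles could be blended with household signals,
        # which would explain seeing a family member's topics in your feed.
        return account_profiles[user_id] + household_profiles.get(request_ip, [])
    return household_profiles.get(request_ip, [])

print(interests_for("203.0.113.7", None))     # signed out: household topics only
print(interests_for("203.0.113.7", "alice"))  # signed in: own topics + household
```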
They optimize recommendations to a large degree to induce anger and rage, because anger and rage are the most effective ways to drive platform engagement.
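In the abstract, any engagement-maximizing ranker behaves like this toy sketch. The feature names and weights are made up, and real systems use learned models rather than hand-tuned scores; the point is just that a dislike or an angry comment counts as engagement the same way a like does:

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_watch_seconds: float  # model's guess at how long you'll watch
    predicted_reactions: float      # likes, dislikes, comments, shares: all count

def engagement_score(v: Video) -> float:
    # Illustrative weights only. Note that a thumbs down raises
    # predicted_reactions exactly like a thumbs up does.
    return 1.0 * v.predicted_watch_seconds + 30.0 * v.predicted_reactions

candidates = [
    Video("calm explainer", predicted_watch_seconds=240, predicted_reactions=0.5),
    Video("rage bait", predicted_watch_seconds=180, predicted_reactions=4.0),
]
ranked = sorted(candidates, key=engagement_score, reverse=True)
print([v.title for v in ranked])  # rage bait wins despite shorter watch time
```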
Facebook does the same.
We also have no idea what measures they take to stop the system being manipulated (if any).
The far-right could be working to ensure they’re recommended as often as possible and if it just shows up as “engagement” or “impressions” on their stats, YouTube is unlikely to fight it with much enthusiasm.
It’s such a slippery slope. I avoid anything mildly right-leaning to keep my algorithm clean. I sometimes watch stuff incognito to preserve my account’s algorithm. What a world…
Subjective biases can play a huge part in stuff like this. The researchers behind this story had to go through a bunch of YouTube channels and determine whether or not they constitute extremist right-wing content.
I think it’s a safe assumption that if you took the people consuming that content and asked them whether the video they just watched was right-wing extremist content, most of them would say no.
So it’s possible that you don’t think you’re being overwhelmed with right-wing extremist content, but that somebody else looking at your viewing history might think you are.
They don’t do it to everyone. Some people get put in test groups that get ‘nice’ algorithms that don’t try to make you angry, so they can measure the effect on their revenue.
It is entirely possible that YouTube’s algorithm doesn’t see you as someone interested in right-wing rhetoric (or perhaps you’ve also downvoted such videos).
They explain how it works here (without technical details): https://blog.youtube/inside-youtube/on-youtubes-recommendation-system/
I am so sick of getting redpilled on YouTube Shorts. On the one hand, I know that having the algorithm present content from across the political spectrum is healthy for discourse in some ways, but on the other hand, I have had enough Ben Shapiro for my lifetime, thank you. Also, it doesn’t seem like the algorithm learns well from the “do not recommend this channel to me” function.
Isn’t that basically the whole point of the algorithm? Isn’t it behaving exactly as advertised?
You don’t even have to feed the algorithm to get those videos. I have my history turned off so I don’t get any suggestions on my home page anymore, but when I’m watching a video, the suggestions on the side invariably include a handful of right-wing idiots. You can sometimes see how YT might think they’re related to what I’m watching (usually retro tech stuff), but they never actually are. I rarely see the same misfires with left-wing videos.

My guess is the brain-dead right-leaning viewers put that content in the high-engagement buckets, so it just gets suggested more often. I don’t think left-leaning people engage much with left-wing media, because it’s usually boring politics that doesn’t infringe on basic human rights, and we already know how bad the right is just by seeing them suck with our own eyes; we don’t need to be told about it over and over to believe it, or to get bullshit “gotcha” material for water-cooler conversations. We also already see how the politicians on the left suck in their own special ways without needing anyone to explain it to us. So what’s the point in investing in a video when a few words, or none at all, will do the trick?

Algorithms target idiots with unfounded rage; it’s just that simple. I wouldn’t even call it an algorithm, just basic number crunching.