Very, Very Few People Are Falling Down the YouTube Rabbit Hole | The site’s crackdown on radicalization seems to have worked. But the world will never know what was happening before that.
Bro, people were eating Tide Pods and we saw a resurgence of Nazism and white nationalism.
I think we at least know the effects of what was happening before.
Could we convince the Nazis to eat the tide pods?
…probably
They are a famously suggestible lot
…something something… making your whites whiter… they’ll get the message they’re after
Weird. Youtube keeps recommending right wing videos even though I’ve purged them from my watch history and always selected Not Interested. It got to the point that I installed a 3rd party channel blocker.
I don’t even watch many left-leaning political videos, and the ones I do watch are only tangentially political.
I think if you like economics or fast cars you will also get radical right-wing talk videos. If you like guns it’s even worse.
Oh, you like WW2 documentaries about how liberal democracy crushed fascism strategically, industrially, scientifically and morally?
Well you might enjoy these videos made by actual Nazis complaining about gender neutral bathrooms!
Nah. Cars and money have nothing to do with it. I’ve never once gotten any political bullshit, and those two topics are 60% of what I watch.
I made a fresh Google account specifically to watch daily streams from one stocks channel (the guy is a liberal) and I got cars, guns, and right-wing politics in the feed.
My general-use account’s suggestion feed is mostly camera gear, leftist video essays, and debate bro drama.
Uncle Bruce?
GET OUT OF MY HOUSE
Bargoooooons!!
I started to get into atheist programs and within a month I was getting targeted ads trying to convert me.
I started getting into motorsport recently. I just get the ID video essay on racing and videos similar to Top Gear, like Overdrive. I don’t get any right-wing stuff or guns. But I’m also in the UK, so it probably uses that too. For Americans maybe it’s like, “ah, other Americans that like fast cars also like guns, here you go.”
I’ve been watching tutorials on jump rope and kickboxing. I do watch YouTube Shorts, but lately I’m being shown Andrew Tate stuff. I didn’t skip it quickly enough, and now 10% of the things I see are right-leaning, bot-created content. Slowly, gun-related, self-defense, and Minecraft videos are taking over my YouTube Shorts.
If you don’t already, you can view your watch history and delete things.
I do that with anything not music related, and it keeps my recommendations extremely clean.
I know everyone likes to get conspiratorial about this, but it’s really just trying to get your attention any way possible. There are more popular right-wing political videos, so the algorithm is more likely to suggest them. These videos also get lots of views, so again, they’re more likely to be suggested.
Just ignore them and watch what you like
I’ve already said I installed a channel blocker to deal with the problem, but it’s still annoying that a computer has me in its database as liking right-wing shit. If it were limited to just YouTube recommendations, it would be nothing, but we’re on a slow burn to a dystopian hell. Google has no reason not to use its personality profile of me elsewhere.
I made this comment elsewhere, but I have a very liberal friend who’s German, likes German food, and is into WWII-era history. Facebook was suggesting neo-Nazi groups to him.
I watch a little Flashgitz and now I’m being recommended FreedomToons. I get that some people who like Flashgitz are going to be terrible, but I shouldn’t have to click Not Interested more than once.
What are you using to block channels?
I noticed when I went to a hotel that the recommended videos for a logged-out user were drastically different than my own. For example, I always found it a bit odd that MrBeast is the #1 person on YouTube, yet I almost never get recommended his videos, but they were all over the TV in the hotel.
I decided to try a hard reset. I deleted my entire watch history to start at 0 again. I also deleted all but maybe 5 of my subscriptions. Almost nothing changed.
Indicating “not interested” shows engagement on your part. Therefore the algorithm provides you with more content like that so that you will engage more.
You can try blocking the channel, which has mixed results for the same reason, or closing youtube and staying away from it for a few hours on that account.
If clicking Not Interested increases the likelihood of getting more of the same, then that’s all the more reason to run ad blockers.
The Channel Blocker is a 3rd party tool. It just hides the channel from view. Google shouldn’t know I’m doing it.
I don’t know if this is accurate or not, but it’s the most nonsensical thing I’ve heard in a while. If engaging with something to say “I don’t want to see this” results in more of that content, the user will eventually leave the platform. I’m having this concern right now with my Google feed. I keep clicking Not Interested, yet continue getting similar content. Consequently, I’m increasingly leaning toward disabling the functionality because I’m tired of fucking seeing shit I don’t care to see. Getting angry just thinking about it.
I can only offer my own experience as evidence, but this is what I was advised to do (stop engaging by not selecting anything) and it worked. Prior to that I kept getting tons of stuff that I didn’t want to see, but it stopped within a few days once I stopped engaging with it. And I agree, it is infuriating.
Since I got this advice from someone else, I assume it has worked for others too.
All my YouTube recommendations went downhill about 3 years ago. I am bombarded by right-wing Christian stuff no matter how many times I flag and complain.
I’m bombarded by Joe Rogan stuff. I keep blocking the channels but there is an endless stream of them
I always downvote, then block the channel whenever I get those. However, I think the mere act of going to the button to block the channel instead of just scrolling on immediately is telling the algorithm that I want more of that kind of video.
Watching a bit of BreadTube stuff, I feel like the algorithm can’t tell whether a video is for or against that kind of content, so I get recommended videos in favor of whatever I dislike instead of videos against it.
I am also suspecting that downvoting or blocking is somehow interpreted as “engaged with the content, so let’s shove more of it.”
Weirdly, YouTube’s algo propelled me down the Pinko-commie anarcho-socialist boy-we-suck-at-democracy rabbit hole. I was an avid BreadTuber long before I ever heard the name BreadTube.
Yeah, it started for me during Covid when I felt like I needed long-form podcasts/streamers in the background for noise while working from home. I think my progression was from The Worst Year Ever -> Chapo Trap House -> It Could Happen Here -> PhilosophyTube -> ContraPoints -> Vaush. TBF, I’ve been a leftist since before YouTube existed, probably starting with Chomsky, Einstein’s article, and random pirated documentaries.
I keep getting ‘rescue’ animal videos, which involve people purposely putting puppies and kittens in distressing situations so they can ‘save’ them. It’s sick, and no matter how often I block and report those videos they reappear the next month. I also get a lot of ‘police shooting people’ videos, which I also try to block.
I think it’s just a matter of fine-tuning your preferences. I haven’t had an irrelevant video recommended to me in the last few years. All the recommendations have been great: retro hardware reviews, video game gameplay guides, science videos, and other informational/engineering stuff.
Because TikTok replaced YouTube in that regard.
Who did these stats? I’m getting more right-wing propaganda than ever. Also, Facebook is just as bad as ever. I really like stuff like the Fediverse, since I can control my feed.
Me, too. I’m always recommended Joe Rogan or Jordan Peterson videos, with a sprinkling of Ben Shapiro. I even got someone claiming the Holocaust was overblown (I reported them). All within the past few months.
I don’t get recommended regular videos like that, but youtube shorts are full of that garbage. I suspect it’s a blind spot
If it’s true that they have closed the radicalization rabbit hole then that is a huge achievement and very very good news.
Now that they’ve entrenched an entire alternative universe in an election-winning proportion of the population, they don’t need it anymore.
Unless YouTube is going to be deliberately directing people to deprogramming content it’s too late.
A lot of damage is done, certainly, but I think any success they have will depend on keeping up this bullshit. New voters are growing up all the time. The less chance for them to fall down the QAnon conspiracy after they just wanted to find some video game guide content, the better.
I still fall down YouTube rabbit holes all the time, just not ones for radicalization, which I don’t think I’ve ever actually seen.
Wait what? Maybe I’m misunderstanding, but this is what I got out of the article:
“We had anecdotes and preliminary evidence of a phenomenon. A robust scientific study showed no evidence of said phenomenon. Therefore, the phenomenon was previously real but has now stopped.”
That seems like really, really bad science. Or at least, really really bad science reporting. Like, if anecdotes are all it takes, here’s one from just a few weeks ago.
I left some Andrew Tate-esque stuff running overnight by accident and ended up having to delete my watch history to get my homepage back to how it was before.
I was aware of this study when they presented it virtually (can’t remember where), and while I don’t have an issue with their approach and results, I’m more concerned about the implications of these numbers. The few percent that were exposed to extremist content may seem small, but scaling that up to the population level, that is personally worrisome to me… The impact of the few very, very bad apples can still be catastrophic.
Never had this problem, never used YT while logged on. Just incognito and careful search keywords. If the algo recommended something sus, immediately close your incognito session and open a new one.
Removed by mod
I HAVE THE SHINEYIST MEAT BYCYCLE
What a load of dog dung that article is. Justifying censorship, labelling everything that is not liked by some political “expert” as far-right extremism.
When an algorithm is involved, things change. These aren’t static websites that only get passed around by real people. This is some bizarre pseudo-intelligence that thinks if you like WWII history and bratwurst, you’d also like neo-Nazi content. That’s not an exaggeration. One of my very left-leaning friends started getting neo-Nazi videos suggested to him, and I suspect it was for those reasons.
Also, youtube isn’t a free speech platform. It’s an advertisement platform. Fediverse is a free speech platform, although it’s free speech for the person paying the hosting bills.
Censorship of online content is good, but simultaneously the censorship of sexually explicit books in elementary schools is evil.
Neat.
deleted by creator
Solid counterargument.