Why is youtube recommending conservative "talking points" to me?

Hi, I am a guy in my early thirties with a wife and two kids, and whenever I go on YouTube it always suggests conservative things: guys dunking on women, Ben Shapiro reacting to some bullshit, Joe Rogan, the whole works. I almost never let it take me to that type of video, and when I do it is either by accident or out of curiosity. My interests are gaming, standup comedy, memes, react videos, metallurgy, machining, and blacksmithing, with occasional songs and videos about funny bullshit. I am not from America, and I consider myself pretty liberal if I had to put it in the terms used in America. But European liberal, so by American standards a socialist. Why does it recommend this shit to me? Is this some kind of vector for radicalizing guys in my category? Do you have similar experiences?

80085,

If I use a private window and don’t log in, I get a lot of right-wing stuff. I’ve noticed it probably depends on IP/location as well. At work, YouTube seems to recommend me things other people at the office listen to.

If I’m logged in, I only get occasional right-wing recommendations interspersed with the left-wing stuff I typically like. About 1/20 videos are right-wing.

YouTube Shorts is different. It’s almost all thirst-traps and right-wing, hustle culture stuff for me.

It could also be because a lot of the people who watch the same videos you do tend to also watch right-wing stuff.

In general, the algorithm tries to boost the stuff that maximizes “engagement,” which is usually outrage-type stuff.

IDe,

The best way to tune the algorithm on Youtube is to aggressively prune your watch/search history.
Even just one “stereotypical” video can cause your recommendations to go to shit.

EssentialCoffee,

Like others have said, the things you watch are prime interests for the right wing in the US. You have to train the algorithm that you don’t want it.

GoodEye8,

I think the algorithm is so distorted by right-wing engagement that it will end up recommending right-wing content even if you actively try to avoid it. I watch YouTube Shorts and I always skip if it’s Shapiro, Peterson, Tate or Piers Morgan. I also skip the moment I feel like a short might be right-wing. Scroll enough and eventually the algorithm will go, “How about some Shapiro, Peterson, Tate or Morgan?” Give it enough time and it will always try to feed you right-wing content.

EssentialCoffee,

I suppose if you do nothing but scroll YouTube endlessly, it just starts to recommend anything to keep you on the platform.

But I’ve had an account since they started and even watch The Young Turks and C-Span here and there and almost never get anything political, let alone right wing.

My algorithm is at the point where I get Korean commercials, which is honestly fine with me.

Korne127,

Social media in general has a habit of further spreading far-right content and dragging people into those content bubbles.

erogenouswarzone,

If they piss you off, you will stay on their platform longer, and they make more money.

That is the sad truth of EVERY social network.

Lemmy might not be that advanced yet, but as soon as they get big enough to need ads to pay for bandwidth and storage, soon after they will add algorithms that will show you stuff that pisses you off.

One way to combat this is to take a break from the site. Usually after a week, when you come back it will be better for a while.

EssentialCoffee,

I think it has more to do with the stuff you watch than wanting to piss you off.

All YouTube recommends to me are videos of kpop, dog grooming, Kitten Lady, and some Friesian horse stable that went across my feed once. Oh, and some historical sewing stuff.

If they started recommending stuff that pissed me off, I wouldn’t bother going back except for direct videos linked from elsewhere.

Edit: Rereading what OP said they watch, their interests are primary interests of the right wing in the US. If they don’t train the algorithm that they don’t want it, the algorithm doesn’t know that those interests don’t intersect for them.

marmo7ade,

Lemmy might not be that advanced yet, but as soon as they get big enough to need ads to pay for bandwidth and storage, soon after they will add algorithms that will show you stuff that pisses you off.

Who is the “they” that is going to implement what you claim? And how are they going to do that, specifically? Lemmy isn’t reddit or youtube, technically. There is no central authority. Lemmy.world can’t change how this technology works just because they might want to start injecting recommendations or ads. That’s the point of this system.

erogenouswarzone,

Who is hosting this? Lemmy.ml, all the federated sites? With the Reddit exodus there is probably a lot more activity. Who’s paying for that? That’s who “they” would be in this situation, I think.

ChaoticEntropy,

I almost never allow it

The times you do allow it are all the algorithm cares about, sadly. Any kind of engagement is great for companies.

“Hate Rogan? Cool, watch some Rogan as hateporn, hate watching is still watching.”

AlexWIWA,

If you watch any kind of gaming videos and haven’t trained your algorithm, then you’ll get flooded with this shit.

kava,

I get right wing stuff only on YouTube shorts typically. And I think it’s because I’ll watch them. I find it interesting in a detached sense.

Good to know what you’re up against. Same reason I try and watch as many Trump speeches as I reasonably can.

Dohnakun,

Just ignore the recommendations, it’s mostly bullshit anyway.

omidmnz,

Unhook (addons.mozilla.org/…/youtube-recommended-videos/) "ignore"s them for me! It is available for other browsers too.

stiephel,

I’m 30 and have a small family, too. When I watch Shorts on YouTube I get the exact same content you’re describing. None of the long videos I’m watching are political, yet the algorithm keeps throwing them at me. I get a lot of Jordan Peterson crap or Lil Wayne explaining how there’s no racism. I hate it.

AFaithfulNihilist,

The Lil Wayne stuff is so strange. Usually when some specific celebrity pops up in my feed I assume their publicist is rehabbing their image after some public incident or recently exposed private conflict. YouTube Shorts isn’t like that.

YouTube Shorts seems to be exclusively a pipeline to right-wing talking points and the unfunniest parts of stand-up comedy, framed to serve the same right-wing pipeline.

whodoctor11,

Man, I would turn off activity collection and erase its history at myaccount.google.com, and do the same with everything else, like your YouTube history. Bad things still show up, but you get more things related to your subscriptions. When you do get a bad recommendation, just hit the three dots and go to “Don’t recommend this channel” (for me that is more effective than “Don’t recommend content like that”).

That’s a very common YouTube complaint, and it has a simple explanation: these alt-right folks are the top advertisers on social media. They pay millions to Google to do this.

yoz,

This can’t be any clearer:

Android TV: install SmartTube Next.
Android phone: set private DNS to NextDNS to block all ads, and install Firefox with uBlock Origin. Or, if you want an app, install NewPipe from the F-Droid app store.

Machinist,

Machining and blacksmithing are highly correlated with right wing BS in the US. Check my uname and ask me how I know. 😁

FierroGamer,

This one may not be so obvious, but I’ve seen someone lose their marbles over the crazy idea that they get recommended stuff for children when all they watch on YouTube is Minecraft videos and sometimes Roblox. And people in the comments agreed…

The YouTube subreddit wasn’t full of bright people.

SeaJ,

Your interests have a strong correlation with people on the right aside from maybe react videos.

But even if your interests were not so strongly correlated with the right, you would probably still get right-wing ads or videos suggested. They garner the highest engagement because they are often outrage porn. Google gets its money that way. My subscriptions are to left-wing political, science, and solar channels, but I still get a decent amount of PragerU and Matt Walsh ads. Reporting them does not stop them from popping up either.

cuppaconcrete,

Yeah big tech loves to throw dumb stuff your way to piss you off and keep you engaged, even if you’ve never shown an interest before.

Hikermick,

“The calls are coming from inside the house”
