Why is youtube recommending conservative "talking points" to me?

Hi, I'm a guy in my early thirties with a wife and two kids, and whenever I go on YouTube it suggests conservative things: guys dunking on women, Ben Shapiro reacting to some bullshit, Joe Rogan, the works. I almost never click on that type of video, and when I do it's either by accident or out of curiosity. My interests are gaming, standup comedy, memes, react videos, metallurgy, machining, and blacksmithing, with the occasional song or video about funny bullshit. I'm not from America, and I'd call myself pretty liberal if I had to put it in American terms. But European liberal, so by American standards a socialist. Why does it recommend this shit to me? Is this some kind of vector for radicalizing guys in my category? Do you have a similar experience?

angrymouse,

You don't need to watch something to be bombarded with similar content. YouTube recommends things that are watched by people who watch the things you watch (sorry about that). It also seems to factor in overall popularity, at least for me, so it often recommends stupid popular right-wing stuff just because it's popular overall and happens to be watched by a lot of people who also watch Dota 2, for example. I had to disable YouTube's use of my watch history to suggest content; my front page is now full of things I've already watched from my subscriptions, but for me that's better than YouTube's stupid suggestions.

reddig33,

Could be paid promotion. I get a lot of suggestions in my feed for some really awful music in genres that I never listen to. I wouldn’t be surprised if the record label is paying to put it there.

ablackcatstail,

I'm guessing you probably viewed enough of these videos that YouTube's dumb algorithm is like, "Oh hey, @V01t45 wants to see right-wing stuff, so let's show him that." I agree that it's very annoying. This is why we need to rally behind PeerTube and cancel YouTube.

const_void,

Because ‘conservative’ content gets a lot of engagement (i.e. ad money). The more they recommend it, the bigger the audience and the bigger the ad payout. They’re literally monetizing hate.

notenoughbutter,

You can try resetting your advertising ID.

Search on DuckDuckGo for how to do it.

NAS89,

Follow-up question: why won’t YouTube Shorts trust me when I say I don’t want to be recommended certain channels?

So…many…dancing…trends. And they keep sending more.

NightOwl,

The YouTube algorithm only cares about engagement, not whether that engagement is a like or a dislike. It's neutral about likes versus dislikes; it only cares that people are actively leaving them. Whether it comes from liking with joy or disliking with anger, engagement is a signal to show more. I've heard that, contrary to what you'd expect, it's better to just skip immediately and not press anything signalling a reaction, since any interaction gets read as a reason to show more of that content.
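
To make that concrete, here's a toy sketch of engagement-style scoring (my own illustration with made-up weights, not YouTube's actual system): a like and an angry dislike both add to the score, while skipping quickly and touching nothing keeps it low.

```python
# Toy illustration of "engagement is engagement", not YouTube's real scoring.
# Both liking and disliking add to the score; only a quick skip stays low.
def toy_engagement_score(watch_seconds: float, liked: bool,
                         disliked: bool, commented: bool) -> float:
    score = watch_seconds / 60            # watch time always counts
    score += 1.0 if liked else 0.0        # positive reaction: engagement
    score += 1.0 if disliked else 0.0     # negative reaction: still engagement
    score += 2.0 if commented else 0.0    # commenting is strong engagement
    return score

# Skipped after 5 seconds, no buttons pressed: barely registers.
print(toy_engagement_score(5, liked=False, disliked=False, commented=False))
# Watched 5 minutes, angrily disliked and commented: strong "show more" signal.
print(toy_engagement_score(300, liked=False, disliked=True, commented=True))
```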

NAS89,

Yeah, I can understand that for likes/dislikes or comments saying "this is dumb", but after hundreds of "do not recommend this channel", the algorithm should be able to tell I have no interest in that kind of content.

eric5949,

Even if you watch left-wing videos, sometimes YouTube just goes "oh, you're a young-ish male who's into politics? Here's some nazi vids you'll like." ESPECIALLY if the left-wing videos talk about righties.

Contramuffin,

I would wager a guess that it's your regular interests. YouTube sees that people who like machining, blacksmithing, etc. have a good chance of also being conservative. You're probably just one of the odd cases where you like those hobbies but aren't conservative.

Your post raises an interesting point, though: even if YouTube didn’t intend for their algorithm to be a pipeline for radicalism, simply by encouraging engagement and viewership, their algorithm ends up becoming a radicalization pipeline anyways.

The_Picard_Maneuver,

My YouTube recommendations are usually spot on, but I do get Joe Rogan videos sometimes. I could see people sliding into radicalizing garbage easily from there. Rogan’s so big that he gets cool guests, but they’re wasted on him as a host.

Monkeyhog,

You need to learn how to properly prune your feed. I got some of that stuff briefly, but I kept blocking it and choosing "not interested", and eventually it stopped. My feed shows me nothing I don't want now. It's just a matter of shaping it into what you want.

ImplyingImplications,

metallurgy, machining, blacksmithing

That could be it. I don't know YouTube's algorithm, but these systems typically work by finding which other users watch the videos you watch and recommending the other videos those people also watched. I wouldn't be surprised if the guys watching blacksmithing videos also tend to watch Joe Rogan and the like.
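
Something like this toy co-occurrence sketch (made-up data and my own guess at the shape of it, not YouTube's real recommender): anyone who overlaps with you on a few videos ends up feeding you whatever else they watch.

```python
from collections import Counter

# Made-up watch histories, just to illustrate "people who watch what you
# watch also watch X"; not YouTube's real recommender.
histories = {
    "alice": {"blacksmithing", "machining", "joe rogan"},
    "bob":   {"blacksmithing", "joe rogan", "ben shapiro"},
    "carol": {"machining", "standup comedy"},
    "dave":  {"blacksmithing", "machining"},   # the OP-like user
}

def recommend(user: str, histories: dict, top_n: int = 3) -> list:
    mine = histories[user]
    counts = Counter()
    for other, videos in histories.items():
        if other == user or not (mine & videos):
            continue                      # only users who overlap with you
        for video in videos - mine:
            counts[video] += 1            # co-watched by someone "like you"
    return [video for video, _ in counts.most_common(top_n)]

# Dave only watches metalworking content, but because the overlapping users
# also watch Joe Rogan, that's what floats to the top of his recommendations.
print(recommend("dave", histories))       # ['joe rogan', ...]
```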

Lemmylefty,

I’d argue that gaming and memes are much bigger contributors.

NightOwl,

Algorithms. It’s why I use NewPipe on mobile and FreeTube on desktop. Both let me import and export my subscription list, block ads, and have SponsorBlock support.

But most importantly, neither uses Google's recommendation algorithm, so there's no chance of it misinterpreting my past viewing history and pushing radical content at me.

jballs,

You can see what Google (thinks it) knows about you.

  • Go to your Google Account (myaccount.google.com)
  • Select "Manage your Google Account"
  • Select "Privacy and personalisation"
  • On the Data & privacy page you'll find History settings, Ad settings, and more
  • For example, go to Ad settings and click on Ad personalization
  • Now you'll see how your ads are personalized

I think you can even remove stuff if you want.
