Yeah, there was a period where it noticed I'd watched a few trans YouTubers, so it started promoting anti-trans content. It was around the election (I forget if before or after), but either way it disgusted me that they'd do that.
Algorithms aren't that smart. Maybe there needs to be general education about how they work, so people understand that viewing a video is like searching for a topic in Google, realize how much control they have over what they see, and use the technology appropriately.
If you watch a video about anorexia, that's like typing "anorexia" into Google, or pressing 9 on a remote control: you're getting channel 9. That's why this sounds crazy to me. It's like someone complaining their TV keeps pushing the cooking channel at them.
On a podcast I was listening to, a researcher had made an account and subscribed to just technology and sports, and within a few months it had him on manosphere stuff with Tate and those types of assholes.
I get some manosphere stuff for the same reason, but mostly ignore it. At the risk of getting beat up here, I like Joe Rogan. His celebrity interviews are entertaining, and I like wacky subjects like UFOs and such, which he covers. Plus I'm a gamer, so I frequently watch videos about guns because I find them fascinating, though I don't want to own one. So that's why I'll get recommended manosphere stuff. And I'll get curious and take a look at those videos, and get more recommendations. But when I stop watching them, they go away. My recommendations are based on whatever I'm interested in at the time.
Everybody talks about Andrew Tate, but I have never seen any of his videos. Him or MrBeast. They never get recommended. I only know about Tate because everybody everywhere else is complaining about him. So I don't know what that researcher did to get that.
Algorithms try to push body-negativity content if they think you're a teen girl, because teen girls tend to scroll more and buy more when depressed.
If they think you're a single man, they try to push manosphere stuff.
On YouTube I find my relevant suggestions last about a month before I have to mark a bunch of videos as not interested or block a bunch of channels.