
>YouTube is a _disastrously_ unhealthy recommender system,

Can you explain with more details?

I use YouTube as a crowdsourced "MOOC"[0], and the algorithm usually recommends excellent follow-up videos for most topics.

(On the other hand, their attempt at matching "relevant" advertising to the video is often terrible, e.g. Sephora makeup videos shown to the male-dominated audience of audiophile gear. Leaving aside the weird ads, the algorithm works very well for educational videos that interest me.)

[0] https://en.wikipedia.org/wiki/Massive_open_online_course



Yes. Elsagate is an example - the creepy, computer-generated, violent and disturbing videos that eventually follow children's content - or the fact that just about every gaming-related video has a recommendation for a far-right rant against feminism or a Ben Shapiro screaming segment. There's also the Amazon problem, where everything related to the thing you watched once out of curiosity follows you everywhere around the site.


>Elsagate is an example,

Yes, I was aware of Elsagate.[0] I don't play games, so I didn't realize every gaming video ends up with unwanted far-right and Ben Shapiro videos.

I guess I should have clarified my question. I thought gp's "unhealthy" meant YouTube's algorithm was bad for somebody like me who views mainstream, non-controversial videos. (An analogy might be gp (rspeer) warning me that asbestos and lead paint are actually carcinogenic but the public doesn't know it.)

[0] https://news.ycombinator.com/item?id=20090157


> I don't play games, so I didn't realize every gaming video ends up with unwanted far-right and Ben Shapiro videos.

They don't. That's confirmation bias at work.


It's not 100%, but I'd consider "video games" => "Ben Shapiro" to be a pretty awful recommendation system, regardless of the reasoning behind it. As far as I know, the group "video gamers" doesn't have a political lean in either direction.

I've definitely seen this with comics. I watched a few videos criticizing Avengers: Infinity War, and now I see mostly Ben Shapiro recs. It makes no sense. I have never sought out (and never plan to seek out) political content on YouTube.


I watch a number of gaming videos and have never had a far-right video recommended. Don't know who Ben Shapiro is.

It could be the type of games involved, since I usually watch strategy, 4X, city-building, and military sims. I usually get History Channel-style documentaries or "here's how urban planning works in the real world" videos recommended, which suits me fine. Somebody whose gaming preferences involve killing Nazis in a WW2-era FPS might be more likely to get videos that have neo-Nazis suggesting we kill people.


Some of the child comments of your thread mention the Nazi problem.


But that child comment didn't link Nazis to normal "video games". I assumed he just meant some folks (e.g. "1.8%" of web surfers) with a predilection for far-right videos would get more Nazi recommendations. Well yes, I would have expected the algorithm to feed them more of what they seemed to like.

I have never seen far-right or Nazi videos in my recommendations, let alone in 1.8% of them.


Isn't that an inevitable side effect of collaborative filtering? If companies could do content-based recommendation, wouldn't they? Until purely content-based recommendations are possible, wisdom of the crowds via collaborative filtering will lump together videos that are about different things but watched by similar viewers.
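
A minimal sketch of that effect, assuming a toy watch-history dataset (the video names and the recommend() helper are hypothetical illustrations, not YouTube's actual system). Item-item collaborative filtering links videos purely by shared viewers, so a gaming video can surface a political clip without any content analysis at all:

    # Hypothetical watch histories: user -> set of videos watched.
    from collections import defaultdict
    from itertools import combinations

    histories = {
        "user_a": {"strategy_game_review", "city_builder_letsplay", "urban_planning_doc"},
        "user_b": {"strategy_game_review", "ben_shapiro_clip"},
        "user_c": {"strategy_game_review", "ben_shapiro_clip", "city_builder_letsplay"},
        "user_d": {"infinity_war_critique", "ben_shapiro_clip"},
    }

    # Count how often each ordered pair of videos shares a viewer.
    co_watch = defaultdict(int)
    for watched in histories.values():
        for a, b in combinations(sorted(watched), 2):
            co_watch[(a, b)] += 1
            co_watch[(b, a)] += 1

    def recommend(seed_video, k=3):
        """Rank candidates purely by how many viewers they share with the seed."""
        scores = {b: n for (a, b), n in co_watch.items() if a == seed_video}
        return sorted(scores, key=scores.get, reverse=True)[:k]

    # The strategy-game video pulls in the political clip simply because
    # overlapping audiences watched both; no content signal is involved.
    print(recommend("strategy_game_review"))

Real systems use learned embeddings rather than raw co-watch counts, but the failure mode is the same: the model only knows who watched what, not what the videos are about.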



