cross-posted from: https://lemmy.world/post/2044512
I made sure to clear cookies and not sign in, so I think these are the default suggestions YouTube makes.
Google is currently defending itself in the US Supreme Court against a lawsuit alleging that it assisted the terrorist group ISIS in recruiting members, after the YouTube algorithm was found to have promoted ISIS recruiting videos to young men who later carried out a terrorist attack.
So to answer your question using Google’s argument: they host so many videos that an advanced search feature is required to make the site usable. That feature only suggests things that are popular, and it’s not their fault that ISIS recruitment videos (or other violent content) are popular.
The counterargument is that Google is curating content by displaying things people didn’t seek out themselves. That is direct promotion by Google itself, and therefore Google should be treated as the publisher of that content; anyone publishing violent content should be held liable for it.
Capitalism
Yeah, what about it?
I guess it’s just often-used search terms?
Or some kind of text prediction (e.g. simple Markov chains or something more advanced) that just “thinks” this fits?
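For what it’s worth, here’s a minimal sketch of the kind of thing those two guesses describe: counting which words most often follow the last word typed, built from frequent past queries (a first-order Markov chain). The `QUERY_LOG` data and the function names are made up for illustration; whatever YouTube actually runs is far more complex and isn’t public.

```python
from collections import defaultdict, Counter

# Hypothetical query log; a real system would draw on billions of logged searches.
QUERY_LOG = [
    "how to bake bread",
    "how to bake sourdough bread",
    "how to fix a bike",
    "how to fix a flat tire",
    "best way to bake bread",
]

def build_chain(queries):
    """Count how often each word follows the previous one (first-order Markov chain)."""
    chain = defaultdict(Counter)
    for query in queries:
        words = query.split()
        for prev, nxt in zip(words, words[1:]):
            chain[prev][nxt] += 1
    return chain

def suggest(chain, prefix, n=3):
    """Suggest the n most frequent continuations of the last word typed."""
    last_word = prefix.split()[-1]
    return [word for word, _ in chain[last_word].most_common(n)]

chain = build_chain(QUERY_LOG)
print(suggest(chain, "how to bake"))  # ['bread', 'sourdough']
print(suggest(chain, "how to fix"))   # ['a']
```

Even this toy version shows why popular queries dominate the suggestions: the counts are just popularity, with no notion of what the content actually is.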