- cross-posted to:
- technology@lemmy.world
YouTube and Reddit are sued for allegedly enabling the racist mass shooting in Buffalo that left 10 dead::The complementary lawsuits claim that the massacre in 2022 was made possible by tech giants, a local gun shop, and the gunman’s parents.
It used to be video games and movies taking the blame. Now it’s websites. When are we going to decide that people are just bat shit crazy and guns need some form of regulation?
Because every gun owner thinks they are “the good guys”
Usually from their perspective they are. Most people don’t try to be bad.
I can see the nuance in an argument that an online community, unmoderated, could be using an algorithm to group these violent people together and amplifying their views. The same can’t really be said for most other platforms. Writing threats of violence should still be taken seriously over the internet, especially if it was later acted upon. I don’t disagree with you that there’s a lot of bat shit crazy out there though.
It is harder to get a nail salon license in many states than to accumulate an arsenal.
I don’t know man, sounds a bit too much like sense to me.
but muh rights to go pew pew!
/s just in case not clear…
It’s not popular nowadays to mention that people need to have self-accountability; there’s always, apparently, a website, service, game, or social media platform to “blame” for the actions of the individual.
How is self accountability incompatible with systemic issues?
Guns have more legislation written about them than nearly any other product. They are heavily regulated. They are not effectively regulated however.
This ineffectiveness is directly due to NRA lobbying, and their zero-tolerance attitude towards any new gun legislation. Any gun-friendly lawmaker who even gets close to writing gun control legislation will end up getting harassed (and likely primaried in the next election). So when gun control legislation passes, it’s inevitably written by people who don’t understand guns at all. No wonder it’s all shit!
Maybe now that the NRA is having financial difficulties, legislators will have more leeway to enact things that might have a chance of working.
The thing about bat shit crazy people is that they don’t need guns to be violent, they will find another way.
Fantastic. I’ve been waiting to see these cases.
Start with a normal person, get them all jacked up on far right propaganda, then they go kill someone. If the website knows people are being radicalized into violent ideologies and does nothing to stop it, that’s a viable claim for wrongful death. It’s about foreseeability and causation, not about who did the shooting. Really a lot of people coming in on this thread who obviously have no legal experience.
I just don’t understand how hosting a platform to allow people to talk would make you liable since you’re not the one responsible for the speech itself.
Is that really all they do though? That’s what they’ve convinced us that they do, but everyone on these platforms knows how crucial it is to tweak your content to please the algorithm. They also do everything they can to become monopolies, without which it wouldn’t even be possible to start on DIY videos and end on white supremacy or whatever.
I wrote a longer version of this argument here, if you’re curious.
Which article is it? The link takes me to the website main page.
Huh really? Do you have JS turned off or anything? Here’s the full link: https://theluddite.org/#!post/section-230
I agree to a point, but think that depending on how things are structured on the platform side they can have some responsibility.
Think of facebook. They have algorithms which make sure you see what they think you want to see. It doesn’t matter if that content is hateful and dangerous, they will push more of that onto a damaged person and stoke the fires simply because they think it will make them more advertisement revenue.
They should be screening that content and making it less likely for anyone to see it, let alone damaged people. And I guarantee you they know which of their users are damaged people just from comment and search histories.
I’m not sure if reddit works this way; due to the upvote and downvote systems, it may be more so the users who decide the content you see. But reddit has communities which they can keep a closer eye on to prevent hateful and dangerous content from being shared.
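For what it’s worth, reddit’s old ranking code was open source, and its “hot” sort really was mostly vote-driven: votes plus age decay, nothing about the content itself. A rough Python sketch of that published formula (simplified; whatever reddit runs today surely layers more on top):

```python
from datetime import datetime, timezone
from math import log10

# Simplified sketch of reddit's once-open-source "hot" ranking.
# Score is driven purely by votes, with newer posts favored.
# This mirrors the published formula, not reddit's current internals.
EPOCH = datetime(2005, 12, 8, 7, 46, 43, tzinfo=timezone.utc)

def hot(ups: int, downs: int, posted: datetime) -> float:
    score = ups - downs
    order = log10(max(abs(score), 1))              # vote magnitude, log-scaled
    sign = 1 if score > 0 else -1 if score < 0 else 0
    seconds = (posted - EPOCH).total_seconds()     # age term: newer = higher
    return round(sign * order + seconds / 45000, 7)
```

Note that nothing in the formula inspects what the post says; highly upvoted and newer posts simply float up, which is why “the users decide” is a fair description of this kind of ranking.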
We should get the thought police in on this also, stop it before it has a chance to spread. For real though, people need to take accountability for their own actions and stop trying to deflect it onto others.
They set the culture.
Did reddit know people were being radicalized toward violence on their site and did they sufficiently act to protect foreseeable victims of such radicalization?
a viable claim for wrongful death
Something tells me you’re not a lawyer.
Something tells me you’re wrong and not a lawyer.
Really a lot of people coming in on this thread who obviously have no legal experience.
Like you
Say what you want about youtube and reddit but if you want them to censor more and more you are creating a sword that can be used against you too. I also don’t like the idea of shooting the messenger no matter how much we may dislike the messages. When I hear lawsuits like this I always think it is greedy lawyers pushing people to sue because they see deep pockets.
Right, so then they should be operated as a public telecom and be regulated as Title II. This would allow them to be free from such lawsuits.
However, they want to remain as private for profit companies so they should be held responsible for not acting responsibly.
It doesn’t make sense to treat websites as utilities. Net neutrality can’t be applied to websites; it would make most basic spam filtering infeasible and blow up operational costs.
You’re right. I was wrong. There is a big difference between websites and ISPs, and in my eagerness to respond I skipped that basic understanding.
I feel like there should be basic policing of the most horrific things, e.g. child porn. But you’re right, it’s impossible for websites to filter everything out in a timely manner.
I agree
Last I heard they’re already covered under Safe Harbor laws and are protected.
US federal law CDA section 230
https://www.law.cornell.edu/uscode/text/47/230
Section ‘C’.
And by holding sites like YouTube accountable, I am giving them a gun that can shoot me. It’s a double-edged sword that can be used to hurt me no matter what we do.
The algorithm feeds on fear and breeds anger. This much is true.
YouTube, named with parent companies Alphabet Inc. and Google, is accused of contributing to the gunman’s radicalization and helping him acquire information to plan the attack. Similarly, the lawsuits claim Reddit promoted extreme content and offered a specialized forum relating to tactical gear.
Yeah this is going nowhere.
- RMA Armament is named for providing the body armor Gendron wore during the shooting.
No he bought it.
- Vintage Firearms of Endicott, New York, is singled out for selling the shooter the weapon used in the attack.
Not their issue he passed the background check.
- The lawsuit claims Mean LLC manufactured an easily removable gun lock, offering a way to circumvent New York laws prohibiting assault weapons and large-capacity magazines.
Any knob w/ a dremel can make a gun full auto, let alone defeating a mag lock. And he broke NY law doing this.
- YouTube, named with parent companies Alphabet Inc. and Google, is accused of contributing to the gunman’s radicalization and helping him acquire information to plan the attack.
This is just absurd.
My guess is they are hoping for settlements vs going to trial where they lose.
Only responding to the last point, but if they can prove that Google somehow curated his content to push him towards fringe, terroristic websites, they could be found liable as a civil suit.
Any basic “you may like this” algorithm can produce those results.
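That’s the point worth dwelling on: even a bare-bones “users who watched X also watched Y” recommender can walk someone toward fringe content without anyone designing it to. A toy co-occurrence sketch (the watch histories and channel names are entirely made up for illustration):

```python
from collections import Counter

# Toy "users who watched X also watched Y" recommender.
# watch_histories is hypothetical illustrative data, not anything real.
watch_histories = [
    ["ww2_docs", "tank_battles"],
    ["ww2_docs", "tank_battles", "nationalist_rants"],
    ["tank_battles", "nationalist_rants", "extremist_channel"],
    ["nationalist_rants", "extremist_channel"],
]

def recommend(seed: str, seen: set) -> str:
    # Recommend the unseen video most often co-watched with the seed.
    co = Counter()
    for history in watch_histories:
        if seed in history:
            co.update(v for v in history if v not in seen)
    return co.most_common(1)[0][0]

def walk(start: str, steps: int) -> list:
    # Follow the top recommendation repeatedly, never repeating a video.
    seen, path, current = {start}, [start], start
    for _ in range(steps):
        current = recommend(current, seen)
        seen.add(current)
        path.append(current)
    return path
```

Each individual hop is locally innocuous ("people who watched this also watched that"), yet following the chain here drifts from documentaries to an extremist channel; no step required intent, only co-occurrence.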
deleted by creator
Oh you watch WWII videos because you like hearing about how liberal democracy stomped fascism with superior tactics, weapons and intelligence?
Here’s some videos by actual fascists! Women are the patriarchy!
Oh you like videos about Cold War Russia and espionage?
How about this video about why Ukraine is run by Jewish paedophile Nazis?
deleted by creator
This is really, really stupid.
interesting… whether the sites will be found liable…. it’s pretty unlikely, but it sure does shine a spotlight on how each are magnets for alt-right crazies. I wonder if that will have any effect on their moderation?
I doubt it.
They’re also “magnets” for progressive, liberal, conservative and all other crazies and normal people. That’s mostly because everyone uses them. It’s the most popular video sharing site and (one of?) the most popular social media site.
yeah, but progressives and liberals and all other “crazies and normal people” aren’t the ones committing mass shootings all the time.
Right, but since YouTube and Facebook are two of the most popular sites in the world, they aren’t really just magnets for alt-right crazies, since they appeal to almost everybody.
right, but “everybody” aren’t the ones committing mass shootings all the time. that’s an alt-right crazies problem.
I didn’t say they were. Facebook and YouTube didn’t commit the shootings, and there isn’t anything particularly special about them that would disproportionately attract the alt-right crazies. They’re not hate sites.
there isn’t anything particularly special about them that would disproportionately attract the alt-right crazies
lmao… that’s a good one
YouTube’s algorithm seems to be funneling people to alt-right videos
Feeding Hate With Video: A Former Alt-Right YouTuber Explains His Methods
‘Carol’s Journey’: What Facebook knew about how it radicalized users
‘It let white supremacists organize’: the toxic legacy of Facebook’s Groups
this is just scratching the surface…
a great video essay on the subject:
The Trump supporters like to bitch that Facebook has been censoring their opinions, especially during 2020 and 2021. They felt the same way about Twitter until Elon turned it into a hell hole.
They aren’t being sued for being “magnets.”
💀
Ahh one of those “We’re mad and we don’t have anyone to be angry with.” style lawsuits. Pretty much the Hail Mary from a lawyer who is getting their name in the paper but knows it won’t go anywhere.
“Easy to remove gun lock”: that theory has been tried multiple times and usually fails. A gun lock doesn’t seem related to assault weapons or large-capacity magazines, but who knows what they mean. Even when a gun is “easily modifiable,” it’s usually not treated as illegal, because someone has to actually make those modifications. The same will probably be the case for the kevlar (at the time of the shooting it was legal).
Youtube contributing to radicalization is a laugh; it’s an attempt to get their name in the papers and will be dismissed easily. They’d have a better chance naming the channels that radicalized him, but First Amendment rights would be near absolute there. Besides which, “radicalization” isn’t the same as a conspiracy or orders. It’s the difference between someone riling up a crowd until they’re in a fervor that ends in a riot, and someone specifically telling people how to riot and who to target. (Even if both can be tried as crimes, one is a conspiracy, one is not, and “radicalization” is neither.) Even “I wish someone would go shoot up …” would be hyperbole and thrown out as well. It’s pretty hard to break First Amendment protections in America (and that’s a good thing; if you think it’s not, imagine the other party in power wanting to squash your speech… yeah, let’s keep that amendment in place).
The same will be the case against Facebook for all the same reasons.
If you think Google should be responsible, then you think the park someone is radicalized in should be responsible for what’s said in it, or the email provider responsible for every single piece of mail sent on it, even though it might not have access to see that mail. It’s a silly idea, even assuming they could do that. Maybe they’re hoping to scare Google into changing its algorithm, but I doubt that will happen either.
The case against the parents is another one people try again and again… unless there’s more than they’re saying, you still can’t sue someone for being a bad parent. Hell, there’s a better case against the parents of Ethan Crumbley, and even that case is still pretty shaky, and involved the parents actively ignoring every warning sign and buying the kid the gun. Here there’s nothing that seems pinnable on the parents.
You know it sucks, and I know there are a lot of hurt people, but lawsuits like this are rolling the dice. History pretty much shows they’re hoping for a one-in-a-million shot at getting lucky, and they won’t, because it’s one in a million; and even if they do win, they’d have to hope it isn’t overturned.
You don’t know what you’re talking about and it’s obvious.
You’re not a lawyer, right?
I have a feeling no one here has ever run a website in their life.
I used to think censorship worked. Now I think that just encourages troubled individuals to find an even worse echo chamber somewhere on the internet.
I don’t know what the right answer is regarding some of the parties in these lawsuits, I just see more and more stuff get censored and it never seems to get any better.
It’s like spraying insect repellent: it just pushes the roaches to hide deeper in your house and spread around.
deleted by creator
The gun store owner couldn’t have known that any gun he’d sell would be used, within moments, to take innocent lives.
Hundreds, thousands of deaths due to gun violence committed right after the gun was bought would disagree with you
Idk about this suit but let’s not forget how Facebook did actually in fact get a fascist elected president.
https://www.wired.com/2016/11/facebook-won-trump-election-not-just-fake-news/
He was treated like a joke candidate by the Democrats at the time. Facebook didn’t get him elected, Hillary ran a weak campaign and didn’t take the threat seriously. He used FB for fundraising and she could’ve done the same thing if she wanted to.
Can’t see how the lawsuit against the tech giants gets past Section 230, which is unfortunate, as Spez and the people who run Youtube willfully helped enable and encourage this shooter.