- cross-posted to:
- technology@lemmy.world
YouTube and Reddit are sued for allegedly enabling the racist mass shooting in Buffalo that left 10 dead: The complementary lawsuits claim that the massacre in 2022 was made possible by tech giants, a local gun shop, and the gunman’s parents.
I have a feeling no one here has ever run a website in their life.
Idk about this suit, but let’s not forget that Facebook did in fact help get a fascist elected president.
https://www.wired.com/2016/11/facebook-won-trump-election-not-just-fake-news/
He was treated like a joke candidate by the Democrats at the time. Facebook didn’t get him elected; Hillary ran a weak campaign and didn’t take the threat seriously. He used FB for fundraising, and she could’ve done the same thing if she wanted to.
The village (community/lack of community) makes the villains. Everyone’s a problem. We are all to blame.
That. Or, just spitballing here, it’s the guns.
I’m not saying nothing should be done about the gun situation. But I believe it’s the community driving these kids to want a gun to kill people. Gun laws are just one of many problems that are part of our broken community. Part of what I mean is that the guns are a result of a broken community. Banning guns alone, in my eyes, is an extremely oversimplified band-aid fix. Tbh, these days I see the gun debate as crooked politics just trying to get votes… they want that free publicity.
Edit: a politician is never going to speak negatively about the general community. They can’t; it would kill their career. I think that’s a big part of why nothing changes. Politics is money and business; it’s like gang life for white collars.
I’m not even sure what you mean when you say community, or who’s part of it.
No problem. It’s hard to talk about this stuff without generalizing. I’m at work, I can’t really get into it.
The lawsuit claims Mean LLC manufactured an easily removable gun lock, offering a way to circumvent New York laws prohibiting assault weapons and large-capacity magazines.
This seems like the only part of the suits that might have traction. All the other bits seem easy to dismiss. That’s not a statement on whether others share responsibility, only on what seems legally actionable in the US.
Here’s an install video of what I assume was the product in question based on the named LLC. https://youtu.be/EjJdMfuH9q4
Shy of completely destroying the lock and catch system by drilling the mechanism, I don’t see an effective way of removing it.
I don’t think it’d meet the court’s standards for “easily removable”, given that it’d require power tools and would permanently alter the device in an irreversible way.
Here is an alternative Piped link(s): https://piped.video/EjJdMfuH9q4
https://piped.video/EjJdMfuH9q4
Piped is a privacy-respecting open-source alternative frontend to YouTube.
I’m open-source, check me out at GitHub.
- RMA Armament is named for providing the body armor Gendron wore during the shooting.
No, he bought it.
- Vintage Firearms of Endicott, New York, is singled out for selling the shooter the weapon used in the attack.
Not their issue; he passed the background check.
- The lawsuit claims Mean LLC manufactured an easily removable gun lock, offering a way to circumvent New York laws prohibiting assault weapons and large-capacity magazines.
Any knob w/ a dremel can make a gun full auto, let alone defeat a mag lock. And he broke NY law doing this.
- YouTube, named with parent companies Alphabet Inc. and Google, is accused of contributing to the gunman’s radicalization and helping him acquire information to plan the attack.
This is just absurd.
My guess is they are hoping for settlements vs going to trial where they lose.
Next they’ll announce that they’re suing Disney because he watched the History Channel, which had violence on it that contributed to his actions.
Only responding to the last point, but if they can prove that Google somehow curated his content to push him towards fringe, terroristic websites, they could be found liable in a civil suit.
Any basic “you may like this” algorithm can produce those results.
deleted by creator
That is legitimately a problem.
For some reason, YouTube’s algorithm heavily favors extremist content if you show even a casual interest in related material.
It’s probably as simple as “shocking content gets more clicks”, but still, it’s not good for our society to have entertainment platforms recommending extremist views.
In the old days, you’d have to seek out this kind of fringe content on your own. And you’d get pushback from your community if you started talking nonsense.
Nowadays, my aunt is getting blasted with reptilian democrat stuff after showing an interest in typical conservative lady content years ago. And there is not much of a community left to help her out. The algorithms just amplify all the worst shit.
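To make the “shocking content gets more clicks” mechanism concrete, here’s a toy sketch of an engagement-only recommender. The item names and click-through rates are entirely made up; the point is that nothing in it looks at what the content is, only at what gets clicked:

```python
# Toy sketch: rank items purely by observed click-through rate.
# All names and numbers here are hypothetical.
click_through = {
    "gardening tips":       0.03,
    "mainstream news":      0.05,
    "outrage compilation":  0.12,  # shocking content gets more clicks...
    "conspiracy deep-dive": 0.15,  # ...so it floats to the top
}

def recommend(ctr, n=2):
    """Return the n items with the highest engagement, nothing more."""
    return sorted(ctr, key=ctr.get, reverse=True)[:n]

print(recommend(click_through))
# ['conspiracy deep-dive', 'outrage compilation']
```

No one has to program in a preference for extremism; ranking purely by engagement produces the drift on its own.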
Oh you watch WWII videos because you like hearing about how liberal democracy stomped fascism with superior tactics, weapons and intelligence?
Here’s some videos by actual fascists! Women are the patriarchy!
Oh you like videos about Cold War Russia and espionage?
How about this video about why Ukraine is run by Jewish paedophile Nazis?
Say what you want about youtube and reddit, but if you want them to censor more and more, you are creating a sword that can be used against you too. I also don’t like the idea of shooting the messenger, no matter how much we may dislike the messages. When I hear about lawsuits like this, I always think it’s greedy lawyers pushing people to sue because they see deep pockets.
And by holding sites like youtube accountable, I am handing over a gun that can shoot me. It’s a double-edged sword that can be used to hurt me no matter what we do.
Right, so then they should be operated as a public telecom and be regulated as Title II. This would allow them to be free from such lawsuits.
However, they want to remain as private for profit companies so they should be held responsible for not acting responsibly.
It doesn’t make sense to treat websites as utilities. Net neutrality can’t be applied to websites; it would make even basic spam filtering infeasible and blow up operational costs.
You’re right. I was wrong. There is a big difference between websites and ISPs, and in my eagerness to respond I skipped that basic understanding.
I feel like there should be basic policing of the most horrific things, e.g. child porn. But you’re right, it’s impossible for websites to filter everything out in a timely manner.
Last I heard they’re already covered under Safe Harbor laws and are protected.
US federal law CDA section 230
https://www.law.cornell.edu/uscode/text/47/230
Section (c).
I agree
YouTube, named with parent companies Alphabet Inc. and Google, is accused of contributing to the gunman’s radicalization and helping him acquire information to plan the attack. Similarly, the lawsuits claim Reddit promoted extreme content and offered a specialized forum relating to tactical gear.
Yeah this is going nowhere.
deleted by creator
Fantastic. I’ve been waiting to see these cases.
Start with a normal person, get them all jacked up on far-right propaganda, then they go kill someone. If the website knows people are being radicalized into violent ideologies and does nothing to stop it, that’s a viable claim for wrongful death. It’s about foreseeability and causation, not about who did the shooting. Really a lot of people coming in on this thread who obviously have no legal experience.
Really a lot of people coming in on this thread who obviously have no legal experience.
Like you
a viable claim for wrongful death
Something tells me you’re not a lawyer.
Something tells me you’re wrong and not a lawyer.
Does remindmebot exist on Lemmy? I’d be very interested in a friendly wager.
Loser has to post a pic in a silly shirt!
I don’t know but I’m 3 for 3 on these.
I bet that the Supreme Court would uphold the ATF interpretation on the bump stock ban, that appeals courts would find a violation of the 1A where Trump and other political figures blocked constituents on social media, and that Remington would be found liable in the Sandy Hook lawsuit on a theory not wholly dissimilar from the one we’re talking about here. I’m pretty good at novel theories of liability.
What silly shirt will you wear?
Mine will say “I’m a T-Rex stuck in a woman’s body”
I am not, in fact, a woman. It’s a hoot.
Mine will say “Novel theories of civil liability are not my bag, baby!”
In fact they are.
It’s a date! No remindmebot but I’ll bookmark it.
I just don’t understand how hosting a platform to allow people to talk would make you liable since you’re not the one responsible for the speech itself.
Tell that to the admins of lemmy.world defederating from communities because they may be held liable for what shows up on their website.
You mean the cowards who are already operating under a safe-harbor provision of the DMCA?
Sure? I mean I think so. 🤔
We should get the thought police in on this also, stop it before it has a chance to spread. For real though, people need to take accountability for their own actions and stop trying to deflect it onto others.
They set the culture.
Did reddit know people were being radicalized toward violence on their site and did they sufficiently act to protect foreseeable victims of such radicalization?
Because you are responsible for hiring psychologists to tailor a platform to boost negative engagement, and now there will be a court case to determine culpability.
Reddit is going to have to make the argument that it just boosts “what people like” and it just so happens people like negative engagement.
And I mean it’s been known for decades that people like bad news more than good news when it comes to attention and engagement.
They probably will make that argument, but that doesn’t instantly absolve them of legal culpability.
Is that really all they do, though? That’s what they’ve convinced us they do, but everyone on these platforms knows how crucial it is to tweak your content to please the algorithm. They also do everything they can to become monopolies, without which it wouldn’t even be possible to start on DIY videos and end up on white supremacy or whatever.
I wrote a longer version of this argument here, if you’re curious.
Which article is it? The link takes me to the website main page.
Huh really? Do you have JS turned off or anything? Here’s the full link: https://theluddite.org/#!post/section-230
Hmm not sure. I use a client called Memmy for browsing Lemmy. Copy and pasting the link in my browser worked. Thanks!
I bet memmy cuts off the URL at the “#!” for some reason. I’ll submit a bug report to their repo.
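If that’s the bug, it’s easy to sketch (just a guess; I haven’t read Memmy’s code): everything after “#” is the URL fragment, so a client that parses the link and rebuilds it without the fragment silently drops hash-based routes like “#!post/section-230”:

```python
# Hypothetical failure mode, not Memmy's actual code: hash-based routes
# live entirely in the URL fragment, so dropping the fragment loses them.
from urllib.parse import urlsplit

url = "https://theluddite.org/#!post/section-230"
parts = urlsplit(url)

print(parts.fragment)                        # !post/section-230
print(parts._replace(fragment="").geturl())  # https://theluddite.org/
```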
This is a good read, I highly suggest people click the link. Although it is short enough that I think you could have just posted it into your comment.
Yes, but then I couldn’t harvest all your sweet data.
Kidding! It’s a static site on my personal server that doesn’t load anything but the content itself. It’s mostly just a PITA to reformat it all for mobile.
I agree to a point, but think that depending on how things are structured on the platform side they can have some responsibility.
Think of facebook. They have algorithms which make sure you see what they think you want to see. It doesn’t matter if that content is hateful and dangerous; they will push more of it onto a damaged person and stoke the fires, simply because they think it will make them more advertisement revenue.
They should be screening that content and making it less likely for anyone to see it, let alone damaged people. And I guarantee you they know which of their users are damaged people just from comment and search histories.
I’m not sure if reddit works this way; due to the upvote and downvote systems, it may be more so the users who decide the content you see. But reddit has communities which they can keep a closer eye on to prevent hateful and dangerous content from being shared.
The catch is whether the site knows that specific individual is being radicalized. If admins aren’t punishing the account regularly I wonder how difficult it will be to prove reddit/YT specifically pushed this guy.
This is the best summary I could come up with:
YouTube, Reddit and a body armor manufacturer were among the businesses that helped enable the gunman who killed 10 Black people in a racist attack at a Buffalo, New York, supermarket, according to a pair of lawsuits announced Wednesday.
The complementary lawsuits filed by Everytown Law in state court in Buffalo claim that the massacre at Tops supermarket in May 2022 was made possible by a host of companies and individuals, from tech giants to a local gun shop to the gunman’s parents.
The lawsuit claims Mean LLC manufactured an easily removable gun lock, offering a way to circumvent New York laws prohibiting assault weapons and large-capacity magazines.
YouTube, named with parent companies Alphabet Inc. and Google, is accused of contributing to the gunman’s radicalization and helping him acquire information to plan the attack.
“We aim to change the corporate and individual calculus so that every company and every parent recognizes they have a role to play in preventing future gun violence,” said Eric Tirschwell, executive director of Everytown Law.
Last month, victims’ relatives filed a lawsuit claiming tech and social media giants such as Facebook, Amazon and Google bear responsibility for radicalizing Gendron.
I’m a bot and I’m open source!
This is a step in the right direction. YouTube, and especially Reddit, have some of the worst moderation practices to date. People online in general are some of the poorest role models to be looking up to. They’ll encourage and stoke anyone to do anything: suicide, dumb harmful pranks, just anything.
I don’t agree about the local gun shop. The gun store owner couldn’t have known that any gun he’d sell would be used within moments to take innocent lives. The gunman’s parents? Maaaaaybe. I’d want a little insight into the upbringing, examined until it’s exhausted, before we judge there.
The gun store owner couldn’t have known that any gun he’d sell would be used within moments, to take innocent lives.
Hundreds, thousands of deaths due to gun violence committed right after the gun was bought would disagree with you
Most gun violence involves weapons that are less than 6 months old.
Did the gun owner tell gun purchasers to kill?
Did the gun start having voices of its own, telling people to kill?
You figure that out.
No, the gun owner pulled that trigger. Take away the gun and he can’t pull that trigger anymore.
It’s not that hard, and mental gymnastics won’t help your cause
What is a gun for again? Shooting and… c’mon. You know the answer.
You think very black and white. I’m not entertaining that.
You don’t know how I think; this is our first interaction ever. How can you know what I think? Why are you not able to tell me what guns are for? They are for killing, either hunting animals or shooting people.
Are there any other uses for guns besides that? What, target practice for funsies? Where you… shoot a silhouette of a human? (I’m sure some places just have a target instead, but what else would you be practicing shooting for?)
deleted by creator
Yup, definitely the websites fault. Nothing more to it.
FTFY.
"YouTube, Reddit and a body armour manufacturer were among the businesses that helped enable the gunman who killed 10
Blackpeople in a nracistattack at a Buffalo, New York, supermarket, according to a pair of lawsuits announced Wednesday.removed by mod
Fuck reddit, but that’s bs.