The future of speech on the Internet is in the hands of the US Supreme Court.
On Tuesday and Wednesday, the Supreme Court is scheduled to hear oral arguments in two high-profile cases involving Google, Twitter and Facebook that could reshape how people use the internet and what they can post online. Both cases stem from lawsuits brought by relatives of people killed in terrorist attacks, who claim that social media companies are responsible for harmful content that appears on their platforms.
At issue is whether these online platforms should be held legally responsible for content generated by their users but promoted by corporate algorithms. Technology companies have successfully fended off these types of lawsuits because of the protection they receive under a 27-year-old federal law.
But lawmakers on both sides of the aisle, including US President Joe Biden, want to make changes to what is known as Section 230 amid growing concerns that technology companies aren’t doing enough to protect user safety. Technology companies say removing this legal shield could harm freedom of expression because they would be exposed to more lawsuits.
Technology platforms give people the ability to talk to others online, and that could go away depending on what the Supreme Court decides, said Eric Goldman, a professor at Santa Clara University School of Law who wrote a brief in support of Section 230 protection. The outcome affects all of us, he added, and companies could respond by limiting who can post on their platforms or getting rid of user-generated content altogether.
Here’s what you need to know about this high-stakes battle over online speech:
What is Section 230?
Section 230 is part of the Communications Decency Act of 1996, which protects platforms, including Google, Meta’s Facebook, Twitter and other services, from certain lawsuits over user-generated posts. It also allows these platforms to take action against offensive content.
The provision states that no provider or user of an “interactive computer service” shall be treated as the publisher of third-party content.
The co-authors of Section 230 — U.S. Sen. Ron Wyden, D-Oregon, and former Rep. Chris Cox, R-Calif. — told the Supreme Court in a brief that Congress created it to “protect the ability of Internet platforms to publish and deliver user-generated content in real time, and to encourage them to screen for and remove illegal or offensive content.” Even then, online services were facing lawsuits over user content. In 1995, for example, the New York Supreme Court ruled that online message board operator Prodigy Services could be held liable for allegedly defamatory content posted on its service.
Section 230’s protections don’t extend to content that violates criminal, intellectual property, state, communications privacy or sex trafficking laws.
Why should I care?
Section 230 was designed to encourage freedom of expression on the Internet. But a Supreme Court ruling on the issue could change how you use the Internet and what you can post online. If online platforms are worried about facing more lawsuits, they could change how they moderate content, possibly increasing scrutiny of what you can say.
“Without the protections of Section 230, many online intermediaries will heavily filter and censor user speech, while others may not host user content at all,” the Electronic Frontier Foundation said in a blog post about the topic.
Which cases is the Supreme Court hearing?
The Supreme Court is hearing two cases involving online speech: Gonzalez v. Google and Twitter v. Taamneh.
Gonzalez v. Google, which will be heard on Tuesday, examines whether Section 230 protects online platforms, including social networks, from lawsuits when they recommend third-party content. The case stems from a lawsuit brought by the family of Nohemi Gonzalez, a 23-year-old American student who was killed in the 2015 terrorist attacks in Paris. The family claims that Google-owned YouTube aided ISIS terrorists because the video-sharing platform allowed them to post videos inciting violence and recruiting supporters. The lawsuit also accuses YouTube of recommending ISIS videos to users.
A district court and the US Court of Appeals for the Ninth Circuit ruled in favor of Google, dismissing the Gonzalez family’s claims.
In Twitter v. Taamneh, which will be heard Wednesday, the Supreme Court will consider whether people can sue online platforms for aiding and abetting an act of terrorism. The case relates to the 2017 death of Nawras Al-Assaf, a Jordanian citizen who was fatally shot in an Istanbul nightclub during a mass shooting. ISIS claimed responsibility for the attack. Al-Assaf’s relatives sued Twitter, Google and Facebook, alleging that the platforms are liable under anti-terrorism law for aiding and abetting terrorism because the companies didn’t do enough to combat such harmful content.
A district court dismissed the claims in the suit, but the United States Court of Appeals for the Ninth Circuit reversed the decision.
How have tech companies responded?
The Supreme Court’s decision on Section 230 could “radically change the way Americans use the Internet,” Google said in a filing about the case.
If a platform can be sued for the content it recommends, consumers may have a more difficult time finding content they want to watch. The tech giant also says removing Section 230 protections would make the internet less safe, hurt both online platforms large and small, and cause websites to restrict more content or close some services due to legal risks.
“Congress has been clear that Section 230 protects the ability of online services to regulate content,” Google General Counsel Halimah DeLaine Prado said in a statement. “The erosion of this protection would fundamentally change how the Internet works, making it less open, less secure, and less useful.”
Other tech companies, including Reddit, Yelp, Microsoft and Meta, have also defended Section 230 protections in submissions to the court.
Jennifer Newstead, Meta’s chief legal officer, also defended Section 230 in a blog post about the topic.
In a brief, Reddit said its users could become more cautious about volunteering to moderate content on its platform or recommending content through actions such as “upvoting” because of the legal risks.
In Twitter v. Taamneh, Twitter says it did not aid and abet a terrorist act because the company didn’t intend to aid terrorists, has rules against posting terrorist content and had no connection to the terrorist attack in Turkey. Facebook and Google-owned YouTube backed Twitter in a brief, arguing that the appeals court’s ruling on the anti-terrorism law is “incorrect” and could lead to more lawsuits against any provider of goods or services, such as airlines, financial services providers and pharmaceutical businesses, whose offerings are abused by terrorists.
Twitter, which no longer has a communications department, did not respond to a request for comment.
What do US legislators think about this?
Democrats and Republicans, perhaps surprisingly, agree that reforms to Section 230 are needed. But their motivations stand in stark contrast to one another.
Republicans accuse Big Tech of suppressing conservative voices, with US House Judiciary Committee Chairman Jim Jordan last week issuing subpoenas to the CEOs of Google parent Alphabet, Amazon, Apple, Meta and Microsoft.
Democrats argue that Section 230 prevents social media companies from being held accountable for failing to moderate hate speech, misinformation and other offensive content.
Biden reiterated his call for Section 230 reform in an editorial published in The Wall Street Journal in January.
What happens next?
The Supreme Court is expected to decide the cases this year. The court has also been asked to review other cases involving online speech. In January, it held off on saying whether it would hear challenges to controversial laws passed in Texas and Florida that restrict how social media companies can moderate content.