A U.S. Supreme Court case challenging the liability protection social media companies enjoy for hosting content published by others centers on the 2015 murder of a young woman in Paris by ISIS militants.
Gonzalez v. Google concerns Nohemi Gonzalez, a 23-year-old college exchange student killed in the attack, and asks whether Google, which owns YouTube, assisted ISIS in recruiting by promoting particular videos through YouTube’s algorithms. American law prohibits supporting terrorists, but Google argues that Section 230 of the Communications Decency Act shields it from liability for videos surfaced by its recommendation algorithms.
On Tuesday, the Supreme Court justices struggled with the arguments made in the case.
“This is a court. We actually don’t know anything about these things,” Justice Elena Kagan said. “These aren’t exactly the top nine online experts, you know.”
“Isn’t it better… to put the responsibility on Congress to amend that, and they may weigh the ramifications and make these predictive judgments?” Justice Brett Kavanaugh asked.
Justice Neil Gorsuch suggested that the U.S. Court of Appeals for the 9th Circuit, which had ruled in favor of Google, should revisit its assumption that algorithms are neutral.
Eric Schnapper, a law professor and the Gonzalez family’s attorney, argued that Section 230 of the Communications Decency Act distinguishes between claims that hold internet companies accountable for information created by others and claims that hold them accountable for their own acts.
In response, Lisa S. Blatt, Google’s attorney, asserted that “helping consumers identify the proverbial needle inside the haystack is an essential necessity on the internet. As a result, search engines customize what users see based on user information. As do innumerable video, news, music, job-finding, social media, and dating websites as well as Amazon, Tripadvisor, Wikipedia, Yelp, Zillow, and others.”
Justice Amy Coney Barrett asked whether websites like YouTube would be shielded if their filtering system was “truly libelous or pro-ISIS” rather than impartial. Blatt answered that Section 230 would protect the platforms even then.
Schnapper countered that features such as topic headings and “trending now” tags, which Blatt asserted were “inherent” in publishing, are not.
Citing Section 230, YouTube portrays itself as a public forum that is not liable for the content posted on its site. Section 230 also states that such forums should offer “a true diversity of political discourse.”
Ajit Pai, then chairman of the FCC, announced in mid-October 2020 that he intended to clarify what Section 230 means.
“The Commission has received a request from the U.S. Department of Commerce to ‘clarify uncertainties in Section 230.’ As political leaders decide whether to modify the legislation, the question that needs to be answered is what Section 230 now means,” Pai wrote. “Many make an unduly broad interpretation of Section 230 that, in some situations, shields social media corporations from consumer protection legislation in a way that is not supported by the text. Social media corporations are entitled to free speech under the First Amendment. However, they are not entitled under the First Amendment to a special exemption that newspapers and broadcasters are denied.”