Pangyrus presents another essay in our political column, The Sounding Board. What’s The Sounding Board? It’s politics writ large—and writ well. The makeup of society, the ways we interact, how cities get built, wars started, families reunited, poverty worsened or alleviated. This column will always challenge, always engage—and always be open to new perspectives.
_______________________
A comfortable fall evening in Paris. A gentle breeze swung the reddening leaves against a backdrop of stars. It was the perfect setting for a night in town. The handwritten menu on the wall offered espressos and fine teas under a yellow canopy that sheltered the sidewalk. Life was good, and for the American exchange student relaxing on the terrace with her friends, the bustling bistro made for the perfect Parisian memory to bring home after a semester abroad.
Across town, three men in a small rental car dashed through the streets and came to a screeching halt behind another vehicle. One of the occupants jumped out, shot and killed the driver of the stopped car, and proceeded to discharge his automatic weapon at people at the bar across the street. Oblivious to the wailing all around him, the man got back into the car, then drove across the street and attacked a second restaurant. Thirteen people were killed. Then back in the car again, and a few blocks further, they arrived at the bustling corner with that inviting terrace under the yellow canopy. Eight injured, five killed. Among the dead was Nohemi Gonzalez, a 23-year-old industrial design senior at California State University, finishing her semester abroad at the Strate School of Design in Paris. Before the night was over, one hundred and thirty people were dead and four hundred and sixteen were injured. The victims came from nineteen countries. Nohemi was the only American. The next day, November 14, 2015, ISIS claimed responsibility for the crimes by issuing a written statement and releasing a video on YouTube.
***
Ideologies, prejudices, and religious or political beliefs can radicalize individuals to the point that they would give their lives and destroy others’ lives for a cause. Extremist groups can offer a sense of purpose, community, and identity. Someone struggling in life can become prey to psychological manipulation via exposure to violent material and other inappropriate information.
Given that such groups recruit and organize on online platforms, what role does social media play, and what are these companies responsible for?
These were the questions at the heart of Gonzalez v. Google, the lawsuit brought by Nohemi’s family against YouTube for allegedly aiding and abetting ISIS in using its platform “to recruit members, plan terrorist attacks, issue terrorist threats, instill fear, and intimidate civilian populations.” The complaint stated that because YouTube (owned by Google) uses algorithms that recommend content based on a user’s viewing history, it assisted ISIS in spreading its message. The Gonzalez case posited that the platform was liable for failing to take meaningful or aggressive action to prevent terrorists from using its services and reaching potential recruits.
Google defended itself by relying on Section 230 of the Communications Decency Act, passed in 1996, which provides internet platforms with a broad liability shield and immunity from third-party, user-generated content. The lower court ruled in favor of Google, the appellate court upheld the decision, and the family appealed to the Supreme Court, which agreed to review the case.
Imagine abolishing this immunity. It would turn the internet companies’ business models upside down. Platforms would become legally liable for material posted by users, leading them to moderate heavily or remove such content, changing the nature of what is hosted on a platform and how users interact with it. This would likely be unaffordable for all but the largest firms.
***
More than seven years after the attack, on Tuesday, February 21, 2023, Nohemi’s mother and stepfather stand on the steps of the Supreme Court in Washington, D.C., in dark clothes that contrast with the white marble of the building. They wear somber looks, their gazes crossing in the distance. Inside, the attorneys for the claimant, the Solicitor General of the United States, and the attorneys for the defendant, Google, get ready to plead their case in front of the nine justices.
During almost three hours of argument, the justices on both sides, liberal and conservative, repeatedly acknowledge how consequential a ruling would be. As the court grapples with the arguments, Justice Elena Kagan finally puts it into words: “Every other industry has to internalize the costs of its conduct. Why is it the tech industry gets a pass?” She also concedes: “We’re a court, we really don’t know these things. These are not, like, the nine greatest experts on the internet. Isn’t that something for the Congress to do, not the Court?”
***
Technology and the marketplace have changed since the 90s, when Section 230 was written, and so must our expectations about the role and obligations of online platforms. Social media, artificial intelligence, and ubiquitous mobile devices are part of our lives from the moment we wake up each day. They help us navigate traffic, poll friends for a convenient time to meet, and automate mundane tasks, and, let’s not forget, they place an almost infinite amount of knowledge at our fingertips.
But not all of that knowledge is worthwhile. Algorithmic recommendations accelerate the dissemination of false or nefarious information. We need to reclaim the technology that surrounds us from bad actors who exploit it for their own ends. Given the speed at which tech evolves, we must continue to revisit the balance among free speech, liability protection, the cost of content moderation, and the minimization of harmful content and disinformation.
***
On Thursday, May 18, 2023, the Supreme Court issued a three-page, unsigned opinion saying that the Section 230 issues raised by the case were not ready for a decision. “We therefore decline to address the application of §230 to a complaint that appears to state little, if any, plausible claim for relief,” the Court wrote, sending the suit back to the 9th Circuit Court of Appeals for further consideration. Advocates of free speech, including the ACLU, and large internet platforms celebrated the development as a victory. Yet I believe more discussion is still needed.
This decision, or lack thereof, directs the action back to Congress, where it belongs. The issues discussed above should be addressed through legislative action creating a new regulatory framework focused on transparency, fairness, auditability, and due process. Liability protection should be limited by extending the current carve-outs for federal crimes, sex trafficking, and intellectual property violations. The new construct should include provisions for civil rights violations, targeted harassment, incitement to violence, hate speech, and disinformation. At the same time, it is urgent to accelerate the debate in society over what constitutes acceptable content on major social media platforms.
The Supreme Court appropriately decided not to take on the liability limitations in the existing statutes. The Court correctly limited its action to interpreting existing law, which is clear in shielding internet platforms from liability for user-generated content.
Whether this protection is too broad should be debated by society. Congress, not the Supreme Court, is the democratic institution responsible for legislating and changing the laws that govern internet platforms. Importantly, the United States Senate has begun a series of hearings with industry, academia, and civil society to shape upcoming proposals on artificial intelligence and the ethical use of data. This approach, and the growing bipartisan interest in the topic, is encouraging and should be applied to governing internet platforms, too.
A new government body with a dedicated focus on effectively regulating social media companies is needed. Protecting consumers and enhancing competition can help maintain innovation and create conditions that allow people to continue to enjoy the benefits of technology.
Nothing will make up for the one hundred and thirty lives cut short by a group of extremists on that day in Paris. Nohemi is not going to graduate or pursue her dreams as a designer. But her life may still help move tech back onto its rails.
The court has spoken — it is now our turn.
A version of this article was originally published by Harvard’s Social Impact Review.
Image: photo by Marc Groth on Unsplash, licensed under CC 2.0.
- The Sounding Board: The Court Has Spoken - June 30, 2023