
A Change of Tune for Section 230 Reform

  • Writer: juliefarnam
  • Sep 16, 2025
  • 4 min read

On Meet the Press recently, Senator Lindsey Graham said something interesting: “Section 230 needs to be rebuilt.  If you’re mad at social media companies that radicalized our nation, you should be mad and you should be allowed to sue these people and they’re immune from lawsuit.”


This is interesting because it is a stark departure from the position the right has traditionally taken on Title 47 of the United States Code, Section 230.  The law says at §230(c)(1), “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”  Essentially, it shields social media companies from liability when speech posted on their platforms contributes to harm. 


The right has in the past advocated for Section 230 to be changed to limit what content social media companies could remove from their platforms, because they believed the platforms would act in a biased manner that would more adversely affect the right than the left.  To put it plainly, the right wanted to be able to post their racist, misogynistic, anti-LGBTQ+, militia content without being censored.  That is what they were fighting for up until last week, when they said they wanted Section 230 reform. 


But the right hasn’t been alone in advocating that Section 230 be changed.  The left, too, has been a vocal supporter of reforming this law, albeit for different reasons than the right.  The left wants this section of law to be reformed to allow social media companies to be held liable for content on their platforms.  The left has wanted social media companies to take a more active role in removing mis- and disinformation, and violent and hateful content.  They believe that changing or revoking Section 230, or parts of the law, would force social media companies to be more vigilant in removing inappropriate content if they risked the possibility of being held legally liable for the content they host. 


There have been two recent Supreme Court cases, Twitter v. Taamneh and Gonzalez v. Google, that considered the issue of what, if any, legal responsibility social media platforms have for the content on their platforms. 


In Twitter v. Taamneh, the family of Nawras Alassaf, who was killed in an ISIS terrorist attack in Istanbul in 2017, sued multiple social media platforms, claiming the platforms knowingly allowed ISIS and other terrorist groups to use their services.  This, they said, indirectly aided terrorist groups and thus violated the Anti-Terrorism Act.  In a nutshell, the lawsuit accused Facebook, Twitter, and YouTube of aiding and abetting terrorism.  The Supreme Court unanimously held the social media platforms were not liable because the plaintiffs had not shown the companies knowingly and substantially assisted the terrorist attack. 


Similarly, in Gonzalez v. Google, the family of Nohemi Gonzalez, who was also killed in an ISIS attack, this time in Paris in 2015, sued Google and, more specifically, YouTube (which is owned by Google).  As in Twitter v. Taamneh, they claimed YouTube violated the Anti-Terrorism Act.  The subtle difference here is that Gonzalez focused on the platform’s algorithmic recommendations—namely, that they promoted terrorist content—and whether that created liability under the Anti-Terrorism Act.  In light of the Taamneh decision, the Supreme Court declined to issue a ruling on the merits and returned the case to the lower court.


Supporters of Section 230 on the right have, until very recently, believed judicial or legislative intervention to change or limit the scope of the law would create a “risk of over-censorship.”  This sentiment was shared in President Trump’s May 2020 Executive Order, in which he directed the Federal Communications Commission to engage in rulemaking to ensure “[Section 230] is not distorted to provide liability protection for online platforms that — far from acting in ‘good faith’ to remove objectionable content — instead engage in deceptive or pretextual actions (often contrary to their stated terms of service) to stifle viewpoints with which they disagree.”  The Executive Order advocated for platforms to be held liable when they removed content in a manner the administration deemed biased.  The President was advocating for reform of Section 230, but reform that would make it more difficult for social media platforms to enforce strict moderation policies resulting in content removal.


In light of the assassination of Charlie Kirk, which I wrote about last week, the right has now changed its tune when it comes to Section 230.  All of a sudden, they want social media platforms to be held liable for promoting violence.  But don’t mistake this for a kumbaya moment between the Democrats and Republicans where they may actually agree on something.  Make no mistake, this is only an interest of the right now because they are concerned about those on the far left posting violent or hateful content, and they want social media companies to be held liable when someone on the left does something unlawful. 


It should go without saying, but unfortunately I have to say it: regardless of the underlying ideology, when someone promotes violence, destruction of property, or other unlawful acts online, that content should be removed.  Violence and hate shouldn’t have a safe haven online.  I will defend anyone’s right to free speech, but that doesn’t mean social media companies need to endorse that content by hosting it on their platforms.


Determining when a social media company should be held liable, if at all, for harmful content, and which online material would rise to the level of liability, is for the courts to decide, or for Congress to address by amending the statute.  But as we see a proliferation of social media platforms that knowingly host inciting, unlawful, and violent content, it is more difficult to argue these platforms are ignorant of what is being posted and oblivious to the damage it causes.  There must be accountability. 



