In a presidency that has exacerbated extreme political polarization, Republicans and Democrats have found common ground in their calls for a review of Section 230 of the Communications Decency Act, but the agreement ends there. The 1996 federal law both protects most social media platforms from liability for the content posted by their users and empowers the platforms to moderate “objectionable” user-created content without the risk of being designated a publisher. In his Executive Order on Preventing Online Censorship, President Donald Trump has joined a group of conservative voices, including Senator Ted Cruz (R-TX) and Senator Josh Hawley (R-MO), who claim tech platforms should be regulated due to “selective censorship” that reflects political bias. Presidential candidate Joe Biden believes the law should be revoked entirely, and many Democrats have also called for revisions to Section 230 on the grounds that these platforms are not doing enough to combat disinformation and protect users. Whichever camp you belong to, it is evident that Section 230 demands review and revision. This is understood by no one better than victims of nonconsensual pornography, a crime that Mary Anne Franks, activist and professor at the University of Miami School of Law, defines as “sexually explicit images and video disclosed without consent and for no legitimate purpose.” Section 230 must be rewritten in a way that increases protection for victims of nonconsensual pornography, which is achieved by calling for more content moderation, not less.

Democratic legislators have made their critiques of Section 230 in the name of protecting user safety, asserting that platforms are not sufficiently utilizing their “protection for ‘Good Samaritan’ blocking and screening of offensive material.” These platforms, enjoying immunity from the content of their users, have allowed nonconsensual pornography, an invasion of sexual privacy often denoted by the media as “revenge porn,” to remain online despite victims’ requests for removal. Although most social media sites like Facebook prohibit nonconsensual pornography in their policies, their platforms can still be used to distribute intimate photos, as in the case of Marines United, a Facebook group of 30,000 individuals that shared hundreds of naked images of active female service members and veterans. Liberals recommending revisions to Section 230 would see platforms take a more proactive approach to user complaints about instances of nonconsensual pornography. Ultimately, this would encourage more content moderation and could potentially provide victims with civil recourse if platforms are notified and choose not to take action.

This issue is not immaterial; the Pew Research Center reports that 10 percent of women ages 18-29 and 12 percent of Americans ages 18-29 have had explicit pictures of themselves shared without their consent. This staggering number is only poised to increase as “sexting” (sending nude or partially nude images or videos through cell phones) has become a more common practice in young adult relationships, with nearly half of Americans ages 18-26 having sent sexually explicit images of themselves to others. Furthermore, the adverse effects felt by victims of nonconsensual pornography can be wide-ranging and greatly destructive to their personal and professional lives as well as their mental health and physical wellbeing. Consider the case of Matthew Herrick, who was harassed for months after his ex shared nude images of him on the app Grindr and solicited over a thousand sexual encounters. In his lawsuit against the social media app, Herrick details having submitted over fifty complaints to no avail. The courts dismissed Herrick’s lawsuit due to the immunity granted to Grindr. A revision to Section 230 that brings about more content moderation under Good Samaritan blocking would aid in bringing justice to victims like Herrick.

President Trump’s calls for less content moderation will make it increasingly difficult for victims of nonconsensual pornography to have damaging content removed. His executive order creates a tension between demands for platforms to end content moderation and the cries from victims of nonconsensual pornography for platforms to remove damaging and private photos of them. Unfortunately, despite several attempts by Congresswoman Jackie Speier (D-CA), Congress has yet to create federal protection for victims of nonconsensual pornography. In 2016, Speier introduced the Intimate Privacy Protection Act (IPPA), which would have criminalized the distribution of sexually explicit images of a person with reckless disregard for the person’s lack of consent. Speier later reintroduced the bill in 2017 as the Ending Nonconsensual Online User Graphic Harassment (ENOUGH) Act. Most recently, Speier was joined by Senator Kamala Harris (D-CA) in 2019 to introduce the Stopping Harmful Image Exploitation and Limiting Distribution (SHIELD) Act. With Congress unable to pass federal legislation prohibiting nonconsensual pornography, revising Section 230 could be the best opportunity to finally aid victims who have had their sexual privacy violated.

Without federal legislation, some private tech firms have decided to use their Section 230 protections to support victims. Facebook has banned nonconsensual pornography in its terms of service, taken down images at users’ requests, and even created a technique called “hashing” to prevent reposting. Google has also moved to protect victims with its decision to remove nude photographs at users’ request. Speaking to the policy, Google finds it “similar to how we treat removal requests for other highly sensitive personal information, such as bank account numbers and signatures.” However, not all platforms that receive Section 230 protections have been as outspoken and supportive of victims. If a victim finds their image posted on a website that does not accept user requests to take down the picture, they are left with limited options. In order to protect citizens’ right to privacy, our legislators should call for tech platforms to employ their Good Samaritan liability shield more, not less.

President Trump’s Executive Order on Preventing Online Censorship is unlikely to pass constitutional muster for numerous reasons. To begin, the First Amendment protects private companies’ speech from government censorship, not vice versa. Furthermore, the executive order ignores judicial precedent; changes made to Section 230 would need to be driven through Congress. Addressing these concerns, the Center for Democracy and Technology (CDT) filed a lawsuit against the executive order on June 2nd. The internet advocacy group contends the order violates the First Amendment both in its retaliatory nature against Twitter and for “seek[ing] to curtail and chill the constitutionally protected speech of all online platforms and individuals — by demonstrating the willingness to use government authority to retaliate against those who criticize the government.” Twitter responded to the lawsuit by thanking the CDT for challenging the executive order, which it believes would undermine online speech. In defense of the executive order, Senator Cruz argued on Twitter that Section 230 was created by Congress to protect “neutral” platforms. Despite this assertion, Senator Ron Wyden (D-OR), who co-authored the law that created Section 230, states: “Section 230 is not about neutrality.”

While President Trump’s reasons for and methodology in criticizing Section 230 are flawed, the law is undoubtedly outdated. Calls to limit content moderation will only worsen the harm done to victims of nonconsensual pornography; Section 230 must be updated to encourage and require more content moderation in order to protect user safety. It is critical that President Trump consider those experiencing the most material harm from Section 230 as he guides the national discourse around the subject. Americans deserve the right to their sexual privacy, and companies have an obligation to ensure their platforms aren’t being used to promote the dissemination of damaging images.


Tyler Dullinger recently graduated Summa Cum Laude from the Wharton School with a B.S. in Economics, concentrating in Finance and Management. Tyler hopes to forge a career in public policy and codify Americans’ cyber civil rights into law.