Washington Journal of Law, Technology & Arts

Abstract

Social media platforms, once simple message boards, have grown to colossal size. They are now a vital source of communication and connection, particularly for marginalized groups such as the LGBTQ+ community. Social media holds incredible sway over the news, political discourse, and entertainment that we consume, and the platforms we use can now shape conversations simply by allowing or disallowing (i.e., moderating) specific types of speech or content.

One indirect form of moderation is demonetization, by which content creators are denied advertising revenue from their hosted media. The consequence of improper demonetization is not just financial: demonetized content is also deprioritized and, in a sea of competing media, often overlooked or in some cases hidden entirely. This process effectively removes demonetized voices from the broader conversation, which is precisely what happened to a number of LGBTQ+ creators on YouTube starting in 2017. Those creators' voices were silenced, seemingly unintentionally, when an algorithm flagged their content as "adult" or "sexually suggestive." The creators lost followers and revenue, and YouTube, as a host of online content, faced no consequences for the error, thanks to the protections afforded it by Section 230 of the Communications Decency Act of 1996. Section 230 has been treated as a shield for online platforms, as well as a sword that enables those platforms to moderate content as they see fit (subject to certain restrictions).

Moderation is, in the broadest sense, necessary and important. However, modern platforms, a far cry from the message boards of the late 1990s in practically every respect, must be held to greater account for how they undertake that moderation. This paper suggests a set of simple amendments to Section 230 that would allow content creators whose monetized content has been improperly flagged and demonetized to (a) have that content remonetized and (b) seek recourse in the form of fines levied against platforms that repeatedly misflag content conforming with their own stated policies. While this solution is less than ideal, it would place a greater onus on the platforms themselves while still protecting their right to moderate as they see fit.

First Page

172
