Washington Journal of Law, Technology & Arts

Abstract

This Article identifies and describes three data privacy policy developments from recent legislative sessions that may seem unrelated but that, I contend, together offer clues about privacy law's future over the short to medium term.

The first is the proliferation, worldwide and in U.S. states, of legislative proposals and statutes referred to as “age-appropriate design codes.” Originating in the United Kingdom, age-appropriate design codes typically apply to online services “directed to children” and subject such services to transparency, default settings, and other requirements. Chief among them is an implied obligation to conduct ongoing assessments of whether a service could be deemed “directed to children” such that it triggers application of the codes.

The second development is a well-documented push for responsible artificial intelligence (“AI”) practices in the form of new transparency and accountability frameworks. The most comprehensive such framework is the European Union’s AI Act, although similar reforms in Canada, as well as nascent reforms here in the United States, address analogous topics. Among these are requirements for AI developers to assess, document, and, in some instances, report to regulators the existence of potential harms and plans to mitigate them prior to launching a new AI-driven product or service.

The third development, certain reforms to competition policy, is the least likely to be counted among traditional "privacy" laws. However, I argue that two recent reforms in Europe—the Digital Services Act and the Digital Markets Act—implicate data privacy concerns and should be viewed as imposing privacy-related compliance obligations. For instance, both frameworks address the use of personal data, including sensitive personal information, for online advertising purposes.

My argument is that common threads across these developments underscore the dynamism of privacy law at a critical moment in its evolution and highlight growing public awareness of both the benefits and the risks of a data-driven economy and society. To that end, I identify three specific trends among these developments that I anticipate will recur in data privacy policy proposals over privacy's "next act." First, legislators and regulators alike appear increasingly focused on age verification technologies as a mechanism for distinguishing among internet users and determining to whom online services must provide certain protections. Second, there is a growing appetite for shifting assessment obligations onto regulated entities, albeit with guidance, and for requiring that the results of such assessments be affirmatively disclosed to regulators. Third, privacy obligations are no longer confined to data privacy laws. They increasingly appear in other types of policy proposals, and detecting them will require a broader view of what constitutes a "privacy" law than is typical among privacy professionals.
