The first is the proliferation, worldwide and in U.S. states, of legislative proposals and statutes referred to as “age-appropriate design codes.” Originating in the United Kingdom, age-appropriate design codes typically apply to online services “directed to children” and subject such services to transparency, default-settings, and other requirements. Chief among these requirements is an implied obligation to conduct ongoing assessments of whether a service could be deemed “directed to children” such that it triggers application of the codes.
The second development is a well-documented push for responsible artificial intelligence (“AI”) practices in the form of new transparency and accountability frameworks. The most comprehensive such framework is the European Union’s AI Act, although similar reforms in Canada, as well as nascent reforms here in the United States, address analogous topics. Among these are requirements for AI developers to assess, document, and, in some instances, report to regulators the existence of potential harms and plans to mitigate them prior to launching a new AI-driven product or service.
The third development, certain reforms to competition policy, is the least likely of the three to be counted among traditional “privacy” laws. However, I argue that two recent reforms in Europe—the Digital Services Act and the Digital Markets Act—implicate data privacy concerns and should be viewed as imposing privacy-related compliance obligations. For instance, both frameworks regulate the use of personal data, including sensitive personal information, for online advertising purposes.