Age-appropriate codes – well-meaning, but do they make good law? | Kelley Drye & Warren LLP

As we’ve discussed here, there’s bipartisan momentum in Congress to pass stronger privacy protections for children and teens – and specifically, tools that would allow minors and their parents to limit algorithms and online content that fuel self-harm and addictive behaviors. These efforts, reflected in several federal bills (see here and here) and now in a California bill as well, are based on months of testimony from a social media insider and are largely modeled after the UK’s Age Appropriate Design Code.

In his State of the Union address, the President added to that momentum, calling on Congress to pass stronger protections for children – a move that has been heralded in the media as a potential “game changer” for privacy that could “help break the stalemate in Congress.” (Relatedly, language in the report accompanying the recently signed budget bill directs the FTC to prioritize children’s privacy in its enforcement efforts.)

It is certainly understandable that US policymakers want to protect the privacy and safety of minors. It is also notable that they are focusing on an area where bipartisan action might be possible, and that they emphasize the safety aspects of these bills (as if the word “privacy” might hurt the effort while “safety” might garner more support). But, beyond the good intentions of protecting children, some of the concepts and language in these bills pose real problems of clarity and workability.

To focus on a few:

  • The best interests of the minor. The bills generally require companies to design and operate online services used by minors with the best interests of minors as the primary consideration.
    • This language raises real questions about implementation and enforcement. While the bills sometimes include factors to consider (e.g., types of harm to avoid) or authorize regulations or task forces to flesh out the standards, the language is highly subjective and will be difficult to interpret and apply.
    • For example, if a company demonstrates that it has made a good faith effort to develop policies to address this issue, will that be enough? Will companies be able to develop a uniform set of criteria that apply to all minors when these kinds of judgments are normally left to parents? Will policymakers or task forces really be able to flesh out the standards in ways the drafters of the bills apparently concluded they could not?
  • Avoiding “dark patterns” or “nudge” techniques. The bills typically state that companies must avoid designing interfaces or techniques that result in excessive use of an online service, or that encourage minors to provide more data, waive privacy protections, or engage in harmful behavior.
    • Some aspects of these standards will be easier to apply than others. For example, it seems clear that companies should not expressly encourage minors to provide more personal data or change their settings. Nor should they offer bold and attractive “yes” options for data collection and sharing alongside tiny or hidden “no” choices. And, of course, canceling a service shouldn’t be any harder than signing up.
    • But much of this is gray area. Is it a “dark pattern” to allow minors to win and advance in a game – a feature that, as parents well know, keeps kids playing? What about game interfaces with vivid images and rich graphical detail – a dominant feature of the most popular video games? Will they follow the path of Joe Camel (the ubiquitous cartoon character in tobacco ads whose run ended amid controversy and litigation in the late 1990s)? Is a portal used by children inherently problematic because it encourages minors to return again and again to access varied and changing content? And, particularly relevant to the concerns driving these efforts, will companies have to block content about bulimia, suicide, female circumcision, or sexual activity if that is precisely the information that young teens are seeking?
  • Likely to be accessed by a minor. Many of the bills’ provisions – including the best interests and dark pattern requirements, as well as provisions requiring parental controls and strong default settings – hinge on whether an online service is “likely to be accessed by a minor.”
    • This standard is vague and will be extremely difficult to apply. Unlike COPPA – which covers online services “directed to children” or circumstances where an online service has actual knowledge that a user is a child – this standard will require companies to anticipate access by minors even if they did not design their services for minors, and even if they have no actual knowledge that minors use them.
    • Although COPPA has been criticized as too narrow, this new standard could be entirely unworkable. While some companies are well aware that minors use their services, others are not. Will this approach inevitably lead to universal identification and age verification of all users of all online services? Given how easily minors can circumvent age gates, will even that be enough, or will companies need to put in place more comprehensive data collection and monitoring systems? And would these outcomes really advance user privacy?

Certainly, the concerns driving these efforts – the harmful effects of social media on minors – are serious. They are also bringing together members of different political parties, which is always a welcome development. However, as policymakers and stakeholders study these bills, they will likely (or hopefully) realize just how difficult implementation would be, sending them back to the drawing board for another try. Or perhaps they will ultimately conclude that comprehensive privacy legislation remains the better approach.
