The ‘minimum age’ & other unintended consequences of COPPA

But they are far from the only reason why I’m skeptical of how much laws and regulations can protect social media users!

By Anne Collier

It’s tough to be the FTC – or anyone else trying to make rules for user-driven (social) media. It’s hard enough to make static rules address fast-changing technology. Then there’s the problem of changing understanding of consumers – the intended beneficiaries of the rules and the users of user-driven media – as we all adjust to having the data that represents so much of our everyday lives in a giant, amorphous, seemingly uncontrollable “place” called “the cloud.” But hardest of all is writing rules that govern a mash-up of business practices and human behavior updated in real time. How to regulate the business part of the equation without regulating the consumer-behavior part and without unintended consequences?

The answer, everybody has found, is that it can't be done. Many unintended consequences of COPPA are already well-established. One of them is what has, in effect, amazingly become the minimum age of social networking worldwide, including for many sites not based in the US: 13, an age the FTC associated with the collection and disclosure of kids' personal information, not their social development. That created other unintended consequences, including millions of kids lying about their age in order to set up accounts on social sites (see this), the undermining of parents' ability to decide the appropriate age for their own kids to start using social media, and a large proportion of US parents helping their "underage" children sign up (a link to the study here).

And there are non-family-related unintended consequences of the law, in its latest proposed iteration, my ConnectSafely co-director Larry Magid wrote, such as discouraging “companies from offering services to people under 13 or even allowing pre-teens to use services that could benefit them” and making it harder for small or young companies serving children to enter the market. Those consequences can put a chill on innovation and limit options for kids.

FTC’s apparent restraint

The FTC certainly can't undo all those unintended consequences, so it seems to have decided to minimize the creation of a lot more by sticking to closing loopholes in the 2000 rule, which predated social media, and catching it up with the technology. "To keep up with modern times, the FTC wants to revise COPPA rules so that they apply to third-party advertising networks along with app and plug-in developers and to expand the definition of 'personal information'," Larry wrote. That definition now addresses the use of "cookies" that track people from site to site – as the FTC puts it, "persistent identifiers and screen or user names other than where they are used to support internal operations, and website or online service directed to children to include additional indicia that a site or service may be targeted to children."

The revisions, “which the commission intends to complete by the end of the year, are the result of a review begun in 2010,” the New York Times reports. The FTC welcomes public comments on them, and public comments are just that – they don’t have to be formal or scholarly. They can be submitted until Sept. 10 (here’s the FTC’s form).

Self-regulation (personal, corporate) essential too

My own comment here is that our current global media environment calls for healthy skepticism toward rules and regulations at every level, from school to national to international – and not only because of the unintended consequences they can create. Just by the nature of social media, regulation can’t protect us the way it could in earlier media eras. That’s because, in a user-driven media environment, users have so much impact on their own and each other’s privacy, as well as that of the environment (the same for safety, of course).

Privacy and safety are a distributed and shared responsibility – of users (of all ages) toward themselves and each other, together with service providers, parents, schools, law enforcement, and authorities. They simply can’t be entirely ensured by laws or regulations. Laws can only help – if they’re written with an understanding of this new media reality, and that’s a big “if.” It looks to me like the FTC has shown some intelligent restraint.

Related links

* “What Do Kids Know about Online Privacy? More Than You Think”: KQED’s Mind/Shift on research from the Harvard School of Education published last March in the journal Learning, Media & Technology
* “Juvenile cyber-delinquency: Laws that are turning kids into criminals”
* How what I call “anti-social media companies” create their own obsolescence and why pro-social is simply good business (posted last December)
* About “citizen regulators” in “SOPA & citizenship in a digital age” (January 2012)
* “Social Web privacy: A new kind of social contract we’re all signed onto” (April 2010) and, the following month, some context for the Formspring story that went national then
* “Youth privacy study: Should focus be only on parents’ views?” (October 2010)
* “A new book & fresh look at online privacy” (July 2011)
* About the youth-safety implications of what our task force report called a “living Internet” (June 2010)
