Could new government rules jeopardize children’s privacy and safety?

By Larry Magid

The Federal Trade Commission’s proposed revision of the rules that implement the Children’s Online Privacy Protection Act (COPPA) is well-meaning and well-timed. It’s well-meaning in that the FTC truly does care about the privacy and safety of children, and it’s well-timed in that it’s the first revision since COPPA was implemented back in 1999, in the days of Web 1.0. Back then there was no Facebook and there were no smartphone apps. There were websites, including some that used information collected from children to send them marketing pitches, which is why Congress decided to clamp down with a law requiring verifiable parental consent before a site could collect personally identifiable information from children under 13.

Over the last several months, the FTC has been taking another look at COPPA, seeking to update it for modern times, and in August it issued a proposed set of new rules.

As I wrote at the time, the new rules recognize some of these changes, such as the role of apps and third-party plug-ins, and seek to add Internet protocol (IP) addresses to the list of “persistent identifiers,” which also includes the tracking cookies used to display ads based on a user’s behavior, in some cases on sites other than the one displaying the ad.

I also mentioned some unintended negative consequences of the rule changes, which prompted my co-director Anne Collier and me to submit an official comment for the FTC to consider as it finalizes its rule changes.

Unintended consequences & ConnectSafely’s FTC comment
Our comment, which we’ve posted, outlines our concerns about the unintended consequences for both children and the small businesses that create apps for mobile and social networking platforms. While we are sympathetic to the (mostly) small businesses that create apps, our primary concern is, of course, for children, and we believe that over-regulation could actually jeopardize children’s privacy and safety. I know that sounds a bit far-reaching, but consider that the proposed rules could, among other things, require parents to verify the identity of children who click on YouTube videos if those videos are embedded on sites aimed at children, a common practice.

Requiring that parents submit a child’s name and age to both the site operator and the source of the embed is a “remedy” that’s more harmful than the disease it seeks to cure. Currently, Google and virtually every other site on the Internet (including this one) gets the IP address of every visitor. In theory, it could be possible to identify the household or device of an individual in some cases, but in practice it is extremely difficult and generally requires a court order. If the FTC requires parents to verify their child’s identity, however, the very information the rule is attempting to protect will in fact be disclosed. That doesn’t protect children; it makes them more vulnerable.

Platforms and compliant apps
Another proposed rule change would hold both the platform developers (like Apple, Facebook and Google) and the developers who write apps for their ecosystems responsible for compliance with COPPA, without providing app developers additional resources to comply. It strikes us that if the FTC wants to extend its rules to app developers (a reasonable idea), it should enable platform developers to provide the verification for them, with the assurance that there are consequences for any app developer who fails to comply. We wrote:

“An updated COPPA rule should make it practical for platform operators such as Apple, Google and Facebook to enable parents to provide verifiable consent to the platform, which can then pass it on to app developers – most of whom are very small businesses with few if any resources for collecting consent on their own – with the understanding that the developers must adhere to COPPA guidelines or be subject to being kicked off the platform as well as to any potential civil or criminal consequences.”

Our comment also covers the chilling impact that the rules could have on small businesses that offer education and entertainment resources to children. We worry that the cost of compliance will discourage legitimate businesses from catering to children while having no impact on sleazy businesses that will continue to violate the privacy of children and other visitors. Of course kids will continue to seek content, but if well-meaning legitimate businesses fail to provide it, kids could be driven “underground” to sites — perhaps some operating from outside the United States — that are more than happy to “serve” and potentially exploit them. As my co-director Anne Collier put it in her recent post about COPPA, “The more government tries to regulate, requiring protections that by definition restrict children’s free expression, the less likely it is that children will stick around to ‘enjoy’ that safety. If they do go underground to sites, apps, games or other services that aren’t compliant – whether in the US or outside of it.”

COPPA could be “so 2012”

Finally, we worry that an overly specific set of COPPA rules could quickly render the legislation obsolete in the face of ever-changing technology. We do support a broad framework that requires sites to operate ethically, but we also feel there needs to be a constant, on-the-ground conversation involving all the stakeholders, including the sites and apps, parents and the kids themselves. As we said in our comment:

The last thing we want is for people to look back at COPPA revisions a few years from now and say, “that’s so 2012.”
