COPPA 2.0: A Ramble about Safety & Chat
Throughout my tenure working in online entertainment for kids, one of the largest influences on me has been COPPA. Not the COPPA that goes live in July (aka COPPA 2.0), but the original COPPA, enacted in 1998 (its implementing rule took effect in 2000).
RECAP! What is COPPA? COPPA, the Children’s Online Privacy Protection Act, is legislation that restricts the online collection of personally identifiable information (aka “PII”) from children under the age of 13 without parental consent (verifiable consent collected through FTC-approved measures only). It was enacted to stop marketers from collecting private information from susceptible children. This information is often collected during registration, in promotions, in profiles, and within the social features that are popular in just about every web experience (games, forums, chat, profiles, etc.).
In this business, I’ve had such a precarious relationship with this particular piece of legislation. You see, there is no legal mechanism out there to restrict, censor, or warn kids away from inappropriate content or interactions online. A few attempts were made in the 2000s, but none were deemed constitutional, for a variety of reasons (example: DOPA / freedom of speech). And no law can reasonably require every website to prepare for the possibility that a child might visit. Such a requirement would mean every website having the equivalent of an “age gate” in place, temporarily barring visitors from content until they share an appropriate date of birth. A word to the wise: barriers ruin business (it would be like flashing your ID every time you changed the channel on your TV).
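For the curious, here is a minimal sketch of the “age gate” logic described above. The threshold of 13 comes from COPPA itself; the function names and structure are my own illustration, not any particular site’s implementation:

```python
from datetime import date

COPPA_AGE_THRESHOLD = 13  # under 13: verifiable parental consent required

def years_old(dob: date, on: date) -> int:
    """Whole years elapsed between a date of birth and the date `on`."""
    had_birthday = (on.month, on.day) >= (dob.month, dob.day)
    return on.year - dob.year - (0 if had_birthday else 1)

def passes_age_gate(dob: date, on: date) -> bool:
    """True if a visitor born on `dob` is 13 or older on date `on`."""
    return years_old(dob, on) >= COPPA_AGE_THRESHOLD
```

In practice, gates like this are built to be “neutral” (no hints about which birthdate unlocks the site) and session-locked, so a blocked kid can’t simply hit the back button and try an older date.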
Now we have this new frontier where kids can be exposed to JUST about anything online, with little barrier or content censoring. Kind of unsettling. Thankfully, should they be so inclined, parents can track websites, and block content. Parents can assert some sort of power over the web destinations for the family. And, hallelujah, we also have family & kid-based web experiences!
In the early days, kid sites were primarily content driven with a strong marketing presence. Cheap Flash games, opportunities to sign up for newsletters, some story element, and downloadable content (like desktop widgets or coloring pages). That strong marketing presence was the prime factor behind the creation of COPPA. COPPA stopped marketers from taking advantage of kids (identity & privacy) in their own web playgrounds (well, maybe not “stopped,” but definitely slowed down).
For a decade, things like age gates, email plus, and verified consent programs seemed to work. Many kid sites were treated as quick/cheap extensions of repurposed content, where product was peddled more than an experience. A few sites were heavily moderated, due to some grey area within COPPA. As mentioned numerous times, COPPA was created to restrict businesses from collecting personally identifiable information from children. What was not clear was the art of the “collection.” When people connect, it’s only natural to feel the need to share pieces of identity (“Hi, my name is Izzy Neis, and I love cookies” seems innocent enough). Adult verification, content screening, strong filters, and gates were necessary to block PII collection (solicited or not).
And then the social BOOM hit the webosphere. Profile platforms, virtual worlds, and interactive opportunities went mainstream. Sharing, showing, and engaging became crucial to any digital experience. I’ve watched a plethora of web destinations grow over the last decade, and with each new year youth-centric websites have had to improve, innovate, and compete with mainstream adult/general-audience sites… and there’s the rub: competition. Kids are no dummies; if a digital experience for a kid is not as fun or interesting or engaging as what’s offered on a non-kid-targeted site, why visit? If all the fun activities are buried behind adult verification methods (thus requiring a credit card, a social security number, or a fax machine with a parent’s signature), is the site even worth exploring? And so begins the conundrum for kids.
Many of us operators with a social/moral conscience were put in an interesting position.
- No, most of us do NOT have the desire to collect personal data. We don’t need it.
- Yes, we do understand it is super difficult for kids to get verified consent from a parent to unlock social abilities online.
- No, we do NOT want to expose kids to creeps, inappropriate content, bullying, or other negative aspects of social interactions.
- Yes, we understand that screening a site, hiring moderation teams, and purchasing licenses to filtering tools are all very expensive endeavors.
That last bullet is extremely important. Running a child’s social website or online game can be very expensive, and it requires a lot of sensitive architecture to develop systems that are scalable for a business (p.s. we can help with that). This makes COPPA a very particular piece of legislation. As mentioned, there is no law that restricts behaviors or inappropriate content online. A Terms of Service usually designates such things, per game, as written by the company’s lawyers, and those Terms are managed by the business itself based on the idea of a “contract” between the user and the business. Again, there is no all-encompassing law. COPPA, however, does restrict the sharing of certain sensitive information: not just by restricting collection by a business (as the law was intended) but, by default, also by restricting kids from sharing data that puts them in the path of a creep or a bully or something much more dangerous (not as the law was intended).
And there you have the crux of my love/hate relationship with COPPA as it has been applied from 1998 to 2013: equal parts my hero (letting me force measures of protection into social digital experiences) and my enemy (raising barriers to kids’ engagement and creating cost-scaling issues for my department).
Many social features can be managed properly. Forums and blogs can be screened, and profile details can be pre-written/approved, but chat? To an end user, chat is time sensitive and immediate: a conversation on the fly in a room with other active participants. Like live TV, there’s no editing, and there are a lot of sensitive viewers. Chat is much more difficult to handle.
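One common industry answer to the live-chat problem is “whitelist chat,” where every word in a message must come from a pre-approved dictionary, so nothing unvetted can reach the room. A toy sketch, with a hypothetical word list standing in for the tens of thousands of entries a real product would carry:

```python
import string

# Hypothetical approved dictionary; real lists are far larger and curated.
APPROVED_WORDS = {"hi", "hello", "wanna", "play", "a", "game", "cool", "bye"}

def allow_message(message: str) -> bool:
    """Allow a message only if every word is on the approved list."""
    words = [w.strip(string.punctuation) for w in message.lower().split()]
    return all(w in APPROVED_WORDS for w in words if w)
```

The trade-off is exactly the competitiveness problem described above: the safer the dictionary, the more stilted the conversation feels to kids.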
I was able to stretch COPPA (like Stretch Armstrong™) to enforce appropriate behavior and content within games, backed by the necessary moderation & filtering, but I was under the burden of cost and the frustrations caused by so many barriers to entry or use. You would be very surprised at how difficult it is to get a parent involved with online parenting (or even just approvals).
On July 1, 2013, COPPA 2.0 will be officially “live” to the world. Over the course of the last couple of years, the FTC has listened to and engaged a variety of specialists. And although they covered a great many topics related to identity and privacy within COPPA (as we’ve covered previously in our blog), chat seems to be one of the more sensitive topics for operators. Many of us (myself included) spoke candidly about these two particular pain points:
- How very difficult it is to obtain verifiable consent from an adult (“parent”) in order to unlock chat capabilities. Verifying a credit card or a social security number to enable free chat in a free game is frightening for parents, and that’s if you’re even able to get the parent’s attention in the first place.
- How nearly impossible it is to block certain kinds of unsolicited PII in everyday chat between users. Aside from the growing popularity of everyday nouns and adjectives doubling as first or last names (à la Chase, America, Nice, Straight, or Ford), it’s virtually impossible to catch every phonetic spelling or creative phrase construction kids use to tell other kids about themselves.
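To make that second pain point concrete, here is a naive filter for “obvious” PII. The patterns are my own illustrative sketch, not any production ruleset; notice that a spelled-out phone number matches none of them, which is exactly the gap described above:

```python
import re

# Hypothetical patterns for the most obvious PII shapes in chat.
PII_PATTERNS = [
    re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),  # US-style phone number
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),        # email address
    re.compile(r"\b\d+\s+\w+\s+(street|st|ave|avenue|road|rd)\b", re.I),  # street address
]

def contains_obvious_pii(message: str) -> bool:
    """True if any obvious PII pattern appears in the message."""
    return any(p.search(message) for p in PII_PATTERNS)
```

A kid typing “five five five one two three four” sails straight through, and no finite pattern list can close that hole; that is why filters alone were never enough and human moderation stayed expensive.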
After much discussion, it seems the FTC has recognized our plight. COPPA was never meant to block social interactions. It was never meant to limit peer engagement, or burn the digital kids business to the ground in unrealistic scaling costs.
From the FTC’s revised COPPA FAQ: [#59] If you only use the information internally, and do not disclose it to third parties or make it publicly available, then you may obtain parental consent through use of the Rule’s “email plus” mechanism, as outlined in FAQ 60 below. See 16 C.F.R. § 312.5(b)(2).
As always, we need to have the proper measures in place to ensure that data is non-collectible, even within chat, but we no longer have to assume a watchdog is sitting over our shoulders, waiting for the inevitable moment when clever kids circumvent our filters or moderators.
What do you think about these updates? Do you think we should still have verifiable consent blocking kids from chat, in order to protect them? Or, do you think the FTC and COPPA are missing the mark?
Director of Digital Engagement & Strategy
SUBSCRIBE TO OUR BLOG
Get a weekly roundup from the world of ModSquad.