With every new technology comes the possibility of great good and great harm. Tim Berners-Lee, the inventor of the World Wide Web, was himself ambivalent about the technology that has come to pervade our lives. On the one hand, he was hopeful and optimistic about the “global village” the web could enable for the free and open exchange of information and opinion; on the other, he was concerned about the loss of control over our personal data, the spread of misinformation the internet potentially enabled, and the internet’s capacity for manipulation. While we may feel empowered to mitigate these dangers in our own lives, as parents we feel a heightened duty to protect our children, who are now growing up in an environment in which “imaginary friends” have been supplanted by cheap, portable digital devices providing 24/7 access to peers and a seemingly endless stream of information and entertainment.
The Children’s Online Privacy Protection Act (COPPA)
In the US, the government’s response to these concerns was the Children’s Online Privacy Protection Act (COPPA), which requires that websites and online services directed to children obtain parental consent before collecting, using, or disclosing personal information from children under the age of 13, mandates that site owners keep such information secure, and prohibits site owners from conditioning children’s participation on the collection of more data than is reasonably necessary for such participation. The act was implemented by the Federal Trade Commission via the Children’s Online Privacy Protection Rule, which was drafted specifically “to minimize the collection of personal information from children” and to create a “safe harbor” for legitimate web-based businesses through self-regulated certification.
TikTok and the Enforcement of COPPA
The recent FTC enforcement action filed against Musical.ly (now known as “TikTok”) illustrates the dilemmas that both parents and businesses face in complying with COPPA and the COPPA Rule. The TikTok app allowed users to create short lip-sync videos set to music and to share those videos with other users. To register for the app, users were required to provide an email address, phone number, username, first and last name, a short biography, and a profile picture. The app also allowed users to interact with other users by commenting on their videos and sending direct messages. Since 2014, more than 200 million users downloaded the TikTok app worldwide, and 65 million accounts were registered in the United States.
User accounts on TikTok were public by default, which meant that a child user’s profile bio, username, picture, and videos could be seen by other users, who apparently were not limited to children. While the app allowed underage users to change the default setting from public to private so that only approved users could follow them, even then their profile pictures and bios remained public and other users could still send them direct messages, according to the complaint. In fact, as the complaint notes, there were public reports of adults trying to inappropriately contact children via the TikTok app. In addition, until October 2016 the app included a feature that allowed users to view other users within a 50-mile radius of their location, further facilitating inappropriate contact by adults and heightening parents’ concerns.
The operators of the TikTok app were aware that a significant percentage of users were younger than 13 and received thousands of complaints from parents that their children under 13 had created TikTok accounts, according to the FTC’s complaint. “The operators of … TikTok knew many children were using the app but they still failed to seek parental consent before collecting names, email addresses, and other personal information from users under the age of 13,” said FTC Chairman Joe Simons in announcing the settlement, which included a $5.7 million civil penalty against TikTok.
While the complaint resulted in a consent decree rather than a court decision, the issues and results are instructive with respect to the application of COPPA and the COPPA Rule and the types of enforcement actions the FTC may institute in the future.
What Comes After COPPA?
Prospectively, as the internet continues to become more pervasive, we can expect to see more rather than less attention paid to protecting a child’s online presence. U.S. Sen. Edward Markey (the Massachusetts Democrat who originally authored COPPA) has been a leader in that regard, recurrently introducing amendments to COPPA in 2013 (S. 1700) and 2015 (S. 1563). His now pending legislation with Republican Sen. Josh Hawley would extend COPPA to “minors” (defined as users between 13 and 15); create an “eraser button” that allows parents to remove their child’s information; ban targeted marketing directed at children and minors; expand the current “actual knowledge” standard governing data collection to “constructive knowledge”; require new packaging for connected devices targeted to children (a “privacy dashboard” describing how personal information is collected, transmitted, retained, used, and protected); and create a new Youth Privacy and Marketing Division within the FTC to oversee marketing towards children and minors. Markey is also the sponsor of the Children and Media Research Advancement (CAMRA) Act, introduced in February, which would direct the National Institutes of Health to study the effects of technology and media on the cognitive, physical, and socio-emotional development of infants, children, and adolescents.
Envisioning the Future
As we use the internet and the apps it enables, we provide information about ourselves, sometimes actively and consensually, but often passively and without our knowledge. Most websites collecting this data claim it as their own proprietary information, and such data has become “currency” that online businesses use to reduce end-user costs and increase profits (sometimes by selling or sharing the information with other online businesses). While adults can rationally evaluate the tradeoff between proffering their personal information in return for reduced costs and convenience, children lack both the cognitive ability and the legal capacity to do so. The same can be said of the potential risks of online contact: adults are charged with deciding who may contact them online and how, whereas for minors this has historically and legally been a parental responsibility. COPPA and the COPPA Rule codified this essential parental role with respect to voluntary or active disclosures of a child’s personal information and online contact, and recent FTC enforcement and continuing amendments addressing evolving technology have reinforced that role.
What COPPA and the COPPA Rule don’t protect against is the real customer of these types of online sites and apps. That customer is typically not the underage end user of the site or app (who may actually get free or discounted access), but the dark pool of “big data” consumers, to whom the sites and apps sell such information as an integral part of their business model, and who feed upon and profit from the historical internet usage patterns of our children. The end result is typically a site or app serving up an ad for a sugary cereal based upon my child’s innocent interest in “breakfast” (or, more nefariously, passively scanning the movie poster in my child’s bedroom with facial pattern recognition software to serve up an ad for the movie’s new sequel). Such technological “interest mining” of our children sets the stage for inappropriately spying upon and influencing an entire generation.
--------------
Kelly Frey is a partner in Nelson Mullins Riley & Scarborough LLP’s Nashville office where he assists clients in corporate M&A, complex corporate transactions, and technology/information technology transactions (including licensing, outsourcing, and vendor relationship & management). Mr. Frey also represents indie film production & media companies and has served as Executive Producer for several feature films.