IoT Device Companies: The FTC is Monitoring Your COPPA Data Deletion Duties and More

Jun 11, 2018

By Jennifer Thompson


Recent Federal Trade Commission (FTC) activities with respect to the Children’s Online Privacy Protection Act (COPPA) demonstrate a continued interest in, and increased scrutiny of, companies subject to COPPA.  While the FTC has pursued companies for alleged violations of all facets of its COPPA Six Step Compliance Plan, most recently the FTC has focused on the obligation to promptly and securely delete all data collected once it is no longer needed.  Taken as a whole, this recent activity may indicate a desire on the FTC’s part to expand its regulatory reach.

First, as documented in “IoT Device Companies: Add COPPA to Your ‘To Do’ Lists,” the FTC issued guidance in June, 2017, that “Internet of Things” (“IoT”) companies selling devices used by children are subject to COPPA and may face increased scrutiny from the FTC with respect to their data collection practices.  While COPPA was originally written to apply to online service providers and websites, this guidance made clear that COPPA’s reach extends to device companies. In general, this action focused on step 1 of the Compliance Plan (general applicability of COPPA), while also providing some guidance on how companies could comply with step 4 (obtaining verifiable parental consent).

Then, in January, 2018, the FTC entered its first-ever settlement with an internet-connected device company resulting from alleged violations of COPPA and the FTC Act.  As discussed in “IoT Device Companies: COPPA Lessons Learned from Vtech’s FTC Settlement,” the FTC alleged violations by the device company of almost all the steps in the Compliance Plan, including failure to appropriately post privacy policies (step 2), failure to appropriately notify parents of the intended data collection activities prior to data collection (step 3), failure to verify parental consent (step 4) and failure to implement adequate security measures to protect the data collected (step 6).  The settlement was significant because it solidified the earlier guidance that COPPA governs device companies, in addition to websites and online application providers.

In April, 2018, the FTC further expanded its regulatory reach by sending warning letters alleging potential COPPA violations to two device/application companies located outside the United States.  Both companies collected precise geolocation data on children in connection with devices worn by the children.  The warning letters clarified that, although located outside the United States, the companies were deemed subject to COPPA because: a) their services were directed at children in the United States; and b) the companies knowingly collected data from children in the United States.  Interestingly, one of the targeted companies, Tinitell, Inc., was not even selling its devices at the time of the letter’s issuance.  Nonetheless, the FTC warned that because the Tinitell website indicated that the devices would work through September 2018: a) COPPA would continue to apply beyond the sale of the devices; and b) the company remained obligated to take reasonable measures to secure the data it had collected and would continue to collect.

Most recently, the FTC again took to its blog to remind companies that COPPA obligations pursuant to step 6 (implement reasonable procedures to protect the security of kids’ personal information) may extend even beyond the termination of the company’s relationship with the child.  Although “reasonable security measures” is a broad concept, the FTC homed in on the duty to delete data that is no longer required.

Section 312.10 of the COPPA Rule states that companies may keep personal information obtained from children under the age of 13 “for only as long as is reasonably necessary to fulfill the purpose for which the information was collected.”  Once the purpose for which the information was collected has been fulfilled, the information must be deleted using reasonable measures to protect against unauthorized access to, or use of, the information in connection with its deletion.

On May 31, 2018, the FTC published a blog post entitled “Under COPPA, data deletion isn’t just a good idea.  It’s the law.,” which reminds website and online service providers subject to COPPA (and, by extension, any device companies that market internet-connected devices to children) that there are situations in which COPPA requires them to delete the personal information they have collected from children, even if a parent does not specifically request the deletion.  This guidance establishes an affirmative duty on the collecting company to self-police and to securely discard the information as soon as it is no longer needed, even absent a customer request.

The blog post further suggests that all companies review their data retention policies to ensure that they adequately address the following questions:

  • What types of personal information are you collecting from children?
  • What is your stated purpose for collecting the information?
  • How long do you need to hold on to the information to fulfill the purpose for which it was initially collected? For example, do you still need information you collected a year ago?
  • Does the purpose for using the information end with an account deletion, subscription cancellation, or account inactivity?
  • When it’s time to delete information, are you doing it securely?
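The checklist above maps naturally onto an automated retention check. As a rough, hypothetical sketch (not legal advice; every name, field, and the one-year window below are invented for illustration, and an actual retention period must match the company’s stated purpose for collecting the data), a company might encode its deletion triggers like this:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Illustrative only: the one-year window is an invented stand-in for
# "how long do you need the information to fulfill its stated purpose?"
RETENTION_WINDOW = timedelta(days=365)

@dataclass
class ChildRecord:
    collected_at: datetime            # when the data was collected
    last_active: datetime             # last account activity
    account_deleted: bool = False
    subscription_cancelled: bool = False

def must_delete(record: ChildRecord, now: datetime) -> bool:
    """True once the stated purpose for holding the data has ended."""
    # Purpose ends with account deletion or subscription cancellation.
    if record.account_deleted or record.subscription_cancelled:
        return True
    # Purpose ends when the data outlives the retention window
    # ("do you still need information you collected a year ago?").
    if now - record.collected_at > RETENTION_WINDOW:
        return True
    # Purpose also ends with prolonged account inactivity.
    if now - record.last_active > RETENTION_WINDOW:
        return True
    return False
```

In practice, a scheduled job would run a check like this against stored records and securely delete any that come back true, rather than waiting for a parent’s request.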

It will be interesting to see whether the FTC continues to focus on COPPA in its enforcement actions. All told, the FTC has brought around thirty actions pursuant to COPPA.  But recent activity, like the warning letters to international companies and the recent guidance on data deletion, indicates that the FTC may be expanding the arena for COPPA applicability.



While you’ve been focused on CCPA Compliance Efforts, Elon has Been Developing Cyborgs


DHS Cybersecurity Arm Directs Executive Agencies to Develop Vulnerability Disclosure Policies

On November 27, 2019, the Cybersecurity and Infrastructure Security Agency (CISA) of the Department of Homeland Security (DHS) released for public comment a draft of Binding Operational Directive 20-01, Develop and Publish a Vulnerability Disclosure Policy (the “Directive”).

Open Internet Advocates Rejoice: Ninth Circuit Finds Web Scraping of Publicly Accessible Data Likely Does Not Violate CFAA

The Ninth Circuit Court of Appeals recently handed open internet advocates a big win by upholding the right of a data analytics startup to use automated bots to scrape publicly available data.

The ABA Speaks on AI

By Jennifer Thompson | Earlier this week, the American Bar Association (“ABA”) House of Delegates, charged with developing policy for the ABA, approved Resolution 112 which urges lawyers and courts to reflect on their use (or non-use) of artificial intelligence (“AI”) in the practice of law, and to address the attendant ethical issues related to AI.

Is Anonymized Data Truly Safe From Re-Identification? Maybe not.

By Linda Henry | Across all industries, data collection is ubiquitous. One recent study estimates that over 2.5 quintillion bytes of data are created every day, and over 90% of the data in the world was generated over the last two years.

FTC Settlement Reminds IoT Companies to Employ Prudent Software Development Practices

By Linda Henry | Smart home products manufacturer D-Link Systems Inc. (D-Link) has reached a proposed settlement with the Federal Trade Commission after several years of litigation over D-Link’s security practices.

Beyond GDPR: How Brexit Affects Other Data Laws

By Dawn Ingley | Since the United Kingdom (UK) voted in June, 2016, to exit the European Union (i.e., “Brexit”), the question in many minds has been, “Whither GDPR?” After all, the UK was a substantial contributor to this legislation. The UK has offered assurances that that it intends to, in large part, harmonize its data protection laws with GDPR.

San Francisco Says The Eyes Don’t Have It: Setting Limits on Facial Recognition Technology

By Jennifer Thompson | On May 14, 2019, the San Francisco Board of Supervisors voted 8-1 to approve a proposal that will ban all city agencies, including law enforcement entities, from using facial recognition technologies in the performance of their duties.

NYC’s Task Force to Tackle Algorithmic Bias: A Study in Inertia

By Linda Henry | In December, 2017 the New York City Council passed Local Law 49, the first law in the country designed to address algorithmic bias and discrimination occurring as a result of algorithms used by City agencies.