IoT Device Companies: Add COPPA to Your “To Do” Lists

Jul 5, 2017

By Jennifer Thompson



Last week, the Federal Trade Commission (FTC) updated its guidance on the Children’s Online Privacy Protection Act (COPPA).  COPPA and the FTC’s implementing COPPA Rule establish requirements for protecting children under the age of 13 as they access the internet.  The recent updates make it apparent that companies expanding their business offerings and product portfolios must also ensure they adequately protect children who may use those products and offerings.  Specifically, the FTC identified: 1) new business models that could cause a company to become subject to COPPA; 2) new products that are covered by COPPA; and 3) new methods for obtaining parental consent under COPPA.

New Business Models

In a June 21, 2017 business blog post on the FTC website entitled “FTC Updates COPPA Compliance Plan for Business,” the FTC noted that companies are using a variety of new business models to collect personal information, pointing in particular to voice-activated devices.  Although the FTC did not revise the COPPA Rule to include specific language on emerging data collection models, the guidance suggests that these models could affect a company’s obligations under COPPA, so businesses adopting new methods of collecting personal data would be wise to assess their compliance.

New Products Covered by COPPA

Companies developing “Internet of Things” devices, as well as other devices that use geolocation and/or voice recognition technologies, may need to ensure that their products comply with COPPA even if the devices are not specifically marketed to children.  COPPA by its terms applies to any “website or online service” that collects personal information from children under 13.  The FTC’s Six Step Compliance Plan for businesses (the “Compliance Plan”) lists what constitutes a “website or online service,” making clear that COPPA reaches more than just websites and apps.  In the most recent update to the Compliance Plan, the FTC added “connected toys or other Internet of Things devices” to that list, language that covers not only toys but also devices using voice recognition, geolocation services and other personal information.  The breadth of the language is notable: “other Internet of Things devices” suggests the FTC may apply COPPA to IoT devices beyond toys, and beyond those marketed directly to children, if it is likely they could collect personal information from children under 13.

New Methods for Obtaining Parental Consent

Of course, understanding whether your company or client is subject to COPPA is just one part of the equation.  Companies subject to COPPA must obtain parental consent before collecting the personal information of children under 13.  Under the FTC’s COPPA Rule, the method of obtaining consent must be “reasonably calculated, in light of available technology, to ensure that the person providing consent is the child’s parent.”  In its recent guidance, the FTC expanded the acceptable methods for obtaining such consent by approving two new ones: knowledge-based authentication (KBA) and facial recognition technology.

KBA is acceptable if the user answers “a series of knowledge-based challenge questions that would be difficult for someone other than the parent to answer.”  In approving this method, the FTC noted that KBA is widely and successfully used in the financial services industry to authenticate users, and it can therefore establish parental consent under COPPA, provided that: 1) the questions are dynamic, multiple-choice questions with sufficient variety and quantity that a child is unlikely to guess the answers; and 2) the questions are difficult enough that a child could not answer them correctly.
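For illustration only, the rough Python sketch below shows how those two conditions might be modeled.  The question bank, function names (start_kba_session, verify_kba) and pass threshold are hypothetical placeholders, not an FTC-approved design; a real KBA service would draw dynamic questions from out-of-wallet data sources rather than a hard-coded list.

```python
# Illustrative sketch only -- hypothetical question bank and session logic,
# meant to show "dynamic, varied questions" plus a strict pass threshold.
import random

QUESTION_BANK = [
    # Each entry: (prompt, list of choices, index of correct answer)
    ("Which of these streets have you previously lived on?",
     ["Elm St", "Oak Ave", "Birch Rd", "None of the above"], 1),
    ("Which lender holds (or held) your auto loan?",
     ["Acme Bank", "First National", "Credit Union One", "None of the above"], 3),
    ("In which county was your first home purchased?",
     ["Fulton", "DeKalb", "Cobb", "None of the above"], 0),
]

def start_kba_session(num_questions: int = 3):
    """Select a random (dynamic) subset of challenge questions for this session."""
    return random.sample(QUESTION_BANK, k=num_questions)

def verify_kba(session, answers, required_correct: int = 3) -> bool:
    """Treat consent as verified only if enough questions are answered correctly."""
    correct = sum(1 for (_, _, right), given in zip(session, answers) if given == right)
    return correct >= required_correct
```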

The FTC will also accept facial recognition technology as a means of establishing parental consent if the company follows a two-step process to verify the identity of the person providing consent.  First, the technology authenticates a photo ID of the parent (such as a driver’s license or passport); second, it compares that authenticated ID to another picture submitted by the parent giving consent (presumably taken with the device being used to access the COPPA-covered website or online service).  In approving the method, the FTC noted that the technology must be capable of authenticating the initial ID (by analyzing fonts, holograms and the like) and of confirming that the second image is a live picture of the person providing consent, not a photograph of another photo of the parent.  If the technology does not verify a match, the enrollment is rejected.  Once the technology verifies that the two faces match, trained personnel at the company also review the two photos to confirm the match, and the identification information must then be deleted promptly (within five minutes).
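Again for illustration only, the sketch below maps those steps onto a simple control flow in Python.  The callables (authenticate_photo_id, is_live_capture, faces_match, queue_for_human_review) are hypothetical stand-ins for whatever ID-authentication and face-matching service a company actually uses; nothing here reflects a specific vendor API or an FTC-endorsed implementation.

```python
# Illustrative sketch only: the verification services are injected as callables
# because the actual checks would be performed by a third-party provider.
from datetime import timedelta
from typing import Callable

DELETE_WITHIN = timedelta(minutes=5)  # delete identification data promptly after review

def verify_parental_consent(
    photo_id_image: bytes,
    selfie_image: bytes,
    authenticate_photo_id: Callable[[bytes], bool],   # checks fonts, holograms, etc.
    is_live_capture: Callable[[bytes], bool],         # rejects a photo of another photo
    faces_match: Callable[[bytes, bytes], bool],      # compares ID portrait to selfie
    queue_for_human_review: Callable[..., None],      # trained reviewer confirms the match
) -> str:
    # Step 1: establish that the submitted government-issued ID is genuine.
    if not authenticate_photo_id(photo_id_image):
        return "enrollment rejected: photo ID could not be authenticated"
    # Step 2: confirm the second image is a live capture of the same person.
    if not (is_live_capture(selfie_image) and faces_match(photo_id_image, selfie_image)):
        return "enrollment rejected: submitted image does not match the authenticated ID"
    # Automated match succeeded: hand off for human review, then prompt deletion.
    queue_for_human_review(photo_id_image, selfie_image, delete_after=DELETE_WITHIN)
    return "pending human review; identification data deleted within 5 minutes"
```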

Conclusion

When counseling clients on new business ventures, attorneys must keep in mind the reach of COPPA, as well as other, more obvious regulations.  The FTC noted that recent developments in the marketplace necessitated updates to its COPPA guidance to identify new business practices and products that are subject to COPPA, as well as new ways for companies to establish the requisite parental consent.  Moreover, the guidance suggests the FTC will take an expansive view of the methods, products and practices covered under COPPA, so companies developing new IoT devices would be wise to anticipate that COPPA may apply to them.
