When 2017 Becomes 1984: Facial Recognition Technologies Face a Growing Legal Landscape

Oct 5, 2017

By Dawn Ingley



Recently, Stanford University professor and researcher Michal Kosinski caused a stir of epic proportions and conjured up visions of George Orwell’s 1984 in the artificial intelligence (AI) community.  Kosinski posited that AI tools can now determine a person’s sexual orientation from nothing more than a photograph, and he has gone on to speculate that AI could also predict political ideology, IQ, and propensity for criminal behavior.  Indeed, using a complex algorithm, Kosinski pinpointed a male subject’s sexual orientation with better than 90% accuracy.  Technology routinely outpaces corresponding changes in the law, and the implications of this technology are alarming.  Could the LGBTQ community be targeted for violence or other discrimination based on this analysis?  Could “potential criminals” be turned away from gainful employment based on mere speculation about future behavior?  Could Facebook account photographs become an unintentional window into the most private facets of one’s life?  In a country already divided over sociopolitical issues, the answer to each of these questions unfortunately seems to be a matter not of if, but when.  The urgency of enacting laws and regulations to police AI’s rapidly proliferating potential intrusions cannot be overstated as the threat of a 1984 world becomes more of a reality.

Although Kosinski’s findings are recent, concerns over facial recognition technologies and biometric data are hardly novel.  In 2012, Ireland forced Facebook to disable its facial recognition software throughout Europe: the EU data privacy directive (and the upcoming GDPR) would have required explicit consent from Facebook users, and such consent was never requested or received from account owners.  Facebook was also required to delete all facial profiles it had collected in Europe.

In the United States, Illinois appears to be ground zero for the battle over facial recognition technologies, predominantly because it is one of the few states with a specific law on the books.  The Biometric Information Privacy Act, 740 ILCS 14 (“BIPA”), was initially a reaction to Pay by Touch, a technology available in the mid-2000s that connected biometric information (e.g., fingerprints) to credit card and other accounts.  A wave of privacy-geared litigation spelled doom for Pay by Touch and its handful of competitors.  With the increasing adoption of facial recognition software into popular technology platforms (such as the iPhone), BIPA is front and center once again.

The scope of BIPA includes “retina or iris scan, fingerprint, voiceprint or scan of hand or face geometry…”  Key provisions of BIPA are as follows:

  • Prohibits private entities from collecting, selling, leasing, trading, or otherwise profiting from biometric data without express written consent;
  • Requires private entities that collect biometric data to protect such data using a reasonable standard of care that is at least as protective as the methods used by the entity to protect other forms of confidential and sensitive information;
  • Requires private entities that collect such biometric data to comply with written policies “made available to the public, establishing a retention schedule and guidelines for permanently destroying biometric identifiers and biometric information,” on the earlier of: (i) when the initial purpose for collecting or obtaining such identifiers or information has been satisfied; or (ii) three years after the individual’s last interaction with the private entity.

One of the latest lawsuits filed under the law targets Shutterfly, which is accused of collecting facial, fingerprint, and iris scans without express written consent from website visitors, as well as from people who may be “tagged” in photos (even though they may never have used Shutterfly’s services or held a Shutterfly account).  In a similar lawsuit, filed against Facebook in 2015, users likewise alleged that Facebook was collecting and using biometric data without specific consent.

It is likely that Apple is monitoring these cases closely.  After all, the iPhone X, according to Apple’s website, uses facial recognition technology to unlock the phone, in essence a “facial password.”  Apple is quick to point out that it encrypts the mapping of users’ faces and that the data exists only on the physical device and nowhere else (i.e., not in the cloud).

It is also likely that lawmakers in the other two states with statutes similar to BIPA are keeping a watchful eye on the Illinois docket.  Texas and Washington both have biometric laws on the books along the lines of BIPA (though, unlike Illinois, neither provides a private right of action).  While residents of these states can take comfort in the legislative remedies available there, where does that leave residents of other states?  Given that the federal approach to privacy in the United States tends to be sector-specific (e.g., HIPAA for medical data; Gramm-Leach-Bliley for financial institutions), it seems clear that change must surface at the state level.  Until then, residents of states without legal protections are left with the following options:

  • ObscuraCam: an anti-facial-recognition app that, as its name suggests, obscures the facial features of the individuals photographed.
  • Opt out whenever possible: Facebook settings can be modified so that an account holder can both opt out of facial recognition technologies and delete any data already collected.  Users of Google+ must affirmatively opt in to facial recognition technologies.
  • Tangible solutions: 3-D printed glasses are sometimes effective in disrupting and/or scrambling facial features, thereby thwarting facial recognition technologies.

Realistically, however, until the law catches up with technology, the Orwellian threat is real.  As the saying goes, and as 1984 illustrates time and time again, “knowledge is power.”  And when knowledge gleaned from facial recognition technology falls into the wrong hands, “absolute power corrupts absolutely.”  For lawmakers, the time is yesterday (if not sooner) for laws to catch up with the breakneck pace of facial recognition technologies and the potential slippery slope of use cases.
