When 2017 Becomes 1984: Facial Recognition Technologies Face a Growing Legal Landscape
Recently, Stanford University professor and researcher Michal Kosinski caused a stir of epic proportions and conjured up visions of George Orwell’s 1984 in the artificial intelligence (AI) community. Kosinski posited that several AI tools can now determine a person’s sexual orientation from nothing more than a photograph, and he has gone on to speculate that AI could also predict political ideology, IQ, and propensity for criminal behavior. Indeed, using a complex algorithm, Kosinski accurately pinpointed a male’s sexual orientation more than 90% of the time. Technology advances frequently outpace corresponding changes in the law, and the implications of this technology are alarming. Could the LGBTQ community be targeted for violence or other discrimination based on this analysis? Could “potential criminals” be turned away from gainful employment based on mere speculation about future behavior? Could Facebook account photographs become an unintentional window into the most private facets of one’s life? In a country already divided over sociopolitical issues, the question unfortunately seems to be not if, but when. The urgency of laws and regulations to police the exponential proliferation of AI’s potential intrusions cannot be overstated as the threat of a 1984 world becomes more of a reality.
Although Kosinski’s revelation is recent, concerns over facial recognition technologies and biometric data are hardly novel. In 2012, Ireland’s data protection authority forced Facebook to disable its facial recognition software throughout Europe: the EU Data Protection Directive (like the GDPR that will soon replace it) requires explicit consent for such processing, and Facebook never requested or received that consent from its account holders. Facebook was also required to delete all facial profiles it had collected in Europe.
In the United States, Illinois appears to be ground zero for the battle over facial recognition technologies, predominantly because it is one of the few states with a specific law on the books. The Biometric Information Privacy Act, 740 ILCS 14 (“BIPA”), was initially a reaction to Pay by Touch, a technology available in the mid-2000s that connected biometric information (e.g., fingerprints) to credit card and other accounts. A wave of privacy-geared litigation spelled doom for Pay by Touch and its handful of competitors. With the increasing adoption of facial recognition software into popular technology platforms (such as the iPhone), BIPA is front and center once again.
The scope of BIPA covers a “retina or iris scan, fingerprint, voiceprint or scan of hand or face geometry…” Key provisions of BIPA are as follows:
- Prohibits private entities from collecting, selling, leasing, trading or otherwise profiting from biometric data, without express written consent;
- Requires private entities that collect biometric data to protect such data using a reasonable standard of care that is at least as protective as the methods used by the entity to protect other forms of confidential and sensitive information;
- Requires private entities that collect such biometric data to comply with written policies “made available to the public, establishing a retention schedule and guidelines for permanently destroying biometric identifiers and biometric information,” on the earlier of: (i) when the initial purpose for collecting or obtaining such identifiers or information has been satisfied; or (ii) within 3 years of the individual’s last interaction with the private entity.
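The retention provision above amounts to a simple “earlier of two dates” rule. As a minimal sketch only (the function name and the flat 365-day year are illustrative assumptions, not statutory text), the destruction deadline could be computed as follows:

```python
from datetime import date, timedelta
from typing import Optional

# Illustrative sketch of BIPA's retention rule: destroy biometric data on
# the EARLIER of (i) the date the initial collection purpose is satisfied,
# or (ii) 3 years after the individual's last interaction with the entity.
# A flat 3 * 365 days is used here as an approximation of "3 years".
THREE_YEARS = timedelta(days=3 * 365)

def destruction_deadline(purpose_satisfied: Optional[date],
                         last_interaction: date) -> date:
    """Return the date by which biometric data must be destroyed."""
    statutory_limit = last_interaction + THREE_YEARS
    if purpose_satisfied is None:
        # Purpose not yet satisfied: only the 3-year outer limit applies.
        return statutory_limit
    # Otherwise, whichever date comes first controls.
    return min(purpose_satisfied, statutory_limit)

# Example: the collection purpose is satisfied before the 3-year limit runs,
# so the earlier (purpose-satisfied) date controls.
deadline = destruction_deadline(date(2018, 6, 1), date(2017, 1, 15))
```

This is a conceptual aid, not legal advice; an actual compliance program would also need the written public policy and consent mechanics the statute requires.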
One of the latest lawsuits filed pursuant to this law is against Shutterfly, which is accused of collecting facial, fingerprint and iris scans without express written consent from website visitors, as well as from people who may be “tagged” in photos (even though they may never have used Shutterfly’s services or held a Shutterfly account). In a similar lawsuit filed against Facebook in 2015, users likewise charged that Facebook was collecting and using biometric data without specific consent.
It is likely that Apple is monitoring these cases closely. After all, the iPhone X, according to Apple’s website, uses facial recognition technology to unlock the phone—in essence, a “facial password.” Apple is quick to point out that it encrypts the mapping of users’ faces and that such data exists only on the physical device and not elsewhere (i.e., not in the cloud).
It is also likely that lawmakers in the other two states with statutes similar to BIPA are keeping a watchful eye on the Illinois docket. Texas and Washington both have biometric laws on the books along the lines of BIPA (though unlike Illinois, neither Texas nor Washington provides a private cause of action). While residents of these states can take comfort in the legislative remedies available there, where does that leave residents of other states? Given that the federal approach to privacy in the United States generally tends to be sector-specific (e.g., HIPAA for medical data; Gramm-Leach-Bliley for financial institutions), it seems clear that change must surface at the state level. Until then, state residents without legal protections are left with the following options:
- ObscuraCam: an anti-facial recognition app that, as its name suggests, obscures the facial features of the individuals photographed.
- Opt out whenever possible: Facebook settings can be modified to let an account holder both opt out of facial recognition technologies and delete any data already collected. Users of Google+ must affirmatively opt in to facial recognition technologies.
- Tangible solutions: 3-D printed glasses are sometimes effective in disrupting and/or scrambling features, thereby thwarting facial recognition technologies.
However, realistically, until the law catches up with technology, the Orwellian threat is real. As the saying goes and as 1984 illustrates time and time again, “knowledge is power.” And when knowledge gleaned from facial recognition technology falls into the wrong hands, “absolute power corrupts absolutely.” For lawmakers, the time is yesterday (if not sooner) for laws to catch up with the breakneck pace of facial recognition technologies and the potential slippery slope of use cases.