Predictive Algorithms in Sentencing: Are We Automating Bias?

By Linda Henry

Although algorithms are often presumed to be objective and unbiased, recent investigations into algorithms used in the criminal justice system to predict recidivism have produced compelling evidence that such algorithms may be racially biased.  As a result of one such investigation by ProPublica, the New York City Council recently passed the first bill in the country designed to address algorithmic discrimination in government agencies. The goal of New York City’s algorithmic accountability bill is to monitor algorithms used by municipal agencies and provide recommendations as to how to make the City’s algorithms fairer and more transparent.

The criminal justice system is one area in which governments are increasingly using algorithms, particularly in connection with creating risk assessment profiles of defendants.  For example, COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) is a computer algorithm used to score a defendant’s risk of recidivism and is one of the risk assessment tools most widely used by courts to predict recidivism.  COMPAS creates a risk assessment by comparing information regarding a defendant to historical data from groups of similar individuals.

COMPAS is only one example of proprietary software being used by courts to make sentencing decisions, and states are increasingly using software risk assessment tools such as COMPAS as a formal part of the sentencing process.  Because many of the algorithms such as COMPAS are proprietary, the source code is not published and is not subject to state or federal open record laws. As a result, the opacity inherent in proprietary programs such as COMPAS prevents third parties from seeing the data and calculations that impact sentencing decisions.

Challenges by defendants to the use of such algorithms in criminal sentencing have so far been unsuccessful.  In 2017, Eric Loomis, a Wisconsin defendant, unsuccessfully challenged the use of the COMPAS algorithm as a violation of his due process rights.  In 2013, Loomis was arrested and charged with five criminal counts related to a drive-by shooting.  Loomis maintained that he was not involved in the shooting but pled guilty to driving a motor vehicle without the owner’s permission and fleeing from police.  At sentencing, the trial court judge sentenced Loomis to six years in prison, noting that the court ruled out probation based in part on the COMPAS risk assessment, which suggested Loomis presented a high risk to re-offend.[1]  Loomis appealed his sentence, arguing that the use of the risk assessment violated his constitutional right to due process.  The Wisconsin Supreme Court ultimately affirmed the lower court’s decision that it could utilize the risk assessment tool in sentencing, and also found no violation of Loomis’ due process rights.  In 2017, the U.S. Supreme Court denied Loomis’ petition for writ of certiorari.

The use of computer algorithms in risk assessments has been touted by some as a way to eliminate human bias in sentencing.  Although COMPAS and other risk assessment software programs use algorithms that are race neutral on their face, the algorithms frequently rely on data points that can serve as proxies for race, such as ZIP codes, education history and family history of incarceration.[2]  In addition, critics of such algorithms question the methodologies used by programs such as COMPAS, since methodologies (which are necessarily created by individuals) may unintentionally reflect human bias.  If the data sets used to train the algorithms are not truly objective, human bias may be unintentionally baked into the algorithm, effectively automating human bias.
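The proxy effect can be made concrete with a minimal, purely illustrative Python sketch.  The ZIP codes, groups and scoring rule below are invented for demonstration and do not reflect COMPAS’s actual model; the point is only that a score that never looks at race can still differ by race when it keys off a correlated feature:

```python
import random

random.seed(42)

# Synthetic population: group membership correlates with ZIP code,
# but the "risk score" below never looks at group directly.
HIGH_POVERTY_ZIPS = {"30310", "30315"}
OTHER_ZIPS = {"30305", "30327"}

def make_person(group):
    # Hypothetical correlation: group A is more likely to live in
    # certain ZIP codes due to historical housing patterns.
    if group == "A":
        zip_code = random.choice(sorted(HIGH_POVERTY_ZIPS) * 3 + sorted(OTHER_ZIPS))
    else:
        zip_code = random.choice(sorted(HIGH_POVERTY_ZIPS) + sorted(OTHER_ZIPS) * 3)
    return {"group": group, "zip": zip_code}

def risk_score(person):
    # Facially "race neutral" rule: the score depends only on ZIP code.
    return 8 if person["zip"] in HIGH_POVERTY_ZIPS else 3

people = [make_person("A") for _ in range(1000)] + [make_person("B") for _ in range(1000)]

avg = {}
for g in ("A", "B"):
    scores = [risk_score(p) for p in people if p["group"] == g]
    avg[g] = sum(scores) / len(scores)

# Group A receives a higher average score even though the rule never uses group.
print(avg)
```

Removing the protected attribute from the inputs does not remove its influence when another input is a stand-in for it.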

The investigation by ProPublica that prompted New York City’s algorithmic accountability bill found that COMPAS risk assessments erroneously identified black defendants as presenting a high risk for recidivism at almost twice the rate of white defendants (43 percent vs. 23 percent).  In addition, ProPublica’s research revealed that COMPAS risk assessments erroneously labeled white defendants as low-risk 48 percent of the time, compared to 28 percent for black defendants.  Black defendants were also 45 percent more likely to receive a higher risk score than white defendants, even after controlling for variables such as prior crimes, age and gender.[3]  ProPublica’s findings raise serious concerns regarding COMPAS; however, because the calculations used to assess risk are proprietary, neither defendants nor the court systems utilizing COMPAS have visibility into why the assessments mislabel black and white defendants at such different rates.
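The two error rates at issue can be computed from a small, entirely hypothetical example.  The records below are invented for illustration and are not ProPublica’s data; they simply show how the same predictions yield a false positive rate (labeled high risk, did not re-offend) and a false negative rate (labeled low risk, did re-offend) that differ by group:

```python
# Each record: (group, predicted_high_risk, actually_reoffended).
# Invented data for demonstration only -- not COMPAS or ProPublica figures.
records = [
    ("black", True,  False), ("black", True,  True),  ("black", False, True),
    ("black", True,  False), ("black", False, False), ("black", True,  True),
    ("white", False, True),  ("white", False, False), ("white", True,  True),
    ("white", False, True),  ("white", False, False), ("white", True,  False),
]

def error_rates(group):
    rows = [r for r in records if r[0] == group]
    non_reoffenders = [r for r in rows if not r[2]]
    reoffenders = [r for r in rows if r[2]]
    # False positive rate: share of non-reoffenders labeled high risk.
    fpr = sum(r[1] for r in non_reoffenders) / len(non_reoffenders)
    # False negative rate: share of reoffenders labeled low risk.
    fnr = sum(not r[1] for r in reoffenders) / len(reoffenders)
    return fpr, fnr

for g in ("black", "white"):
    fpr, fnr = error_rates(g)
    print(f"{g}: false positive rate={fpr:.0%}, false negative rate={fnr:.0%}")
```

A model can be "accurate" overall while distributing its mistakes unevenly, which is exactly the pattern ProPublica reported.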

Although New York City’s algorithmic accountability bill hopes to curb algorithmic bias and bring more transparency to algorithms used across all New York City agencies, including those used in criminal sentencing, the task force created under the bill faces significant hurdles.  It is unclear how the task force will make the threshold determination as to whether an algorithm disproportionately harms a particular group, or how the City will increase transparency and fairness without access to proprietary source code.  Despite the task force’s daunting task of balancing the need for more transparency against the right of companies to protect their intellectual property, critics of the use of algorithms in the criminal justice system are hopeful that New York City’s bill will encourage other cities and states to acknowledge the problem of algorithmic bias.

 


[1] State v. Loomis, 881 N.W.2d 749 (Wis. 2016).

[2] Hudson, L., Technology Is Biased Too. How Do We Fix It?, FiveThirtyEight (Jul. 20, 2017), https://fivethirtyeight.com/features/technology-is-biased-too-how-do-we-fix-it/.

[3] Julia Angwin et al., Machine Bias, ProPublica (May 23, 2016), https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing.


When Data Scraping and the Computer Fraud and Abuse Act Collide

By Linda Henry

As the volume of data available on the internet continues to increase at an extraordinary pace, it is no surprise that many companies are eager to harvest publicly available data for their own use and monetization.  Data scraping has come a long way since its early days, which involved manually copying data visible on a website.  Today, data scraping is a thriving industry, and high-performance web scraping tools are fueling the big data revolution.  As with many technological advances, though, the law has not kept pace with the technology that enables scraping.  As a result, the state of the law on data scraping remains in flux.

The federal Computer Fraud and Abuse Act (CFAA) is one statute frequently used by companies that seek to stop third parties from harvesting data.  The CFAA imposes liability on anyone who “intentionally accesses a computer without authorization, or exceeds authorized access, and thereby obtains … information from any protected computer.”  The Supreme Court has held that the CFAA “provides two ways of committing the crime of improperly accessing a protected computer: (1) obtaining access without authorization; and (2) obtaining access with authorization but then using that access improperly.” (Musacchio v. United States).

The CFAA’s applicability to data scraping is not clear though, as the statute was originally intended as an anti-hacking measure, and scraping typically involves accessing publicly available data on a public website.  In order to meet the CFAA’s requirement that a third party engage in unauthorized or improper access of a website, companies often argue that use of a website in violation of the applicable terms of use (e.g., by harvesting data) constitutes unauthorized access in violation of the CFAA.

Over the past year, a handful of cases in California challenging the legality of web scraping offer a few clues as to how courts may approach future challenges to web scraping under the CFAA.  In one of the most high-profile data scraping cases of 2017 (hiQ Labs, Inc. v. LinkedIn Corp.), a U.S. District Court granted a preliminary injunction requested by hiQ Labs, a small workforce analytics startup, and ordered LinkedIn to remove technology that would prevent hiQ Labs from accessing information on public profiles.  LinkedIn argued that hiQ Labs was violating LinkedIn’s terms of use as both a user and an advertiser by using bots to scrape data from LinkedIn users’ public profiles.  hiQ Labs rejected LinkedIn’s argument that the CFAA applied, and maintained that because social media platforms should be treated as a public forum, hiQ Labs’s data scraping activities are protected by the First Amendment.

In hiQ, U.S. District Court Judge Chen found, in part, that because authorization is not necessary to access publicly available profile pages, LinkedIn was not likely to prevail on its CFAA claim even if hiQ Labs had violated the terms of use.  Judge Chen did note that LinkedIn’s construction of the CFAA was not without basis, because “visiting a website accesses the host computer in one literal sense, and where authorization has been revoked by the website host, that ‘access’ can be said to be ‘without authorization.’  However, whether access to a publicly viewable site may be deemed ‘without authorization’ under the CFAA where the website host purports to revoke permission is not free from ambiguity.”

Judge Chen reasoned that LinkedIn’s interpretation of the CFAA would allow a company to revoke authorization to a publicly available website at any time and for any reason, and then invoke the CFAA for enforcement, exposing an individual to both criminal and civil liability.  He characterized the possibility of criminalizing the act of viewing of a public website in violation of an order from a private entity as “effectuating the digital equivalence of Medusa.”

While LinkedIn waits for the Ninth Circuit to hear oral arguments in hiQ, yet another company, 3taps Inc., has filed a similar suit against LinkedIn, seeking a declaratory judgment that 3taps is not violating the CFAA and thus should be permitted to continue extracting data from public LinkedIn profile pages. (3taps Inc. v. LinkedIn Corp.).  In addition, after 3taps successfully argued that the court should deem the 3taps and hiQ matters related and heard by the same judge, Judge Chen ordered on February 22, 2018 the reassignment of the 3taps case from the Northern District of California’s San Jose court to his court in San Francisco.

In addition to hiQ, the recent dismissal of a CFAA claim brought by Ticketmaster against a company engaged in data scraping further calls into question whether companies will be successful in using the CFAA to stop web scraping. (Ticketmaster L.L.C. v. Prestige Entertainment, Inc.).  In January 2018, a California district court dismissed, with leave to amend, Ticketmaster’s CFAA claim against a ticket broker that used bots to purchase tickets in bulk from the Ticketmaster site.  The court noted that although Ticketmaster outlined the defendants’ terms of use violations in a cease and desist letter, Ticketmaster did not actually revoke access authority, and in fact implied that the defendants could continue to use Ticketmaster’s website as long as they abided by the terms of use.  In addition, the court maintained that Ticketmaster could not base a CFAA claim on an argument that the defendants exceeded authorized access unless Ticketmaster could demonstrate that the defendants were inside hackers who accessed unauthorized information.

hiQ, 3taps and Ticketmaster demonstrate the inherent difficulty in trying to apply a statute that pre-dates the internet age to modern technology.  Although courts have not been consistent in their opinions as to whether violation of a company’s terms of use constitutes unauthorized or improper access under the CFAA, Ticketmaster and hiQ offer data scrapers hope that courts will continue to question whether the CFAA should prohibit harvesting publicly available data.  Companies that utilize data scraping should, however, consider that a court would be more likely to impose liability under the CFAA if the data collected is not publicly available or the methods used to obtain the data can more clearly be characterized as unauthorized access.  The Ninth Circuit is expected to hear oral arguments in hiQ in March, and the court’s interpretation of the CFAA is likely to have a significant impact on the use of automated processes to harvest third-party data.


Beware of the Man-in-the-Middle: Lessons from the FTC’s Lenovo Settlement

By Linda Henry

The Federal Trade Commission’s recent approval of a final settlement with Lenovo (United States) Inc., one of the world’s largest computer manufacturers, offers a reminder that when it comes to consumers’ sensitive personal information, transparency is key, and failure to assess and address security risks created by third-party software vendors may be deemed an unfair act or practice under Section 5 of the FTC Act.

Lenovo’s problems began in August 2014 when Lenovo began selling laptops to consumers with preinstalled “man-in-the-middle” software provided by a third-party vendor, Superfish, Inc.  The software delivered pop-up ads notifying consumers of similar products sold by Superfish’s retail partners when consumers hovered over a product image on a shopping website.

In order to inject pop-up ads into encrypted connections, the software replaced the digital certificates for websites visited by consumers with Superfish’s own digital certificate, which had been installed in the laptop’s operating system.  As a result, there was no longer a direct, encrypted connection between the websites visited by consumers and their Internet browsers.  Superfish’s software was acting as a man-in-the-middle, and was decrypting and then re-encrypting the information traveling between the browsers and the websites. Consequently, Superfish’s software provided access to all personal information transmitted by consumers over the Internet, including login credentials, Social Security numbers, medical information, and financial information.  The FTC noted that although Superfish collected a more limited subset of consumer information, the software had the ability to collect additional information at any time.

In addition, the Superfish software replaced websites’ digital certificates without sufficiently verifying that the websites’ certificates were valid, and Superfish used the same insufficiently complex encryption key password on all laptops.  As a result, potential attackers could intercept consumers’ communications with websites by hacking the encryption key’s password “Komodia” (the name of the vendor that provided the code used by Superfish in its software).
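The underlying weakness can be sketched in a few lines of Python.  This is an illustration of the general attack, not Superfish’s actual key format or code: when one short, guessable password protects the same key material on every machine, a single dictionary attack unlocks every affected laptop at once.

```python
import hashlib

# Illustrative only -- not Superfish's actual scheme. A key-encryption key
# is derived from a password; every shipped laptop uses the same one.
SALT = b"demo-salt"

def protect(password: str) -> bytes:
    # Derive the key-encryption key from the password via PBKDF2.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), SALT, 100_000)

# The (hypothetical) vendor's choice: its own code vendor's name as the password.
shipped_kek = protect("komodia")

# An attacker tries a small dictionary of likely candidates.
dictionary = ["password", "123456", "lenovo", "superfish", "komodia"]
recovered = next((w for w in dictionary if protect(w) == shipped_kek), None)
print(recovered)  # the dictionary attack recovers "komodia"
```

Because the compromised password was identical across the install base, cracking it once was equivalent to cracking it everywhere.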

The FTC’s complaint alleged that Lenovo’s failure to disclose the fact that pre-installed software would act as a man-in-the-middle between consumers and all websites with which consumers communicated, and that the software would also collect and transmit consumer Internet browsing data to Superfish, was an unfair or deceptive act or practice.  The FTC also maintained that Lenovo had engaged in an unfair act or practice by failing to adequately assess (and then address) security risks created by the Superfish software Lenovo pre-loaded on consumer laptops.

“Lenovo compromised consumers’ privacy when it preloaded software that could access consumers’ sensitive information without adequate notice or consent to its use,” said Acting FTC Chairman Maureen Ohlhausen. “This conduct is even more serious because the software compromised online security protections that consumers rely on.”

The FTC’s subsequent commentary on the Lenovo settlement, together with past guidance provided by the FTC, offers several takeaways:

  • Be transparent.  Transparency is always the best policy when considering the privacy of consumers’ personal information.  Lenovo failed to adequately disclose to consumers (let alone get their consent) that a third-party would be able to intercept all of their online communications, or that man-in-the-middle software would transmit browsing data to a third party.  The FTC has made clear that businesses must clearly explain to consumers how their data will be used and provide an easy way for consumers to opt out of data use or collection practices involving their personal information.
  • Disclosures must be conspicuous and complete.  On the Lenovo laptops, a consumer did see a one-time popup window the first time the consumer visited a shopping website.  The popup window included the following message: “Explore shopping with VisualDiscovery: Your browser is enabled with VisualDiscovery which lets you discover visually similar products and best prices while you shop.”  Although the pop-up window did include a small opt-out link, it was not conspicuous and thus easy for consumers to miss.  If a consumer clicked anywhere on the screen, or on the “x” button to close the pop-up, the consumer was automatically opted in to the software.

The FTC found that this initial pop-up window did not adequately disclose that the pre-installed software would act as a man-in-the-middle between consumers and the websites they visited, and consumers would have found the collection and transmittal of their sensitive information through this software a material fact when deciding whether to opt in to the pre-installed software.  In addition, had a consumer clicked on the opt-out link, although the consumer would have successfully opted out of receiving the pop-up ads, the software would continue to act as man-in-the-middle, and thus would continue to expose consumer information despite the election to opt out.  The FTC also noted that neither the End User License Agreement nor the Privacy Policy for the Superfish software included a disclosure regarding the collection and use of consumers’ sensitive information.

  • Undertake adequate due diligence and include security requirements in Agreements. Companies are ultimately responsible for their third-party vendors and are expected to ensure that service providers implement reasonable measures to address security risks. As the FTC noted in its Stick with Security guide published in 2017, companies should take a “trust, but verify” approach to their service providers and undertake adequate due diligence to confirm that their service providers have sufficient security controls in place to maintain the security of sensitive data.  Companies should also include appropriate security requirements in their agreements with service providers.  The FTC may view a company’s failure to hold service providers to specific security requirements as a missed opportunity to take reasonable steps to safeguard customers’ data.
  • Verify compliance.  Although due diligence and contractual requirements with service providers are important components of a company’s data security policy, a company should also verify that its service providers are complying with contractual requirements.

As part of the settlement, Lenovo is prohibited from pre-installing similar software unless Lenovo (i) obtains a consumer’s affirmative, express consent, (ii) provides instructions as to how a consumer can revoke consent, and (iii) provides an option for consumers to opt-out, disable or remove the software or its offending features.  In addition, for the next twenty years, Lenovo must maintain a comprehensive software security program that is reasonably designed to address software security risks related to the development and management of new and existing application software, and protect the security, confidentiality, and integrity of sensitive information.  Acting Chairman Ohlhausen noted that the Lenovo settlement sends a message that “everyone in the chain really needs to pay attention.”


Part III of III | FTC Provides Guidance on Reasonable Data Security Practices

By Linda Henry

This is the third in a series of three articles on the FTC’s Stick with Security blog. Part I and Part II of this series can be found here and here.

Over the past 15 years, the Federal Trade Commission (FTC) has brought more than 60 cases against companies for unfair or deceptive data security practices that put consumers’ personal data at unreasonable risk.  Although the FTC has stated that the touchstone of its approach to data security is reasonableness, the FTC has faced considerable criticism from the business community for lack of clarity as to what it considers reasonable data security.

Earlier this year, FTC Acting Chairman Maureen Ohlhausen pledged greater transparency concerning practices that contribute to reasonable data security.  As a follow-up to Ohlhausen’s pledge, the FTC published a weekly blog over the past few months, Stick with Security, that focuses on the ten principles outlined in its Start with Security Guide for Businesses. In the blog, the FTC uses examples taken from complaints and orders to offer additional clarity on each principle included in the Start with Security guidelines.

This is the third of three articles reviewing the security principles discussed by the FTC in its Stick with Security blog.

Apply sound security practices when developing new products

Train your engineers in secure coding.  Sound security practices should be part of the product development process, and security should be considered at every stage.  The FTC stresses that companies must create a work environment that encourages employees to consider potential security issues throughout development.  The push to launch a product should not come at the cost of data security.

Follow platform guidelines for security.   All major platforms provide security guidelines and best practices, and the FTC strongly urges companies to consider such recommendations during product development.  For example, if a platform makes an API available to mobile app developers that will provide industry-standard encryption, a company would be well advised to consider using the platform’s API to help protect sensitive data that will be collected by the mobile app.

Verify that security features work.  Products should be tested for security vulnerabilities prior to launch.  In addition, any representation made to consumers with respect to a product’s security must be supported by demonstrable evidence prior to making the product available to consumers.  Under the FTC Act, companies will be responsible for any express or implied representation made to consumers.  Consequently, companies should consider whether any statement or depiction included in any marketing materials, packaging, social media posts, privacy policies, or in any other company content would be understood by a consumer acting reasonably under the circumstances to constitute a promise or representation regarding the product’s security.  If so, such statements or depictions must meet truth-in-advertising standards.

Test for common vulnerabilities.  Although it may not be possible to remove the threat of all security vulnerabilities, companies should use the security tools that are available to reduce the risk of a data breach and protect against known risks.  In addition, companies must view security as a dynamic process, and take new threats and vulnerabilities into account when designing new or updated products.

Make sure your service providers implement reasonable security measures

Do your due diligence.  The FTC cautions companies to take a “trust, but verify” approach to their service providers.  Companies must undertake adequate due diligence to confirm that their service providers have sufficient security controls in place to maintain the security of sensitive data.

Put it in writing. In order to reduce the risk of a service provider failing to maintain adequate security practices, companies must include appropriate security requirements in their agreements with service providers.  Failure to hold service providers to specific security requirements as a contractual matter is a missed opportunity to take reasonable steps to safeguard customers’ data.

Verify Compliance.  Although due diligence and contractual requirements with service providers are important components of a company’s data security policy, a company should also verify that its service providers are complying with contractual requirements.  For example, if a retailer engages a third party to develop and launch a mobile app but wants to ensure that geolocation data is not collected, the retailer’s agreement with the mobile app developer should include a prohibition on the mobile app being enabled to collect geolocation data from end users unless an individual affirmatively opts in.  Prior to launch, the retailer should also test the app and correct any compliance issues it finds.
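As a sketch of what such pre-launch verification might look like in practice, the hypothetical check below scans an app build's permission manifest for location access that lacks an opt-in flow. The manifest structure, permission names, and flag are invented for illustration and do not correspond to any real platform API:

```python
# Hypothetical pre-launch compliance check: flag an app build that
# requests location data without an affirmative opt-in flow.
# Permission names and manifest keys are illustrative, not a real API.

LOCATION_PERMISSIONS = {"ACCESS_FINE_LOCATION", "ACCESS_COARSE_LOCATION"}

def violates_geolocation_policy(manifest: dict) -> bool:
    """Return True if the build collects location data without opt-in consent."""
    requested = set(manifest.get("permissions", []))
    has_location = bool(requested & LOCATION_PERMISSIONS)
    has_opt_in = manifest.get("location_opt_in_flow", False)
    return has_location and not has_opt_in

# A build that silently requests fine location fails the check:
bad_build = {"permissions": ["INTERNET", "ACCESS_FINE_LOCATION"]}
print(violates_geolocation_policy(bad_build))  # True
```

Running a check like this as part of the release process turns a contractual requirement into something the retailer can actually verify before launch.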

Put procedures in place to keep your security current and address vulnerabilities that may arise.

Update and patch software.  Security is an ever-evolving process, thus companies need to ensure that third-party software is kept up to date by promptly applying security patches and updates. In addition, if a company has made its own proprietary software available to customers, the company must ensure that it has a way to alert customers to known vulnerabilities and can provide the necessary patches and updates.  A company that fails to alert its customers to a patch that is necessary to address a software vulnerability is exposing consumers’ sensitive information to unnecessary risk.

Plan how you will deliver security updates for your product’s software. Companies should assume that they will discover software vulnerabilities in the future.  As a result, companies should anticipate the future need to release security updates after the product has launched.  As an example of prudent security practices, the FTC provides the example of a company that manufactures a thermostat that connects to the internet.  The company configures the thermostat’s default settings to install security patches released by the company, thus offering consumers a more secure product by design.
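The thermostat example boils down to choosing secure defaults at design time. A minimal sketch of that idea, with hypothetical field names, might look like this:

```python
from dataclasses import dataclass

# Illustrative "secure by design" defaults for a connected thermostat:
# security patches install automatically unless the owner explicitly
# changes the setting. Field names are hypothetical.

@dataclass
class ThermostatSettings:
    auto_install_security_patches: bool = True   # secure default, per the FTC example
    auto_install_feature_updates: bool = False   # optional updates can wait for consent

settings = ThermostatSettings()
print(settings.auto_install_security_patches)  # True
```

The design choice is that a consumer who never touches the settings still receives security patches; opting out requires a deliberate action.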

Heed credible security warnings and move quickly to fix the problem.  Due to the ever-evolving nature of technology and cybersecurity threats, companies should keep up-to-date on new threats, and modify their security requirements accordingly.  In addition, companies must ensure that there is a clear path to reporting potential security vulnerabilities to individuals who are best positioned to take action if necessary.  As an example of a good process for reporting potential security issues, the FTC describes an app developer that receives thousands of emails a day.  Because of the large volume of daily email, the app developer directs customers to a specific email address (separate from the developer’s general email) to report security concerns, and has a knowledgeable employee monitor the mailbox and immediately flag plausible concerns for the company’s security engineers.  The FTC notes that by implementing such a procedure for reporting security concerns, the app developer may be able to mitigate the risk of a security incident.

Secure paper, physical media, and devices

Securely store sensitive files.  In addition to safeguarding digital data, companies must also implement adequate security protections for paper documents.  For example, a company that stores files with sensitive information in an unsecured storage room has created unnecessary risk that sensitive information could be misappropriated.  A more prudent practice would be for the company to keep such files in a location with restricted access that is kept locked at all times.

Protect devices that process personal information.  If stolen, devices that store and process confidential data may offer easy access to not only the data on the stolen device, but also access to additional information on a company network.  As an illustration of prudent security practices, the FTC describes a data processing firm’s security practices with respect to employee smartphone use.  The company encrypted all data on the phones and required employees to password protect their devices.  In addition, the company safeguarded against security breaches due to lost phones by using device-finding services and applications that would remotely wipe missing devices.  Employees were also trained on the importance of following the mobile device security requirements and the company also stressed the importance of promptly reporting lost phones.

Keep safety standards in place when data is en route.  Just as companies need to safeguard sensitive digital data through encryption, companies must also use reasonable security practices when physically transferring sensitive information.  For example, a company assigned an employee to collect purchase orders with sensitive consumer information from various company locations on a daily basis.  During a personal errand, the purchase orders were stolen from the back of the employee’s car after she left the orders unattended in her car. The FTC notes that the company contributed to the risk of unauthorized access of the information included in the purchase orders because the company failed to train its employees as to how they should safeguard documents while in transit.

Dispose of sensitive data securely. Prudent security practices include document and data destruction protocols.  Companies should remember that businesses subject to the Fair Credit Reporting Act are also subject to requirements regarding the disposal of sensitive data as a matter of law.

Although the FTC’s Stick with Security blog provides guidance regarding practices that contribute to reasonable data security, the FTC stresses that data security cannot be condensed into a one-and-done checklist.  Each company must consider what is reasonable in light of the nature of its business, the sensitivity and volume of information it collects, the size and complexity of its data operations, and the cost of available tools to improve security and reduce vulnerabilities. In addition, companies must remember that security measures that were adequate last year may no longer offer adequate protection from future threats.


Part II of III | FTC Provides Guidance on Reasonable Data Security Practices


By Linda Henry



This is the second in a series of three articles on the FTC’s Stick with Security blog. Part I and Part III of this series can be found here and here.

Over the past 15 years, the Federal Trade Commission (FTC) has brought more than 60 cases against companies for unfair or deceptive data security practices that put consumers’ personal data at unreasonable risk.  Although the FTC has stated that the touchstone of its approach to data security is reasonableness, the FTC has faced considerable criticism from the business community for lack of clarity as to what it considers reasonable data security.

Earlier this year, FTC Acting Chairman Maureen Ohlhausen pledged greater transparency concerning practices that contribute to reasonable data security.  As a follow-up to Ohlhausen’s pledge, the FTC published a weekly blog over the past few months, Stick with Security, that focuses on the ten principles outlined in its Start with Security Guide for Businesses. In the blog, the FTC uses examples taken from complaints and orders to offer additional clarity on each principle included in the Start with Security guidelines.

This is the second of three articles reviewing the security principles discussed by the FTC in its Stick with Security blog.

Store sensitive personal information securely and protect it during transmission.

Keep sensitive information secure throughout its lifecycle.  Businesses must understand how data travels through their network in order to safeguard sensitive information throughout the entire data life cycle. For example, a real estate company that collected sensitive financial data from prospective home buyers encrypted information sent from the customer’s browser to the company’s servers.  After data had entered the company’s system, however, the data was decrypted and sent in readable text to various branch offices.  As a result, the company failed to maintain appropriate security through the entire life cycle of the sensitive information being collected by the company.

Use industry-tested and accepted methods.  Although the market may reward products that are novel and unique, the FTC makes clear that when it comes to encryption methods, the FTC expects companies to use industry-tested and approved encryption.  As an example of a company that may not have employed reasonable security practices, the FTC describes an app developer that utilized its own proprietary method to encrypt data.  The more prudent decision would have been to deploy industry-accepted encryption algorithms rather than the company’s own proprietary method, which was not industry tested and approved.
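To illustrate why the FTC favors industry-tested algorithms, the sketch below shows a deliberately weak homegrown scheme (repeating-key XOR, a classic example of naive "encryption") failing a basic known-plaintext attack. This is an invented illustration, not the app developer's actual method; vetted, standardized algorithms such as AES-GCM from a maintained library are designed to resist exactly this kind of analysis:

```python
# Illustrative only: a naive homegrown "encryption" scheme and why it
# fails. An attacker who knows any plaintext/ciphertext pair recovers
# the key instantly.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Repeating-key XOR: the same function both encrypts and decrypts."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = b"secret!!"
ciphertext = xor_cipher(b"card=4111111111111111", key)

# Known-plaintext attack: XOR-ing ciphertext with known plaintext leaks the key.
known_plain = b"card=411"
recovered = xor_cipher(ciphertext[:8], known_plain)
print(recovered)  # b'secret!!'
```

Because the key falls out of a single known field prefix, the entire data set protected by this scheme is exposed; an industry-tested algorithm carries no such trivial break.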

Ensure proper configuration.  Strong encryption is necessary, but not enough to protect sensitive data.  For example, the FTC found that a company misrepresented the security of its mobile app and failed to secure the transmission of sensitive personal information by disabling the SSL certification verification that would have protected consumers’ data.   A second example provided by the FTC of problematic configuration involved a travel company that used a Transport Layer Security (TLS) protocol to establish encrypted connections with consumers.  Although the company’s use of the TLS protocol was a prudent security practice,  the company then disabled the process to validate the TLS certificate.  The FTC notes that the company failed to follow recommendations from app developer platform providers by disabling the default validation settings and thus may not have used reasonable data security practices.
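Mainstream TLS libraries validate certificates by default; the misconfigurations the FTC describes involve affirmatively switching that validation off. As one concrete illustration, Python's standard library enables both certificate and hostname checks out of the box:

```python
import ssl

# The standard library's default TLS context performs certificate and
# hostname validation automatically; keeping these defaults is the
# reasonable practice the FTC describes.

ctx = ssl.create_default_context()
print(ctx.check_hostname)                     # True
print(ctx.verify_mode == ssl.CERT_REQUIRED)   # True

# What NOT to do -- this discards the protection TLS certificates provide
# and recreates the failures described in the FTC's examples:
# ctx.check_hostname = False
# ctx.verify_mode = ssl.CERT_NONE
```

The broader point holds across platforms: validation is the default for a reason, and disabling it, even temporarily for testing, should never ship to production.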

Segment your network and monitor who’s trying to get in and out.

Segment your network.  Companies that segment their network may minimize the impact of a data breach.  For example, use of firewalls to create separate areas on a network may reduce the amount of data that is accessed in the event hackers are able to gain access to a network.  The FTC provides the example of a retail chain that failed to adequately segment its network by permitting unrestricted data connections across its stores (e.g., allowing a computer from a store in one location to access employee information from another store).  By allowing unrestricted data connections across locations, hackers were able to use a security lapse in one location to gain access to sensitive data in other locations on the network.
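Conceptually, segmentation replaces "any host can reach any resource" with an explicit, deny-by-default allowlist. The sketch below is a hypothetical model of that policy for the retail-chain example (segment and resource names are invented):

```python
# Hypothetical segmentation policy: each store segment may reach only its
# own point-of-sale systems plus a shared inventory service, never another
# store's resources or HQ employee records. Names are illustrative.

ALLOWED_ROUTES = {
    ("store-17", "store-17-pos"),
    ("store-17", "shared-inventory"),
    ("store-42", "store-42-pos"),
    ("store-42", "shared-inventory"),
}

def connection_allowed(source_segment: str, destination: str) -> bool:
    """Deny by default; permit only explicitly allowlisted routes."""
    return (source_segment, destination) in ALLOWED_ROUTES

print(connection_allowed("store-17", "shared-inventory"))    # True
print(connection_allowed("store-17", "hq-employee-records")) # False
```

Under a policy like this, a compromise at one store cannot pivot to sensitive data elsewhere on the network, which is precisely the failure in the FTC's example.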

Monitor activities on your network.  Companies should take advantage of the various tools available to alert businesses of suspicious activities, including unauthorized attempts to access a network, attempts to install malicious software and suspicious data exfiltration.
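The kinds of rules such monitoring tools apply can be pictured with a toy example. The thresholds and event fields below are invented for illustration and are far simpler than what a real intrusion-detection system evaluates:

```python
# Toy illustration of monitoring rules: flag repeated failed logins or
# unusually large outbound transfers for human review. Thresholds and
# event structure are invented.

FAILED_LOGIN_LIMIT = 5
EXFIL_BYTES_LIMIT = 500_000_000  # 500 MB outbound in one session

def suspicious(event: dict) -> bool:
    """Return True if the event matches a rule warranting investigation."""
    if event.get("failed_logins", 0) > FAILED_LOGIN_LIMIT:
        return True  # possible brute-force attempt
    if event.get("bytes_out", 0) > EXFIL_BYTES_LIMIT:
        return True  # possible data exfiltration
    return False

print(suspicious({"failed_logins": 12}))                      # True
print(suspicious({"bytes_out": 2_000_000_000}))               # True
print(suspicious({"failed_logins": 1, "bytes_out": 10_000}))  # False
```

Commercial and open-source tools layer many such rules with anomaly detection, but the underlying idea is the same: define what normal looks like and alert on deviations.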

Secure remote access to your network.

Ensure endpoint security.  Every device with a remote network connection creates a possible entry point for unauthorized access.  Consequently, securing the various endpoints on a network has become increasingly important with the rise in mobile threats.  Companies should establish security rules for employees, clients and service providers, including requirements concerning software updates and patches.  Establishing security protocols alone is not sufficient, however; companies should also verify that the security requirements are being followed.  In addition, companies must continually re-evaluate possible security threats and update endpoint security requirements and controls.

Put sensible access limits in place.  Companies should establish sensible limitations on remote network access.  For example, a company that engaged multiple vendors to remotely install and maintain software on the company’s network provided user accounts with full administrative privileges for each vendor.  The FTC notes that instead of providing all vendors with administrative access, the company should have provided full administrative privileges only to those vendors who truly required such access, and only for a limited period of time.  In addition, the company should have ensured that it could audit all vendor activities on the network, and attribute account use to individual vendor employees.
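The FTC's points about vendor access (least privilege, attribution to an individual, and limited duration) can be sketched as a simple grant record. The structure and field names below are hypothetical, not a real access-management API:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical sketch of time-limited, attributable vendor access:
# administrative privileges are granted to a named individual, with an
# auditable reason, and expire automatically. Field names are invented.

def grant_admin(vendor_employee: str, reason: str, hours: int = 8) -> dict:
    """Create a time-limited admin grant attributed to one individual."""
    now = datetime.now(timezone.utc)
    return {
        "user": vendor_employee,              # attribution for audit trails
        "reason": reason,                     # auditable justification
        "expires": now + timedelta(hours=hours),
    }

def is_active(grant: dict) -> bool:
    return datetime.now(timezone.utc) < grant["expires"]

grant = grant_admin("jdoe@vendor.example", "install software update")
print(is_active(grant))  # True
```

Contrast this with the FTC's example: standing, shared administrative accounts with no expiry and no way to attribute activity to a specific vendor employee.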

Part III of this series will discuss the importance of applying sound security practices, the security practices of service providers, procedures to keep security current and the need to secure paper, physical media and devices.
