NYC’s Task Force to Tackle Algorithmic Bias: A Study in Inertia

May 2, 2019

By Linda Henry



In December 2017, the New York City Council passed Local Law 49, the first law in the country designed to address algorithmic bias and discrimination resulting from algorithms used by City agencies. Local Law 49 created an Automated Decision Systems Task Force (“Task Force”) to monitor algorithms used by municipal agencies, and directed the Task Force to recommend, by the fall of 2019, ways to make the City’s algorithms fairer and more transparent. Despite the daunting challenge of balancing the need for greater transparency around the City’s use of algorithms against companies’ right to protect their intellectual property, many were hopeful that Local Law 49 would encourage other cities and states to acknowledge the problem of algorithmic bias.

The legislation arose after then-Councilman James Vacca read a ProPublica investigation of a computer algorithm used to score a criminal defendant’s risk of recidivism. The investigation found that the most widely used risk assessment tool, COMPAS, erroneously identified black defendants as presenting a high risk of recidivism at almost twice the rate of white defendants (43 percent vs. 23 percent). ProPublica’s research also revealed that COMPAS erroneously labeled white defendants as low-risk 48 percent of the time, compared to 28 percent for black defendants. Black defendants were also 45 percent more likely than white defendants to receive a higher risk score, even after controlling for variables such as prior crimes, age and gender.
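To make the statistics above concrete, the sketch below shows how such error-rate disparities are typically computed from a tool’s predictions and the observed outcomes. It is a minimal illustration using made-up numbers, not ProPublica’s actual analysis or the COMPAS model; the groups, predictions and outcomes are entirely hypothetical.

def error_rates(pred_high, reoffended):
    """Return (false positive rate, false negative rate) for one group.

    FPR: share of defendants who did NOT reoffend but were labeled high risk.
    FNR: share of defendants who DID reoffend but were labeled low risk.
    """
    fp = sum(1 for p, r in zip(pred_high, reoffended) if p and not r)
    tn = sum(1 for p, r in zip(pred_high, reoffended) if not p and not r)
    fn = sum(1 for p, r in zip(pred_high, reoffended) if not p and r)
    tp = sum(1 for p, r in zip(pred_high, reoffended) if p and r)
    return fp / (fp + tn), fn / (fn + tp)

# Hypothetical data: group A's false positive rate (40%) is twice group B's
# (20%) -- the kind of disparity ProPublica reported between black and
# white defendants.
group_a = error_rates([True, True, False, False, False, True, False],
                      [False, False, False, False, False, True, True])
group_b = error_rates([True, False, False, False, False, True, False],
                      [False, False, False, False, False, True, True])
print("Group A: FPR={:.0%}, FNR={:.0%}".format(*group_a))
print("Group B: FPR={:.0%}, FNR={:.0%}".format(*group_b))

A tool can show this kind of gap in false positive rates between groups even when its overall accuracy is similar for both, which is why the error-rate breakdown, rather than a single accuracy figure, drove ProPublica’s findings.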

The original bill proposed by Vacca, inspired by the ProPublica investigation, was far more ambitious than the final legislation and would have required city agencies to publish the source code of certain algorithms that imposed penalties on individuals. Nevertheless, Local Law 49 was seen by many in the industry as a promising first step toward mitigating algorithmic bias, and an opportunity to bring fairness and accountability to the use of automated decision systems.

Unfortunately, over a year later, the Task Force has failed to make meaningful progress toward fulfilling its mission. First, although the law includes a definition of “automated decision system” (“ADS”) and the Task Force has held approximately 18 meetings, its members have been unable to reach a consensus on what technology even meets the definition of an ADS. Without agreement on which systems fall under its purview, the Task Force is unlikely to produce specific recommendations by this fall.

In addition, Local Law 49 did not require City agencies to provide the Task Force with requested information; cooperation is entirely voluntary. The joint written testimony of Task Force members Julia Stoyanovich and Solon Barocas stated that, despite numerous requests, as of April 4, 2019, the City had not identified a single ADS or provided any information about the ADS it uses.

Task Force members have stressed that they will be unable to provide meaningful or credible recommendations on algorithmic discrimination if the City fails to provide the requested information, and that their need for examples of ADS used by the City has been raised on numerous occasions. Task Force members also noted that they had developed a promising methodology for obtaining relevant information about ADS from developers and operators; however, the City inexplicably directed the Task Force to abandon it.

At the Task Force’s first public hearing on April 30, 2019, Task Force members declined to commit to providing recommendations specific to ADS currently used by the City and noted that the City had not been cooperative with the Task Force’s inquiries. The Task Force will hold its second public hearing on May 30, 2019.
