Published on JD Supra on January 31, 2020
In December 2017, the New York City Council passed Local Law 49, the first law in the country designed to address bias and discrimination resulting from algorithms used by City agencies. Local Law 49 created an Automated Decision Systems Task Force (“Task Force”) to monitor algorithms used by municipal agencies and directed the Task Force to provide recommendations by the fall of 2019 on how to make the algorithms used by the City fairer and more transparent. Despite the Task Force’s daunting task of balancing the need for greater transparency around the City’s use of algorithms against the right of companies to protect their intellectual property, many people were hopeful that Local Law 49 would encourage other cities and states to acknowledge the problem of algorithmic bias.
During 2019, it became apparent that the Task Force was facing significant challenges in fulfilling its mission. First, although the law included a definition of “automated decision system” (“ADS”), after 18 meetings the Task Force was still unable to reach a consensus on which technologies even met that definition.
In addition, Local Law 49 did not require City agencies to provide the Task Force with requested information; the provision of such information was entirely voluntary. More than five months after the Task Force was formed, certain members expressed frustration that the City had not yet identified even a single ADS or provided any information about ADS used by the City. Although the City eventually provided five examples of ADS to the Task Force, it never provided a full list of ADS used by City agencies.
Task Force members also noted that they had developed a promising methodology for obtaining relevant information about ADS from developers and operators; the City, however, inexplicably directed the Task Force to abandon that methodology, further frustrating its efforts. Task Force members warned that they would be unable to provide meaningful or credible recommendations on how to address algorithmic discrimination if the City failed to provide the requested information.
In November 2019, the Task Force released its final report. Not surprisingly, the report did not make specific recommendations and instead identified only key principles on which the Task Force was able to reach consensus. Although the report included many aspirational principles, the lack of specificity raises questions about its true utility. At a very high level, the report offered the following principles as recommendations:
Build capacity for an effective, fair, and responsible approach to the City’s ADS
The Task Force recommended developing centralized resources within the City government to guide policy and assist City agencies in the development, implementation and use of ADS. In addition, the Task Force recommended developing specific guidelines and a method for determining the potential for an ADS tool to result in a disproportionate impact or harm. The report did not, however, provide any specifics regarding the substance of such guidelines or methods.
The report noted that the definition of ADS included in Local Law 49 was overly broad, making it difficult to identify which tools and systems should be considered ADS. As a result, the Task Force recommended narrowing the definition to exclude tools that perform only ministerial functions. In particular, the report recommended replacing the single definition of ADS with a series of questions or criteria that can evolve over time.
Broaden public discussion on ADS
The Task Force recommended providing the public with opportunities to learn about ADS that impact individuals, including through publicly accessible digital platforms offering educational materials. The report also cited the importance of involving impacted communities in discussions about specific ADS, but it did not include any guidance on how this would be achieved.
Formalize ADS management functions
The report also recommended establishing a framework for reporting and publishing information related to ADS, including the development of a process for escalating urgent matters involving actual or suspected harm to an individual or community resulting from the use of an ADS.
Shortly after the report was released, Mayor Bill de Blasio issued an executive order creating an Algorithms Management and Policy Officer who will serve as a centralized resource on algorithm policy and develop guidelines and best practices to assist City agencies in their use of algorithms. The new Officer will also be responsible for ensuring that algorithms used by the City to deliver services promote equity, fairness and accountability.