In mid-December, members of the New York City Council voted unanimously to pass municipal rules that would essentially hold algorithms accountable for how they are coded and how they behave. Although NYC Mayor Bill de Blasio has not yet signed off on the resolution, he is widely expected to do so.
According to a report published by the investigative news organization ProPublica, this is the first instance of a public measure that intends to regulate algorithms, the digital sets of rules coded for research or problem-solving purposes. Advanced algorithms are widely used by tech firms such as Facebook, Google and Uber; the NYC Council intends to use them in various public projects to improve matters related to urban life.
The council member who introduced the proposal was inspired to do so after learning about algorithms coded with rules that could discriminate against ethnic groups. The issue came to light after an investigation by ProPublica and The New York Times revealed that algorithms used in forensic DNA analysis were coded with bias against racial minorities.
Once Mayor de Blasio signs the resolution, a task force of compliance officers and data scientists will be responsible for evaluating the various algorithms currently in use by NYC municipal departments; the goal is to establish transparency and to detect situations that may amount to bias or discrimination.
Following the aforementioned investigative reporting by ProPublica, a federal judge ordered the NYC medical examiner and forensic investigative department to disclose the source code of the algorithm, which apparently had a 30 percent margin of error. Shortly after the disclosure, NYC officials moved to abandon the algorithm, since it could not reasonably be used as a forensic tool in delicate judicial cases such as murder trials.
In the world of criminal prosecutions, the use of algorithms to build profiles of defendants is not only risky but also dangerous when legal parties are left in the dark with regard to the coded rules.
One of the problems discussed by NYC council members is that algorithms are often sold to municipal departments without a thorough review of the source code.
At NYC jails, algorithms are used to predict whether suspects are likely to commit future crimes; these assessments are then used for surveillance and post-release supervision. However, the ProPublica investigation found that such algorithms have been coded with bias related to immigration status, socioeconomic background and ethnicity.