
59 Ct. Rev. 10 (2023)
The Dilemma of Black Coding: Assessing Algorithmic Discrimination Legislation in the United States

The Dilemma of Black Coding: Assessing Algorithmic Discrimination Legislation in the United States

Clarence Okoh


The proliferation of algorithmic harms has become one of the most urgent civil and human rights challenges in the United States. Emerging technologies such as artificial intelligence, machine learning, and similar algorithm-driven technologies are used to regulate access to essential goods and services, including housing, employment, healthcare, public benefits, financial services, and education.1 These technologies are also routinely used to influence judicial decision-making, expand police surveillance, determine parental rights, and facilitate immigration enforcement, among other rights-impacting activities.2 In each of these domains, the introduction of algorithmic technologies has facilitated systemic civil and human rights abuses that are often focused on Black communities and other racially marginalized groups.3 It is increasingly clear that the concentrated impact of algorithmic harms on racially marginalized communities threatens to erode essential democratic values and norms by reinforcing historic patterns of racial hierarchy through systemic civil and human rights violations.4
   In recent years, algorithmic harms have captured the attention of lawmakers in the United States interested in developing comprehensive legal protections to address these challenges. Nascent reform efforts have largely coalesced around a legislative framework that centers algorithmic auditing - a periodic technical evaluation of algorithmic systems by private entities to determine whether the technology facilitates adverse impact.5 However, the challenge is that algorithmic auditing is a relatively new and largely unregulated field which relies on technical methods that often fail to holistically assess the full spectrum of racial justice implications of emerging technologies. In fact, in some instances, algorithmic auditing can reinforce racialized harms through the privatization of civil and human rights law enforcement.
   More specifically, audit-centric legislative reforms can be a poor fit to address a challenge that this paper will refer to as black coding. Black coding is a term I use to refer to any sociotechnical practice where algorithmic technologies are used to evade, displace, or erode existing civil and human rights protections in ways that structure racial hierarchy and social inequality. Black coding is present when public and private entities rely on algorithmic decision-making systems to determine the legal rights, status, and freedoms of marginalized communities in ways that contravene existing civil and human rights norms. In this sense, black coding fits within a broader historical pattern where the absence of rigorous public oversight and democratic governance facilitates the erosion of fundamental rights for Black people and other racially marginalized communities in the United States. Left unchecked, the cumulative effect of black coding can result in a collection of practices that effectively return racially marginalized communities to second-class citizenship in American life.
   This article proceeds in three parts. Part I provides a general overview of nascent legislative proposals designed by U.S. lawmakers to address algorithmic discrimination. Part II explores black coding in more detail by comparing these practices to the 19th-century Black Codes to demonstrate the insufficiencies of audit-centric legislative proposals. Part III concludes by outlining the following three guiding principles to aid policymakers in crafting solutions that address black coding, algorithmic discrimination, and other racialized algorithmic harms. First, emerging technologies used in the context of fundamental rights should be subject to agency preauthorization requirements. A preauthorization regime would require that covered entities receive public authorization prior to deployment of any algorithmic technology used in the context of fundamental rights and freedoms. Second, proposals should adopt global standards and best practices related to algorithmic fairness and antidiscrimination. Third, reparative justice and transformative justice should be integrated into legislative remedial schemes to address the systemic impact of racialized algorithmic harms.

NASCENT LEGISLATIVE RESPONSES TO ALGORITHMIC
DISCRIMINATION IN THE UNITED STATES
   In recent years, policymakers and advocates have led nascent


Footnotes
1. See, e.g., Julia Angwin et al., Machine Bias, ProPublica (23 May 2016), https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing; Manish Raghavan & Solon Barocas, Challenges for Mitigating Bias in Algorithmic Hiring, Brookings (6 December 2019), https://www.brookings.edu/research/challenges-for-mitigating-bias-in-algorithmic-hiring/; Kate Crawford et al., AI Now 2019 Report, AI Now Institute (December 2019), https://ainowinstitute.org/AINow_2019_Report.pdf.
2. See, e.g., Kesha Moore, Pretrial Justice Without Money Bail or Risk Assessments, Thurgood Marshall Institute (January 2022), https://tminstituteldf.org/wp-content/uploads/2022/01/TMIPretrialJusticeWithoutCBorRA2.pdf.
3. See, e.g., Data Capitalism and Algorithmic Racism, Demos (17 May 2021), https://www.demos.org/research/data-capitalism-and-algorithmic-racism; Automating Banishment: The Surveillance and Policing of Looted Land, STOP LAPD Spying Coalition (November 2021), https://automatingbanishment.org/; #NoTechForICE, NoTechForICE (last accessed March 1, 2022), https://notechforice.com/.
4. See, e.g., Safiya Umoja Noble, Algorithms of Oppression (NYU Press 2018); Ruha Benjamin, Race After Technology (Polity Press 2019); Rashida Richardson, Jason Schultz & Kate Crawford, Dirty Data, Bad Predictions: How Civil Rights Violations Impact Police Data, Predictive Policing Systems, and Justice, New York University Law Review, Vol. 94, No. 15 (May 2019).
5. Mona Sloane, The Algorithmic Auditing Trap, OneZero (17 March 2021), https://onezero.medium.com/the-algorithmic-auditing-trap-9a6f2d4d461d; Inioluwa Deborah Raji et al., Closing the AI Accountability Gap: Defining an End-to-End Framework for Internal Algorithmic Auditing, arXiv:2001.00973 [cs.CY] (3 January 2020).


10  Court Review - Volume 59


This article was originally published by Transatlantic Policy Quarterly, and is reprinted with its kind permission.
