Margaret Hu (Washington & Lee) on “Algorithmic Jim Crow,” Disparate Impact Relating to Security and Identification Vetting...

Margaret Hu, “Algorithmic Jim Crow,” 86 Fordham Law Review 633 (2017)

https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3071791

Margaret Hu (Washington & Lee)

This Article contends that current immigration- and security-related vetting protocols risk promulgating an algorithmically driven form of Jim Crow. Under the “separate but equal” discrimination of a historic Jim Crow regime, state laws required mandatory separation and discrimination on the front end, while purportedly establishing equality on the back end. In contrast, an Algorithmic Jim Crow regime allows for “equal but separate” discrimination. Under Algorithmic Jim Crow, equal vetting and database screening of all citizens and noncitizens will make it appear that fairness and equality principles are preserved on the front end. Algorithmic Jim Crow, however, will enable discrimination on the back end in the form of designing, interpreting, and acting upon vetting and screening systems in ways that result in a disparate impact.
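
To make this front-end/back-end asymmetry concrete, the sketch below (in Python, with entirely invented data and a made-up screening rule) applies one facially neutral test identically to every applicant, yet produces divergent flag rates because the feature it keys on correlates with group membership in the synthetic population. The closing lines compute the EEOC's “four-fifths” selection-rate ratio, a common heuristic for detecting disparate impact. Nothing here models any actual vetting system.

```python
# Purely hypothetical sketch: a facially neutral rule applied identically
# to everyone ("equal" on the front end) that nonetheless flags one group
# far more often ("separate" on the back end), because the feature it
# keys on correlates with group membership in the synthetic data.
from dataclasses import dataclass

@dataclass
class Applicant:
    group: str                     # protected class; never read by screen()
    trips_to_flagged_region: int   # facially neutral proxy feature (invented)

def screen(a: Applicant) -> bool:
    """One uniform rule for every applicant: flag if trips exceed two."""
    return a.trips_to_flagged_region > 2

# Synthetic population in which the proxy correlates with group membership.
pool = [Applicant("A", n) for n in (0, 1, 1, 0, 2)] + \
       [Applicant("B", n) for n in (3, 4, 1, 5, 3)]

flag_rates = {}
for g in ("A", "B"):
    members = [a for a in pool if a.group == g]
    flag_rates[g] = sum(screen(a) for a in members) / len(members)

# EEOC "four-fifths" heuristic: if the less-favored group's clearance rate
# is under 80% of the most-favored group's, that is treated as evidence of
# disparate impact even though screen() never looked at group membership.
pass_rates = {g: 1 - r for g, r in flag_rates.items()}
ratio = min(pass_rates.values()) / max(pass_rates.values())
print(flag_rates, f"four-fifths ratio = {ratio:.2f}")  # ratio is 0.20 here
```

The rule never inspects group membership, yet the outcome statistics diverge; that gap between uniform procedure and unequal result is the asymmetry the Article describes.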

Currently, security-related vetting protocols often begin with an algorithm-anchored technique of biometric identification--for example, the collection and database screening of scanned fingerprints and irises, digital photographs for facial recognition technology, and DNA. Immigration reform efforts, however, call for the biometric data collection of the entire citizenry in the United States to enhance border security efforts and to increase the accuracy of the algorithmic screening process. Newly developed big data vetting tools fuse biometric data with biographic data and internet and social media profiling to algorithmically assess risk.
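
As a rough illustration of the data-fusion step just described, the following hypothetical sketch merges a biometric match score with biographic and social media signals into a single weighted risk score. The field names, weights, and threshold are assumptions invented for this example, not any agency's actual model.

```python
# Rough illustration of fusing biometric, biographic, and social media
# signals into one risk score. Every field name, weight, and threshold
# here is an assumption invented for this sketch.
from typing import TypedDict

class FusedRecord(TypedDict):
    fingerprint_match: float      # 0..1 similarity against a biometric gallery
    watchlist_name_match: float   # 0..1 fuzzy match on biographic data
    social_media_signal: float    # 0..1 score from internet/social profiling

WEIGHTS = {
    "fingerprint_match": 0.5,
    "watchlist_name_match": 0.3,
    "social_media_signal": 0.2,
}
REVIEW_THRESHOLD = 0.6  # arbitrary cut line for "refer to manual review"

def risk_score(rec: FusedRecord) -> float:
    # Simple weighted linear fusion; deployed systems may be far more complex.
    return sum(WEIGHTS[k] * rec[k] for k in WEIGHTS)

record: FusedRecord = {
    "fingerprint_match": 0.9,
    "watchlist_name_match": 0.4,
    "social_media_signal": 0.7,
}
score = risk_score(record)
print(score, "flag for review" if score >= REVIEW_THRESHOLD else "clear")
```

Note that in a linear fusion like this, a strong score on any one channel (here, the biometric match) can push a person over the threshold even when the other signals are weak, which is part of why the design and weighting choices on the back end matter.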

This Article concludes that those individuals and groups disparately impacted by mandatory vetting and screening protocols will largely fall within traditional classifications--race, color, ethnicity, national origin, gender, and religion. Disparate-impact consequences may survive judicial review if based upon threat risk assessments, terroristic classifications, data-screening results deemed suspect, and characteristics establishing anomalous data and perceived foreignness or dangerousness data--nonprotected categories that fall outside of the current equal protection framework. Thus, Algorithmic Jim Crow will require an evolution of equality law.

INTRODUCTION

I. BIRTH OF ALGORITHMIC JIM CROW

II. OVERVIEW OF JIM CROW: CLASSIFICATION AND SCREENING SYSTEMS

A. Historical Framing of Jim Crow

B. Classification and Screening

C. Cyberarchitecture of Algorithmic Jim Crow

III. THEORETICAL EQUALITY UNDER ALGORITHMIC JIM CROW

A. Limitations of Equal Protection as a Legal Response to Algorithmic Jim Crow

B. No Fly List and Discrimination on the Back End of Vetting and Database Screening Protocols

IV. FUTURE OF ALGORITHMIC JIM CROW

A. Biometric Credentialing and Vetting Protocols: NASA v. Nelson

B. Delegating Vetting and Database Screening Protocols to States and Private Entities

C. Litigating Algorithmic Jim Crow

CONCLUSION


This Article aims to explain how big data vetting is mistakenly presented as a procedure restricted to noncitizens: immigrants, refugee and asylum applicants, and visitors seeking a travel visa to the United States. Instead, such vetting is part of a web of technologies that DHS has termed “identity management.” The application of these technologies may eventually extend to the entire citizenry through a variety of policy proposals, including a biometric national identification system and various mandatory vetting and database screening programs. Identity-management programs attempt to authenticate identity and assess risk factors across entire populations, including the U.S. citizenry. Big data vetting is thus misunderstood as a protocol likely to be limited to immigration-related screening. More accurately, such vetting includes an evolving form of big data surveillance that attempts to assess criminal and terroristic risk across entire populations and subpopulations through mass data collection, database screening and data fusion, artificial intelligence, and algorithm-driven predictive analytics…
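
The database-screening component of such identity management can be pictured with a small, purely hypothetical example: every record in a population, citizen or not, is fuzzily matched against a watchlist. The names and the match threshold below are invented; the sketch only illustrates how approximate matching applied population-wide inevitably produces false positives.

```python
# Hypothetical sketch of population-wide database screening: every record,
# citizen or noncitizen alike, is fuzzily matched against a watchlist.
# Names and the match threshold are invented for illustration only.
from difflib import SequenceMatcher

WATCHLIST = ["jon doe", "maria alvarez"]
POPULATION = ["john doe", "john dough", "mary alvarez", "wei chen"]

MATCH_THRESHOLD = 0.85  # arbitrary; lowering it widens the dragnet

def name_similarity(a: str, b: str) -> float:
    """Crude string similarity standing in for real record-linkage logic."""
    return SequenceMatcher(None, a, b).ratio()

for person in POPULATION:
    best, score = max(
        ((w, name_similarity(person, w)) for w in WATCHLIST),
        key=lambda hit: hit[1],
    )
    if score >= MATCH_THRESHOLD:
        print(f"{person!r} flagged against watchlist entry {best!r} ({score:.2f})")
```

Run as written, this flags "mary alvarez" against "maria alvarez": a near-match false positive, and a hint of how screening applied to everyone can sweep in the wrong people at population scale.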


Jim Crow laws utilized screening systems to enforce segregation based upon designated racial classification. This discussion explores why security threat assessments produced through algorithms and database screening may--similar to historic Jim Crow--also separate populations based upon particular classifications. New Algorithmic Jim Crow systems, like historic Jim Crow regimes, may present themselves as facially neutral…


...just as historic Jim Crow regimes delegated segregationist gatekeeping duties to state and private entities, contemporary immigration policy delegates restrictive immigration gatekeeping duties to state and private entities. Under Algorithmic Jim Crow, these technologically enabled gatekeeping duties involve race-neutral database screening of personally identifiable data and biometric data through federal vetting and screening protocols. The results, however, may not be race neutral, or may in fact have a disparate impact on traditionally protected classes…


This Article concludes that current algorithm-driven vetting and biometric-biographic identification screening, especially once deployed across the entire citizenry, will likely lead to discriminatory profiling and surveillance on the basis of suspicious data, as well as classification-based discrimination. These vetting and screening systems are likely to result in both direct and disparate-impact discrimination, particularly on the basis of race, color, ethnicity, national origin, and religion. In addition, recent immigration-control policy and programs demonstrate the government's interest in delegating immigration-vetting duties to private actors, such as employers, and to nonfederal actors, such as state and local law enforcement--delegations that can exacerbate racial profiling and discrimination.