
Race, Measurement, and Algorithmic Bias

Paper Session

Friday, Jan. 7, 2022 12:15 PM - 2:15 PM (EST)

Hosted By: American Economic Association & Committee on the Status of Minority Groups in the Economics Profession
  • Chair: Jose Manuel Fernandez, University of Louisville

Algorithmic Bias on Social Media

Amanda Agan, Rutgers University
Diag Davenport, University of Chicago
Jens Ludwig, University of Chicago
Sendhil Mullainathan, University of Chicago

Abstract

On most social media platforms, people are overwhelmed with content. Recommender systems curate feeds, allowing users to prioritize attention to the items they are most likely to like; this in turn affects whom users engage with, whose voices are heard most, and ultimately how networks evolve. These algorithmic rankings are created by training on users' own behavior, such as what they click or dwell on. As a result, implicit in every recommender system is the equation of user behavior with user preference. When it comes to discrimination and prejudice, however, a large body of psychology research shows that people often behave contrary to their preferences. For example, people show the greatest implicit bias in automatic, quick choices made without much consideration, and such choices describe much of the click behavior these algorithms are trained on. We present a model of recommender systems in which users have implicit biases against some groups: relative to their own preferences, they show additional bias when choices are more automatic. In this context, algorithmic ranking (relative to random or time-based ranking) does not just reflect user bias but magnifies it. That is, introducing the algorithm results in even more biased behavior than users would exhibit on their own. Empirically, we present data from Facebook ranking algorithms. We find, in fact, sizable ingroup bias: favoritism toward one's own racial group relative to one's own preferences. These biases appear for rankings trained on more automatic behaviors but disappear for those trained on less automatic behavior.
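The amplification mechanism the abstract describes can be illustrated with a minimal sketch (not the authors' model or data; all names and numbers here are hypothetical). A user values ingroup and outgroup content identically, but automatic clicks carry extra ingroup favoritism; a recommender trained on those clicks then ranks ingroup content first, baking the behavioral bias into the feed.

```python
import random

random.seed(0)

# Toy setup: content comes from an "ingroup" or "outgroup" creator.
# The user's deliberate preference is identical across groups, but
# quick, automatic clicks carry extra ingroup favoritism.
TRUE_VALUE = 0.5          # deliberate preference, same for both groups
IMPLICIT_BIAS = 0.2       # extra ingroup favoritism in automatic clicks

def click_prob(group):
    """Probability of an automatic click on an item from `group`."""
    return TRUE_VALUE + (IMPLICIT_BIAS if group == "ingroup" else 0.0)

# Phase 1: the platform logs automatic clicks on a random (unranked) feed.
items = ["ingroup", "outgroup"] * 5000
shown = {"ingroup": 0, "outgroup": 0}
clicks = {"ingroup": 0, "outgroup": 0}
for g in items:
    shown[g] += 1
    if random.random() < click_prob(g):
        clicks[g] += 1

# Phase 2: the recommender ranks by estimated click rate, so the
# ingroup's behavioral advantage now determines every future feed,
# even though the user's true valuation of the two groups is equal.
est_rate = {g: clicks[g] / shown[g] for g in clicks}
feed = sorted(est_rate, key=est_rate.get, reverse=True)

print(est_rate)   # ingroup rate near 0.7, outgroup near 0.5
print(feed[0])    # ingroup content ranked first
```

Under a random or time-based ranking, both groups get equal exposure; once the click-trained ranking is introduced, exposure tilts toward the ingroup, which is the sense in which the algorithm magnifies rather than merely reflects the user's implicit bias.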

Measuring Declines in Disparity Gaps, with an Application to Health Insurance

Paul Goldsmith-Pinkham, Yale University
Karen Jiang, Yale University
Zirui Song, Harvard University
Jacob Wallace, Yale University

Abstract

We propose a method for reporting how program evaluations reduce gaps between groups, such as the gender or Black-white gap. We first show that the reduction in disparities between groups can be written as the appropriately weighted difference in conditional average treatment effects (CATEs) for each group. Then, using a Kitagawa-Oaxaca-Blinder-style decomposition, we highlight that these CATEs can be decomposed into a component driven by other observables (e.g., the "endowment" difference) and unexplained differences between groups (e.g., the "discrimination" difference). We argue that reporting the share driven by each component can be an important summary statistic for researchers interested in understanding group differences, since it separates the CATEs into factors that are causally manipulable (income, education, etc.) and those that are not (race, gender). Finally, we apply this approach to study the impact of Medicare on Americans' access to health insurance.
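The accounting identity at the heart of the abstract, that the change in a group gap equals the (appropriately weighted) difference in group CATEs, can be sketched in a few lines. This is illustrative only, not the authors' code; all numbers, including the Oaxaca-Blinder-style "explained" share, are made up for the example.

```python
# Hypothetical group-level conditional average treatment effects
cate = {"group_a": 0.30, "group_b": 0.10}

# Hypothetical pre-treatment mean outcomes (group_a is disadvantaged)
baseline = {"group_a": 0.50, "group_b": 0.80}

gap_before = baseline["group_b"] - baseline["group_a"]
gap_after = (baseline["group_b"] + cate["group_b"]) - (
    baseline["group_a"] + cate["group_a"])

# With equal group weights, the reduction in the disparity gap is
# exactly the difference in the two groups' CATEs.
gap_reduction = gap_before - gap_after
assert abs(gap_reduction - (cate["group_a"] - cate["group_b"])) < 1e-12

# Oaxaca-Blinder-style split of the reduction into an "endowment"
# component (driven by observables like income or education) and an
# unexplained residual; the 60% share is purely illustrative.
explained_share = 0.6
endowment_part = explained_share * gap_reduction
unexplained_part = gap_reduction - endowment_part

print(round(gap_reduction, 3))   # total narrowing of the gap
print(round(endowment_part, 3), round(unexplained_part, 3))
```

In the paper's framework the two reported shares would come from the decomposition of each group's CATE on observables, rather than an assumed constant as here.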

Modelling Discrimination, Job Competition, and Race: Locomotive Firemen, Technological Advance, and the Railroad Industry, 1880–1950

William E. Spriggs, Howard University and AFL-CIO

Abstract

The dominant model of discrimination assumes either barriers to entry based on pre-market factors, such as schooling or distance to job locations, or discrimination within the market itself, viewed as customer-, owner-, or worker-based. But the case of locomotive firemen in the late 19th and early 20th centuries, as the importance of the railroad grew, presents a more complex model of race and labor market discrimination. In the US South, Black workers played a dominant role because the job of fireman on a steam locomotive was dirty and dangerous and was a servant role to the locomotive's engineer. Their numbers were too large for white workers, who sought to exclude Blacks, to persuade white railroad owners to agree to their exclusion. Outside the South, however, Blacks were effectively barred from the job. This paper explores this complex setting and shows its relevance to understanding the effects of discrimination.

Natural Born Criminals: Eugenic Beliefs and the History of Risk Assessment

Robynn Cox, University of Southern California
Megan Stevenson, University of Virginia

Abstract

This paper provides a brief history of actuarial risk assessments in the criminal legal system (CLS) in order to understand how this history influences their current use. We begin with an overview of preventive confinement, then discuss the eugenic roots of risk assessment, followed by a history of risk assessments in the CLS, and conclude with final thoughts.

Discussant(s)
Hunt Allcott, Microsoft Research
Augustine Denteh, Tulane University
Robert Margo, Boston University
Morgan Williams Jr., Barnard College
JEL Classifications
  • J1 - Demographic Economics
  • J7 - Labor Discrimination