Virginia Eubanks 

We created poverty. Algorithms won’t make that go away

In the absence of political will to tackle America’s growing economic crisis, hi-tech tools can only serve to automate and amplify existing inequalities
  
  

In this 28 October 2017 file photo, a homeless man takes food from a trash can in Los Angeles’ Skid Row area, home to the nation’s largest concentration of homeless people. Photograph: Jae C Hong/AP

We live in a climate of austerity. In the past few months alone, we have seen a federal budget that proposes rolling back support for low-income housing, an executive order attacking welfare programs, and a plan to create a nationwide electronic registry of poor and working-class people. In the context of shrinking support, which families are able to access their basic human needs – housing, food, and healthcare – and which are not? Increasingly it is algorithms – not humans – making that call.

Since 2010, I’ve crossed the country studying and writing about the impact of hi-tech tools on public service programs. In Indiana, I investigated an attempt to automate and privatize the state’s welfare eligibility processes. In Los Angeles county, I explored the coordinated entry system, a digital tool intended to match the most vulnerable unhoused people with the most appropriate available resources. And in Allegheny county, Pennsylvania, I studied a statistical model that is supposed to be able to predict which children might be victims of abuse or neglect in the future.

In each place I visited, policymakers, data scientists, and social workers told a remarkably consistent story: there is extraordinary need for public programs and not enough help to go around. The goal of automated decision-making, they told me, is to distribute limited resources more equitably, but also to help make the heartbreaking choices about which of the most exploited and marginalized people in the United States will get help.

“We have extraordinary need [for housing] and can’t meet all of that need at once,” said Molly Rysman, the housing and homelessness deputy for Los Angeles county’s District 3, speaking in 2015 about coordinated entry. “So you’ve got to figure out: how do we get folks who are going to bleed to death access to a doctor, and folks who have the flu to wait? It’s unfortunate to have to do that, but it is the reality of what we’re stuck with.”

The result has been an explosion of digital tools for managing poverty – and for alleviating the uncomfortable feeling that we’re not doing enough to address economic suffering. Automated eligibility systems remove discretion from frontline caseworkers and replace welfare offices with online forms and privatized call centers. What seems like an effort to lower program barriers and remove human bias often has the opposite effect, blocking hundreds of thousands of people from receiving the services they deserve. In Indiana, for example, Omega Young of Evansville lost her Medicaid coverage when she failed to attend a telephone recertification appointment. She missed the call because she was in the hospital suffering from terminal cancer.

Algorithms act as moral thermometers, sifting survey data to rank unhoused people based on their perceived vulnerability. In the best-case scenario, this ensures that those most in need receive help more quickly. But because low-income housing is scarce, creating a spectrum of “deservingness” often means prioritizing those whose services are most cost-effective.

Predictive models use statistics to forecast which parents might maltreat their children. But the data that serves as their foundation is collected only on families that use public programs, leading to hi-tech risk detection systems that confuse parenting while poor with poor parenting. In Pittsburgh, the new Allegheny Family Screening Tool weighs 131 different variables available in the department of human services’ public service data warehouse – including whether a family receives Snap, support for depression, or county medical assistance – to decide which calls to the county’s abuse and neglect hotline should be screened in for child welfare investigation.

In other words, we are increasingly turning to digital tools to rank and rate which struggling families most deserve support. The trouble with this practice of hi-tech triage is that it treats social problems as if they are natural disasters – random, temporary, inevitable occurrences – obscuring the political choices that produce them. Take the housing crisis I witnessed firsthand during my reporting in Skid Row and South Los Angeles. There are 58,000 unhoused people in Los Angeles county alone – more than in Sweden, Norway, Denmark, Finland and Iceland combined. The county’s unhoused population has risen every year since 2014; last year, it increased by 23%.

The housing crisis in Los Angeles was not – and is not – random, temporary or inevitable. In the 1950s, opponents blocked a plan to build 10,000 units of integrated low-income housing by reporting the housing authority to the House un-American activities committee. The attempt to shelter poor and working families was lambasted as an act of communism. In the 1960s, the city’s master plan demolished “blighted” neighborhoods without building replacement housing.

Homelessness is the product of austere policy. It is caused by an inadequate supply of affordable housing, combined with stagnating incomes and a miserly public service system, which devotes far too many resources to diverting those in need from the assistance they are legally entitled to.

The good news about our collective responsibility for America’s housing crisis is this: if we created this catastrophe, we can fix it. Two simple, but politically challenging, remedies could go a long way towards stemming the growing tide of housing insecurity.

First, those of us who own homes could agree to forgo our mortgage interest tax deduction. Sociologist Matthew Desmond points out that as a nation we spent about $41bn in 2016 on housing supports for the needy. Yet we spent more than four times as much – $171bn – on homeowner tax subsidies that primarily benefit families making more than $100,000 a year. Pledging that money instead to creating and stabilizing low-income housing could solve homelessness in a single year.

Second, we could lower barriers to receiving social services and increase payments for TANF, Snap, the EITC and other means-tested benefits. Stagnant public benefit levels leave families with few options when they face a health emergency, housing crisis or work interruption. In Los Angeles, general relief payments have been stuck at $221 a month since 1982. In 2015, the Los Angeles Times reported that 13,000 people on public assistance fall into homelessness in the county every month, simply because their benefits have not kept pace with the cost of living.

In the absence of political will to tackle America’s growing economic crisis, hi-tech tools can only serve to automate and amplify existing inequalities. Yet we are doubling down on austerity instead of facing the root causes of extreme poverty head-on. Austerity fever requires a dedication to cutting the fat from already starving programs. Its systems engineering approaches replace our full panoply of social values – dignity, self-determination, equity, due process – with the twin imperatives of efficiency and cost savings.

The triage narrative absolves us of responsibility for our country’s unnatural disasters – the tragedies of malnutrition, homelessness, preventable death and family dissolution – by reinforcing the myth that extreme poverty is simply a fact of life: something to be managed rather than eradicated.

As the politics of austerity expands, automated decision-making systems act as empathy overrides, outsourcing inhuman choices about who survives and thrives, and who doesn’t. We empower machines to make these decisions because they are too difficult for us, because we know better. We know that there is no ethical way to prioritize one life over the next.

  • Virginia Eubanks is the author of Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor
 
