Context
Case Studies
AI and Bias
Author
Surveillance
100

An institution where impoverished individuals were housed and required to work under harsh conditions.

What is a poorhouse?

100

The city where the Coordinated Entry System ranks homeless individuals for housing.

What is Los Angeles?

100

These tools are often assumed to be neutral, but Eubanks shows they’re not.

What are algorithms?

100

Author of Automating Inequality.

Who is Virginia Eubanks?

100

Eubanks says this must be part of the design process if we want truly fair systems.

What is lived experience, or the voices of impacted communities?

200

The term Eubanks uses to describe modern systems that control the poor.

What is the digital poorhouse?

200

The U.S. state where an automated welfare eligibility system caused mass denials of benefits due to technical errors and lack of human oversight.

What is Indiana?

200

The term for automated suspicion or scrutiny based on risk scores.

What is red-flagging?

200

Eubanks’ profession.

What is a political scientist?

200

Automated decisions must remain subject to this human check if systems are to be fair.

What is human oversight?

300

This label was used to deny benefits in Indiana, even for simple application errors.

What is "failure to cooperate"?

300

The name of the county where child maltreatment risk scores are used to help screen hotline calls.

What is Allegheny County?

300

A key flaw in using historical data to train decision-making systems.

What is bias replication?

300

Her call to action: systems should be designed with this value in mind.

What is human dignity?

300

Government agencies use this kind of card to monitor where benefits are spent.

What is an EBT card?

400

The historical movement that used data, “casework,” and eugenics to manage the poor.

What is scientific charity?

400

This experiment in Indiana replaced in-person caseworkers with a privatized call center system.

What is the 2006 eligibility modernization project?

400

This group is most affected by algorithmic surveillance in public service systems.

Who are low-income or marginalized communities?

400

Eubanks argues AI doesn’t fix inequality—it does this instead.

What is automating it?

400

According to Eubanks, automation in welfare systems increases surveillance while removing this.

What is human accountability?

500

The concept that emerged in Chapter 5 to describe the invisible infrastructure of modern poverty control.

What is the digital poorhouse?

500

This error-prone approach resulted in over 1 million benefit denials in Indiana from 2007 to 2009.

What is automated eligibility processing?

500

A key principle Eubanks says we must demand from AI in social programs.

What is transparency?

500

This personal experience sparked Eubanks’ urgency to explore digital injustice.

What is her partner being attacked and then denied insurance coverage due to algorithmic red-flagging?

500

Eubanks says the poor are not only over-surveilled but also lack this.

What is the right to privacy or agency?