Jobs
Education
Healthcare
Housing
Community Resources
100

AI rejects resumes with foreign-sounding names. Who is impacted?

Immigrant and Black job seekers face discrimination.

100

Automated grading penalizes non-standard English. Who is harmed?

Black students, immigrant students, and others using dialects or non-standard English.

100

AI underestimates pain in Black patients. Harm?

Delayed treatment, undertreatment, poorer health outcomes.

100

AI mortgage algorithms reject applications in majority-Black neighborhoods. Harm?

Limits access to homeownership, perpetuates segregation, reinforces wealth gaps.

100

AI funding allocation prioritizes wealthier neighborhoods. Who loses out?

Low-income and minority communities, perpetuating resource inequity.

200

AI ranks candidates lower based on tone or speech patterns. What bias is this?

Accent/language bias that disadvantages non-native speakers or minorities.

200

AI tutoring apps require monthly fees. Benefit and risk?

Benefit: extra learning support. Risk: excludes low-income students, widening the digital divide.

200

AI diagnoses skin conditions less accurately on darker skin. Consequences?

Misdiagnosis, delayed care, worsened health disparities.

200

AI flags tenants with low credit scores. Who is affected?

Low-income renters, immigrants, and historically marginalized groups.

200

AI assigns fewer doctors to immigrant-heavy areas. Harm?

Reduced access to healthcare, increased disparities.

300

Automation replaces warehouse jobs in low-income areas. Positive & negative effects?

Positive: greater efficiency. Negative: job loss for marginalized workers and economic instability.

300

Predictive analytics label students “at risk.” Impact on marginalized students?

Risk of stigmatization, tracking, and limited access to advanced opportunities.

300

Health apps only support English instructions. Who is excluded?

Non-English speakers, immigrants, and refugees.

300

Predictive policing impacts neighborhood housing stability. How?

Increased evictions, lower property values, fear of displacement.

300

Disaster relief AI prioritizes urban areas. Which marginalized communities are affected?

Rural or low-income areas get insufficient aid.

400

Remote AI interviews misjudge accents. How can companies fix this?

Human oversight, diverse training datasets, inclusive interview criteria.

400

AI tools rarely represent disabled students. Solutions?

Inclusive software design, accessibility features, universal design for learning.

400

Predictive models divert resources from rural hospitals. Impact?

Rural, low-income communities receive fewer services, increasing inequities.

400

Rental chatbots don’t accommodate non-English speakers. Consequences?

Exclusion from housing opportunities, language-based discrimination.

400

AI predicts library closures in “low-use” areas. Impact?

Reduced educational and informational resources for marginalized groups.

500

Predictive hiring algorithms favor men for STEM roles. Long-term consequences?

Gender inequality, wage gaps, fewer opportunities for women and minorities.

500

VR/AI tools enhance learning but aren’t accessible. Equity implications?

Wealthier students benefit; low-income or disabled students are excluded.

500

AI favors patients with prior health records. Equity concerns?

People without consistent access to healthcare are disadvantaged.

500

AI favors landlords with more tech access. Equity issues?

Small or low-tech landlords may lose business; tenants in low-income areas may be underserved.

500

AI assigns school funding based on predictive models. Equity implications?

Underfunded schools in marginalized communities; widening education gaps.