AI rejects resumes with foreign-sounding names. Who is impacted?
Immigrant and Black job seekers face discrimination.
Automated grading penalizes non-standard English. Who is harmed?
Black students, immigrant students, and others using dialects or non-standard English.
AI underestimates pain in Black patients. Harm?
Delayed treatment, undertreatment, poorer health outcomes.
AI mortgage algorithms reject applications in majority-Black neighborhoods. Harm?
Limits access to homeownership, perpetuates segregation, reinforces wealth gaps.
AI funding allocation prioritizes wealthier neighborhoods. Who loses out?
Low-income and minority communities, perpetuating resource inequity.
AI ranks candidates lower based on tone or speech patterns. What bias is this?
Accent/language bias that disadvantages non-native speakers or minorities.
AI tutoring apps require monthly fees. Benefit and risk?
+ Extra learning support; - Excludes low-income students, widening the digital divide.
AI diagnoses skin conditions less accurately on darker skin. Consequences?
Misdiagnosis, delayed care, worsened health disparities.
AI flags tenants with low credit scores. Who is affected?
Low-income renters, immigrants, and historically marginalized groups.
AI assigns fewer doctors to immigrant-heavy areas. Harm?
Reduced access to healthcare, increased disparities.
Automation replaces warehouse jobs in low-income areas. Positive & negative effects?
+ Efficiency; - Job loss for marginalized workers and economic instability.
Predictive analytics label students “at risk.” Impact on marginalized students?
Risk of stigmatization, tracking, and limited access to advanced opportunities.
Health apps offer instructions only in English. Who is excluded?
Non-English speakers, immigrants, and refugees.
Predictive policing impacts neighborhood housing stability. How?
Increased evictions, lower property values, fear of displacement.
Disaster relief AI prioritizes urban areas. Which marginalized communities are affected?
Rural or low-income areas get insufficient aid.
Remote AI interviews misjudge accents. How can companies fix this?
Human oversight, diverse training datasets, inclusive interview criteria.
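A concrete way to apply that human oversight is to audit the screener's outcomes by group. The Python sketch below is only an illustration with made-up data: it computes the selection rate per demographic group and flags the tool for review when the lowest rate falls below four-fifths of the highest (a common adverse-impact heuristic).

```python
# Hypothetical illustration: a simple adverse-impact check a company might run
# when reviewing an AI interview screener. Group labels and outcomes are
# made-up example data, not real figures.
from collections import defaultdict

# Each record: (candidate's demographic group, whether the AI advanced them)
screening_results = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

totals = defaultdict(int)
advanced = defaultdict(int)
for group, passed in screening_results:
    totals[group] += 1
    advanced[group] += int(passed)

# Selection rate per group, and the ratio of the lowest to the highest rate.
# A ratio below 0.8 (the "four-fifths rule" heuristic) flags the tool for review.
rates = {group: advanced[group] / totals[group] for group in totals}
ratio = min(rates.values()) / max(rates.values())
print(rates, f"impact ratio = {ratio:.2f}", "REVIEW" if ratio < 0.8 else "OK")
```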
AI tools are rarely designed with disabled students in mind. Solutions?
Inclusive software design, accessibility features, universal design for learning.
Predictive models divert resources from rural hospitals. Impact?
Rural, low-income communities get fewer services, increased inequities.
Rental chatbots don’t accommodate non-English speakers. Consequences?
Exclusion from housing opportunities, language-based discrimination.
AI predicts library closures in “low-use” areas. Impact?
Reduced educational and informational resources for marginalized groups.
Predictive hiring algorithms favor men for STEM roles. Long-term consequences?
Gender inequality, wage gaps, fewer opportunities for women and minorities.
VR/AI tools enhance learning but aren’t accessible. Equity implications?
Wealthier students benefit; low-income or disabled students are excluded.
AI favors patients with prior health records. Equity concerns?
People without consistent access to healthcare are disadvantaged.
AI favors landlords with more tech access. Equity issues?
Small or low-tech landlords may lose business, and tenants in low-income areas are also affected.
AI assigns school funding based on predictive models. Equity implications?
Schools in marginalized communities remain underfunded, widening education gaps.