Representation in Tech
Digital Divide
Algorithmic Unfairness
Predictive Policing
Biased Hiring Tools
100

This gender makes up only about 25% of the tech workforce.

What are women?

100

This term refers to the gap between people who have internet access and those who don't.

What is the digital divide?

100

This happens when AI reinforces racial or gender stereotypes.

What is algorithmic bias?

100

This is the name for AI tools used to predict where crime may happen.

What is predictive policing?

100

AI hiring tools analyze this type of document first.

What is a résumé?

200

This racial group holds about 5% of U.S. tech jobs despite being roughly 14% of the U.S. population.

Who are Black Americans?

200

This U.S. population is most likely to lack broadband — rural or urban?

What are rural communities?

200

Facial recognition AI struggles most with this type of skin tone.

What are darker skin tones?

200

Predictive policing has been criticized for over-targeting this racial group.

Who are Black Americans?

200

Amazon scrapped its AI hiring tool because it was biased against whom?

Who are women?

300

These two racial groups combined make up less than 20% of major Silicon Valley tech employment.

Who are Black and Latino workers?

300

This 2020 global crisis exposed major digital access inequality for students.

What is the COVID-19 pandemic?

300

This popular AI technology was exposed for misidentifying Black people as criminal suspects.

What is facial recognition?

300

This U.S. city adopted and later abandoned predictive policing after public backlash.

What is Chicago?

300

AI often unfairly prioritizes résumés from applicants who graduated from these top-tier schools.

What are Ivy League universities?

400

This company was criticized for having only 4% Black and Latino employees in 2023.

What is Google?

400

This continent has the lowest rate of internet access in the world.

What is Africa?

400

This AI hiring tool, built by Amazon, rejected résumés containing the word "women's."

What is Amazon’s AI recruiting tool?

400

This major civil rights organization called predictive policing “a digital stop-and-frisk.”

What is the ACLU?

400

AI hiring tools can unintentionally reinforce this pattern: hiring people who look just like current employees.

What is homogenous hiring / mirror bias?

500

This Silicon Valley phrase refers to young white/Asian male dominance in tech culture.

What is “brogrammer culture”?

500

This extremely expensive type of internet is often the only option in remote areas.

What is satellite internet?

500

The root cause of most AI bias — biased ______.

What is biased data?

500

The name of the AI risk-assessment program used in Florida courts that was found to be racially biased against Black defendants.

What is COMPAS?

500

This is the ethical process of fixing AI hiring discrimination before deployment.

What is algorithmic auditing / bias auditing?
