Digital and Data Colonialism
Resistance and Alternatives
Surveillance and Control
Bias and Discrimination
100

Collecting and using data from humans in order to train algorithms, all for profit and power

What is Algorithmic Colonialism?

100

Unlike the organization Tierra Comun, the organization Masakhane in Africa is aimed specifically at decolonizing AI systems. T/F

False

100

True or False: Manually reviewing video surveillance is easier than using an algorithm to review it.

What is false?

100

What is algorithmic bias?

Systematic errors or prejudices in algorithmic decision-making processes that result in unfair treatment of certain individuals or groups.

200

When organizations claim ownership of and privatize the data that is produced by their users and citizens

What is Data Colonialism?

200

Building diverse AI development teams leads to safer, more fair AI technology. T/F

True

200

Doorbell cameras and traffic light cameras are examples of

What is surveillance?

200

Which company discontinued an AI-driven recruiting tool?

Amazon discontinued it after discovering biased outcomes that disadvantaged female candidates

300

The control and exploitation of digital technologies by powerful entities over marginalized groups/regions

What is digital colonialism?

300

Because of its inherent biases, AI cannot be used as a decolonizing tool. T/F

False

300

An example of something a surveillance algorithm can do

What is (one of the following): facial recognition, detection of a particular movement or motion, detection of an object, or license plate recognition?
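As a concrete illustration of one of these capabilities, the sketch below does basic motion detection by comparing consecutive video frames. It assumes OpenCV (cv2) is installed and that "camera_feed.mp4" is a hypothetical input file; it is a minimal example, not how any particular surveillance product works.

```python
# Minimal frame-differencing motion detector (illustrative sketch only).
import cv2

cap = cv2.VideoCapture("camera_feed.mp4")   # hypothetical video file
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Pixels that changed between consecutive frames indicate motion.
    diff = cv2.absdiff(prev_gray, gray)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    if cv2.countNonZero(mask) > 5000:   # arbitrary sensitivity threshold
        print("Motion detected in this frame")
    prev_gray = gray

cap.release()
```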

300

How can biased training data contribute to algorithmic discrimination?

Biased training data can contribute to it by encoding historical biases and systemic inequalities, leading to unfair outcomes for certain groups.
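A minimal sketch of that mechanism, on entirely synthetic data: past hiring decisions that were biased against one group are used as training labels, and the resulting model reproduces the disparity for equally skilled candidates. It assumes numpy and scikit-learn; all numbers and group names are invented for illustration.

```python
# Toy illustration: historical bias in the labels becomes bias in the model.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
group = rng.integers(0, 2, n)      # 0 = group A, 1 = group B
skill = rng.normal(0, 1, n)        # skill distributed identically in both groups
# Historical labels: skilled candidates were hired, except that group B was
# also rejected half the time regardless of skill (the encoded bias).
hired = ((skill > 0) & ~((group == 1) & (rng.random(n) < 0.5))).astype(int)

model = LogisticRegression().fit(np.column_stack([skill, group]), hired)

# For equally skilled candidates, predictions now differ by group alone.
test_skill = np.full(500, 1.0)
for g in (0, 1):
    X_test = np.column_stack([test_skill, np.full(500, g)])
    p = model.predict_proba(X_test)[:, 1].mean()
    print(f"group {g}: mean predicted hire probability = {p:.2f}")
```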


400

These people were angered by the lack of specific consent and the complete disregard for native tradition

Havasupai Tribe

400

Organization based in Latin America that works as a major hub for meetings around decolonizing data.

Tierra Comun

400

A negative aspect of surveillance in call centers

What is (one of the following): too much control, intense pressure, or public shaming?

400

What is an example of algorithmic discrimination in hiring processes?

An example of it in hiring processes is when an AI recruiting tool systematically downgrades resumes containing terms associated with certain demographics, such as gender or ethnicity.

500

This broke Net Neutrality rules in India, resulting in its ban

What is Facebook's "Free Basics" program?

500

Organization based in Africa that is currently developing natural language processing capabilities in multiple low-resourced East African languages

Masakhane

500

Surveillance software can add this to an analysis of a call

What is positive or negative sentiment?
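For illustration, the sketch below tags call-transcript lines with positive or negative sentiment using NLTK's VADER analyzer. This is one possible approach, not the specific surveillance software referenced in the clue, and the example lines are invented.

```python
# Illustrative sentiment tagging of call-transcript snippets (sketch only).
import nltk
nltk.download("vader_lexicon", quiet=True)
from nltk.sentiment import SentimentIntensityAnalyzer

analyzer = SentimentIntensityAnalyzer()
transcript_lines = [
    "Thank you so much, that completely solved my problem!",
    "This is the third time I've called and nothing has been fixed.",
]
for line in transcript_lines:
    score = analyzer.polarity_scores(line)["compound"]   # -1 (negative) to +1 (positive)
    label = "positive" if score >= 0 else "negative"
    print(f"{label:>8}  {score:+.2f}  {line}")
```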

500

How can organizations address algorithmic bias?

Organizations can address it by improving data quality, promoting diversity in algorithmic development teams, and regularly auditing algorithms for bias.
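One such auditing step can be as simple as comparing an algorithm's selection rates across demographic groups (a demographic-parity check). The sketch below assumes numpy, and the predictions and group labels are invented purely for illustration.

```python
# Minimal bias-audit sketch: selection rate per demographic group.
import numpy as np

predictions = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 0])   # hypothetical model decisions
groups      = np.array(["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"])

rates = {g: predictions[groups == g].mean() for g in np.unique(groups)}
print("selection rate per group:", rates)
print("disparity (max - min):", max(rates.values()) - min(rates.values()))
# A large gap flags the model for closer review, e.g. of its training data.
```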
