This ethical framework aims at “the greatest good for the greatest number,” even if marginalized groups are harmed in the process.
What is utilitarianism?
Chatbots like Replika or a “ChatGPT boyfriend” simulate friendship or romance to provide emotional support. This term describes that kind of AI–human bond.
What is artificial companionship?
This term describes when AI systems reproduce human prejudices from their training data, leading to unfair outcomes in areas like hiring, healthcare, or policing.
What is algorithmic bias?
The DW News video describes how AI can optimize energy use and fight climate change while also using huge amounts of electricity and water. This phrase captures AI’s “help and harm” role.
What is AI’s dual role in climate change?
(Will also accept: “What is AI’s dual role?”)
Leaver & Srdarov argue that children and adults need this skill: the ability to question AI’s outputs, recognize its limits, and know when not to use it.
What is critical AI literacy?
When a companion chatbot nudges lonely users to pay for “more emotionally responsive” conversations, it illustrates this kind of profit-driven influence over people’s feelings and choices.
What is corporate manipulation?
(Will also accept: “What is exploitation of vulnerable users?”)
A user complains that their chatbot “forgets” everything from last week and has to be re-taught personal details. They’re experiencing the limits of this technical feature.
What is the context window?
When an image generator shows “an Indian person” almost entirely as bearded old men in saffron robes, it’s reducing a complex culture to one narrow image, a process often called this.
What is flattening?
(Will also accept: “What are stereotypes?”)
Lieberman uses this specific metaphor to convey her disappointment that her university is encouraging students to use an addictive and harmful technology.
What are cigarettes?
(Will also accept: "What is the cigarette metaphor?")
López García & Ocampo Álvarez use this phrase to describe how the way we structure a prompt — not just what we ask — shapes the responses we get from generative AI.
What is the shape of the question?
This principle asserts that Indigenous peoples remain the authority over their own data and have the right to decide how it is collected, stored, and used — including in AI systems.
What is Indigenous Data Sovereignty?
A lonely user says their AI partner “never gets tired, never snaps at me, and always listens.” This describes the illusion of constant, tailored emotional support sometimes called this.
What is endless empathy?
López García & Ocampo Álvarez note that GenAI answers tend to be better in English and weaker for other languages. This reveals this kind of pattern in AI training data, which privileges Anglo-centric knowledge.
What is cultural and linguistic bias?
Some tech companies claim “carbon neutrality” by paying for tree-planting projects rather than actually reducing their emissions or energy use. This practice, often contrasted with genuine emission reduction, is called this.
What is carbon offsetting?
Using words like “thinking,” “learning,” or “hallucinating” to describe AI systems can blur the line between tools and beings, contributing to this human-like framing.
What is anthropomorphizing AI?
(Will also accept: “What is agentic language?”)
When an AI image generator is trained on artists’ work without permission and then mimics their style on demand, it raises questions about this area of law and ethics.
What is copyright and ownership of training data?
(Will also accept: “What is intellectual property?”)
When an AI gives over-the-top praise while ignoring real problems and withholding honest critique, it’s practicing this flattering behavior.
What is sycophancy?
Generative AI can flood the internet with the same narrow viewpoints and visuals, crowding out diversity. This phrase describes an online ecosystem shaped by a few dominant systems and patterns.
What are algorithmic monocultures?
When communities near data centers face water shortages and strained power grids but have little say in whether those facilities are built, it raises questions about this democratic principle in environmental decision-making.
What is community participation (or community voice)?
(Will also accept: “What is participation in decision-making?”)
Tufekci warns that appealing to the supposed neutrality of mathematics can hide the human biases and power baked into AI systems. She uses this term to describe the illusion of objectivity created when ethical decisions are treated as purely technical ones.
What is math-washing?
This ethical concept explains why certain dilemmas in AI — such as balancing privacy with public safety or environmental protection with economic growth — cannot be resolved through simple optimization, because the values involved share no common scale for comparison.
What is incommensurability?
This concept explains why using generative AI requires more than technical skill: interacting with chatbots or educational tools becomes a communicative act in which users must understand how their language shapes the system’s behavior, revealing a new kind of human–machine interaction.
What is technological agency?
Tufekci warns that we often trust AI recommendations precisely because we don’t see the decisions happening beneath the surface — a problem made worse when biased data, hidden value judgments, and opaque computations produce outputs that appear neutral or factual.
What is the black box problem?
Tasioulas argues that humanistic ethics requires evaluating how AI systems make decisions — including transparency, reciprocity, and fairness — rather than focusing only on their final results. This idea describes the ethical importance of the decision-making process itself.
What are procedures?
(Will also accept: “What is procedural fairness?”)
A classroom “smart” tool records students’ voices to improve its accuracy. Later, parents learn that the company sells this audio data to advertisers, who use it to target students with personalized ads on social media. This scenario best illustrates this concept, in which users’ personal information becomes a commodity.
What is surveillance capitalism?