Transparency
Explainability
Contestability
Transparency for 100

This is defined as “openness or access, with the goal of becoming informed about the system.”

What is transparency?

Explainability for 100

This is the response to limits of transparency—helping humans understand decisions, not just see them.

What is explainability?

Contestability for 100

This is defined as the ability to question, challenge, or dispute algorithmic decisions.

What is contestability?

Transparency for 200

This perspective informs users that a system exists and explains why it collects data.

What is the scope perspective?

Explainability for 200

This keeps humans “in the loop” as an interface between people and machines.

What is the role of explainability?

Contestability for 200

Contestability moves users from passive recipients to this role.

What are active participants?

Transparency for 300

This approach asks, “How was this decision reached?” by focusing on processes that turn inputs into outputs.

What is the decision-rules approach?

Explainability for 300

From Miller's note: explanations should be causal, contrastive, selective, and ______.

What is social?

Contestability for 300

The authors say this should be a built-in design feature, not just an after-the-fact appeal.

What is contestability by design?

Transparency for 400

These laws on the slide illustrate scope transparency for federal systems.

What are the Privacy Act of 1974 and the E-Government Act §208?

Explainability for 400

These explanations answer “Why this outcome instead of that one?” and are a better fit for classification.

What are contrastive explanations?

Contestability for 400

Contestability turns decision-making into a collaborative process between these two parties.

What are humans and machines?

Transparency for 500

This credit law’s “key reasons” requirement is your example of output transparency.

What is the Equal Credit Opportunity Act?

Explainability for 500

The authors warn that these can mislead because ML finds correlations, not true causes.

What are purely causal explanations?

Contestability for 500

Name these two design features: one lets users “trace predictions all the way down,” the other records user disagreements.

What are heightened legibility and mechanisms to capture challenges?


