S.T.A.N.D.
Upstanding
Online Harm
100

Use S.T.A.N.D. in a sentence.

Example: I used the S.T.A.N.D. method to support my friends.

100

What is the definition of Upstander?

Someone who takes positive action when witnessing harmful behavior.

100

What is the definition of online harm?

A negative experience caused by someone through technology that affects someone's safety, well-being, dignity, or privacy.

200

What do the T and A stand for in S.T.A.N.D.?

Take time to listen and Alert trusted adults

200

What should you do if your friend is getting beaten up or bullied?

Stand up for them or defend them.

200

What is a way people are harmed online?

Hurtful or threatening messages, anonymous bullying, trolling, etc.

300

What do the A, N, and D stand for in S.T.A.N.D.?

Alert trusted adults, Note evidence (document), and Defend their privacy

300

How can you safely speak up or act when you see someone being treated unfairly?

You can report it to a teacher or trusted adult, or you can step in yourself if you feel safe doing so.

300

How can you stand up for friends affected by online harm?

Report accounts, comfort your friends, and seek help.

400

What does the S in S.T.A.N.D. stand for?

Stay connected

400

What does 'upstander' behavior look like in our school?

Reporting to teachers (anonymously or not), standing up for friends, and helping people who are being bullied.

400

What are three specific actions a student should take if they are being harassed by a peer on a social media platform, and why should they document these actions?

  1. Block the perpetrator: This stops further direct communication from that user.
  2. Report the behavior: Utilize the platform’s built-in reporting tools to alert moderators.
  3. Document evidence: Take screenshots of harmful messages, posts, or images.

  • Why document: Saving evidence is crucial because it can be shown to parents, school authorities, or even the police, and it preserves a record if the perpetrator deletes the original posts.

500

Use two of the S.T.A.N.D. methods in a sentence.

Answers should include any two of: note evidence, stay connected, take time to listen, defend privacy, or alert trusted adults.

500

You see your close friend (or a popular peer group) bullying someone. If you speak up, you know your friend will turn on you and you will become the next target. How do you act as an upstander without simply becoming the next victim, and is it worth the personal cost?

  • Do not join in: Start by refusing to laugh, applaud, or spread the rumors, which immediately reduces the "audience" effect.
  • Utilize "Distraction" (one of the 3 D's): Interrupt the situation without directly accusing the bully. For example, ask the victim to help you with a task, change the subject, or create a diversion that ends the bullying moment.
  • Support the Victim Privately: If confronting the bully is too risky, pull the victim aside afterward to offer support, affirmation, and solidarity. This counters the isolation the bully intended.
  • Delegate: Report the behavior to a trusted authority (a teacher or other adult) if the situation is dangerous or persistent, using anonymous reporting if necessary.

500

How can social media platforms implement effective, automated, 24/7 content moderation to eliminate harmful content (such as cyberbullying, self-harm encouragement, or extremist propaganda) without violating user privacy, censoring legitimate speech, or restricting the right to seek information?

  • Process over Content: Instead of just trying to ban specific words, platforms should focus on how content is amplified or restricted by algorithms.
  • Human-in-the-Loop AI: Automated AI tools should be used for initial, rapid detection (for speed), but complex decisions and contextual nuances must be reviewed by trained human moderators to avoid over-censorship.
  • Contextual Moderation: Algorithms must be trained to differentiate between harmful content and discussions about that content (e.g., educational content about eating disorders vs. content promoting them).
  • Age-Appropriate Design: Rather than just scanning everything, platforms must design environments that default to high privacy for minors, reducing the visibility of potential harms to children without restricting adult users.
  • Transparency and Accountability: Platforms must be transparent about their moderation policies and provide users with mechanisms to appeal decisions, ensuring they do not become arbitrary censors.