
How AI bias impacts our lives: Lesson plan

Common Sense Education

GRADES 6–12
20 minutes
Just like people, artificial intelligence tools can make mistakes. That could mean simply mistaking tomatoes for apples. But the impacts of AI bias can often be much more serious. AI tools can end up recirculating harmful stereotypes and inequities within society. In this lesson, students will think critically about AI bias and how it affects the world.
[Image: A robot next to a pair of justice scales]

Objectives

  • Understand that AI bias can impact people in different ways.
  • Reflect on the negative impacts of AI bias.

Vocabulary

  • AI bias – when an AI tool makes a decision that is wrong or problematic because it learned from training data that didn't treat all people, places, and things accurately
  • training data – the information given to an AI to help it learn how to do specific tasks

What you'll need

  • Lesson slides
  • Trick-or-Treat AI student handout (and the Teacher Version of the handout)

Step by step

  1. Ask: What is AI bias? How does it happen? (Slide 4)
  2. Project Slides 5–6 and explain that AI bias happens when an AI tool makes a decision that is wrong or problematic because it learned from training data that didn't treat all people, places, and things accurately. Explain that training data is the information given to an AI to help it learn how to do a specific task.
  3. Say: AI bias impacts how reliable, fair, and trustworthy AI tools are. It can also affect individuals or groups of people, even if they didn't choose to use the tool in the first place. Let's take a look at a scenario (Slide 7).
  4. Distribute the Trick-or-Treat AI student handout and read through the scenario (Slides 8–9).
  5. Explain to students that the candy dispenser is operating with AI bias, and that they are going to have a chance to reflect on how that AI bias is impacting others.
Ask students to complete the questions with a partner, and then invite students to share out. Use Slides 10–18 to guide the discussion, or refer to the Teacher Version of the handout.
  6. Project Slide 19 and have students reflect on what people can do if they notice or experience AI bias.
     • Once Ms. Igwe learns about this issue, what should she do?
     • What can the creators of the candy dispenser do to make their product fairer for everyone?
  7. Say: Reporting AI bias can help companies improve their products. In this case, the creators of the candy dispenser could add more training data so that the dispenser can recognize as many different kinds of Halloween costumes as possible (Slide 20). (For classes with coding experience, the optional sketch after the step list illustrates this idea.)
  8. Ask: AI is trained on real-world data that people give it, and if that data contains biases (or is incomplete), the AI can end up being biased too. What are some of the negative impacts and consequences of AI bias? (Slide 21)
Invite students to respond, then project Slide 22 and review some of the negative impacts.
  • Unfair treatment: If an AI tool is biased, it might make decisions that are unfair to certain groups of people.
  • Continuing stereotypes: If an AI tool learns from data that includes stereotypes (e.g., about race or gender), it might make decisions that are based on those prejudiced ideas.
  • Unequal opportunities: AI bias can also limit opportunities for some people by unfairly favoring another group.
  • Misinformation: If an AI tool learns from biased information, it can end up creating and spreading false or incomplete information.
If time permits, this is a great opportunity to share and discuss a real-world example of the negative impacts of AI bias.
  9. Say: Knowing about AI bias can help us think critically and act responsibly if and when we use AI. If we notice AI bias having a negative impact on someone or something, we can help by reporting it to the company (Slide 23).
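Optional extension (not part of the official lesson materials): for classes with some coding experience, a minimal sketch like the one below can make the idea of incomplete training data concrete. Everything in it is hypothetical and invented for illustration, including the costume list, the function name, and the messages; it is a toy stand-in for the candy dispenser in the scenario, not how a real AI system works.

```python
# Toy, hypothetical "candy dispenser": it only recognizes costumes that
# appeared in its (incomplete) training data. All names here are made up.

# The "training data": the only costumes the dispenser has ever seen.
training_costumes = {"witch", "vampire", "ghost", "pirate"}

def dispense_candy(costume: str) -> str:
    """Give candy only if the costume matches something in the training data."""
    if costume.lower() in training_costumes:
        return "Candy dispensed!"
    # Kids wearing costumes the dispenser never learned about are turned away.
    return "Costume not recognized - no candy."

if __name__ == "__main__":
    for costume in ["Witch", "Dragon dancer", "Astronaut", "Vampire"]:
        print(f"{costume}: {dispense_candy(costume)}")

    # The fix described in the lesson: add more varied training examples
    # so fewer kids are unfairly turned away.
    training_costumes.update({"dragon dancer", "astronaut"})
    print("After adding more training data:")
    print("Dragon dancer:", dispense_candy("Dragon dancer"))
```

Running the sketch shows the "bias" directly: costumes missing from the training set are rejected, and adding more varied examples changes the outcome, which mirrors the discussion in Slides 20–21.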
