Human annotators may subconsciously favor certain patterns, leading to skewed results when the AI learns from this labeled information.
Mitigating AI Bias
To reduce bias, AI developers must:
• Use diverse and representative training datasets
• Audit algorithms for unintended biases (see the sketch after this list)
• Improve transparency and accountability in AI decision-making
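To make the auditing point concrete, the short Python sketch below shows one very simple way to check a system's decisions: compare selection rates between groups and compute a disparate-impact ratio. The groups, outcomes, and the 0.8 rule-of-thumb threshold are illustrative assumptions, not data from a real system.

from collections import defaultdict

def selection_rates(decisions):
    """decisions: list of (group, was_selected) pairs."""
    selected = defaultdict(int)
    total = defaultdict(int)
    for group, was_selected in decisions:
        total[group] += 1
        selected[group] += int(was_selected)
    return {g: selected[g] / total[g] for g in total}

# Hypothetical screening outcomes produced by an AI tool (made-up data).
decisions = [
    ("male", True), ("male", True), ("male", False), ("male", True),
    ("female", False), ("female", True), ("female", False), ("female", False),
]

rates = selection_rates(decisions)
print(rates)  # {'male': 0.75, 'female': 0.25}

# Disparate-impact ratio: lowest selection rate divided by the highest.
# A common rule of thumb flags ratios below 0.8 for closer review.
ratio = min(rates.values()) / max(rates.values())
print(f"Disparate-impact ratio: {ratio:.2f}")  # 0.33, well below 0.8

An audit like this does not fix the bias by itself, but it gives developers a concrete number to monitor as they improve their training data and models.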
Activity: Identifying Potential Bias in AI Hiring
Scenario: A company develops an AI tool to screen job applications. The AI is trained on data from the company's past successful hires, who are predominantly male.
Potential Biases in this AI System
• Gender Bias: The AI might favor male candidates based on historical hiring patterns (illustrated in the sketch after this list).
• Lack of Diversity: Because the system learns only from past hires, it may unfairly disadvantage women and other underrepresented groups.
• Skill & Experience Filtering: If past hires had similar backgrounds, the AI may prioritize familiar profiles rather than evaluating all applicants fairly.
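The Python sketch below illustrates the gender-bias point. It uses invented synthetic data and an ordinary logistic-regression model from scikit-learn, neither of which comes from the scenario itself: a model trained on a male-skewed hiring history ends up scoring a female applicant lower than a male applicant with identical experience.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Fabricated historical data: features are [is_male, years_experience].
# In this invented history, men were hired at a higher rate regardless of
# experience, mimicking the scenario's "predominantly male" past hires.
is_male = rng.integers(0, 2, n)
experience = rng.uniform(0, 10, n)
hired = (0.3 * experience + 2.0 * is_male + rng.normal(0, 1, n)) > 2.5

X = np.column_stack([is_male, experience])
model = LogisticRegression().fit(X, hired)

# Two applicants with identical experience who differ only in gender.
male_applicant = [[1, 5.0]]
female_applicant = [[0, 5.0]]
print("P(screened in | male):  ", model.predict_proba(male_applicant)[0, 1])
print("P(screened in | female):", model.predict_proba(female_applicant)[0, 1])
# The gap between these two probabilities is the historical bias, learned.

Nothing in the code tells the model to prefer men; the preference comes entirely from the patterns in the training data, which is exactly why diverse data and regular audits matter.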