Addressing AI Biases Against Disabled Job Seekers
ChatGPT is biased against resumes with credentials that imply a disability — but it can improve
Researchers at the University of Washington found that ChatGPT ranked resumes containing disability-related credentials lower than otherwise comparable resumes and offered biased explanations for those rankings. When the tool was customized with written instructions to avoid ableist bias, the rankings improved for most of the disabilities tested. The researchers stressed the need to recognize AI bias in real-world tasks such as hiring, to study and remedy that bias so the technology is deployed equitably, and to support the ongoing work by organizations to improve outcomes for disabled job seekers.
- ChatGPT exhibited biases against resumes with disability-related credentials, ranking them lower and providing biased explanations for the rankings.
- Customizing the tool with written instructions to avoid ableist bias reduced bias for most of the disabilities tested.
- The study underscored the need for awareness of AI bias in real-world hiring tasks, alongside the ongoing work by organizations to improve outcomes for disabled job seekers.
- Further research is needed to document and remedy AI biases, including testing other AI systems, studying the intersections of biases with other attributes, and exploring further customization to reduce biases consistently.
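The customization the study describes amounts to prepending an explicit bias-mitigation instruction before the model ranks resumes. Below is a minimal sketch of how such an instruction might be attached as a system message in a chat-completions-style request; the instruction wording, function name, and message structure are illustrative assumptions, not the researchers' actual prompts.

```python
# Hypothetical sketch: attach a bias-mitigation instruction as a system
# message before asking a chat model to rank resumes. All names and
# wording here are illustrative, not taken from the study.

ANTI_ABLEIST_INSTRUCTION = (
    "Do not penalize resumes for disability-related credentials. "
    "Rank candidates only on qualifications relevant to the job."
)

def build_ranking_messages(job_description: str, resumes: list[str]) -> list[dict]:
    """Assemble a chat-completion message list, with the bias-mitigation
    instruction carried in the system role."""
    numbered = "\n\n".join(
        f"Resume {i + 1}:\n{text}" for i, text in enumerate(resumes)
    )
    return [
        {"role": "system", "content": ANTI_ABLEIST_INSTRUCTION},
        {
            "role": "user",
            "content": (
                f"Job description:\n{job_description}\n\n{numbered}\n\n"
                "Rank these resumes and explain each ranking."
            ),
        },
    ]

messages = build_ranking_messages(
    "Software engineer", ["Resume A text", "Resume B text"]
)
```

The message list would then be passed to the model API of your choice; the study's finding suggests that even a short instruction like this can measurably shift rankings, though not uniformly across all disability categories.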