Despite decades of awareness and action, racial and gender bias in IT fields remains pervasive. Currently, women occupy only 25% of high-tech positions, although they comprise 47% of the overall US workforce. Similarly, African Americans fill only 9% of computer technology positions, despite comprising 12% of the US workforce.
We are developing a conversational agent that plays the role of a female or minority worker and provides first-person feedback on biased language in written organizational communication. The agent not only gives didactic feedback on biased language, but also reacts to how that language could affect the worker it portrays personally. In a pilot study, we demonstrated that people who interacted with this agent were more motivated to take corrective action than those who received standard written educational materials on bias in organizational communication.