How does a chatbot become sexist? According to Josie Young’s research, sexism is usually incorporated into the chatbot’s design and amplified through day-to-day conversations with humans. From responding only to commands from male voices to tolerating abusive language, tech companies are embedding regressive ideas and unconscious bias into AI. Not only is this a sign of lazy design, says Josie, but attempting to replicate ourselves in AI is stifling innovation and the potential of this technology. An advocate for designing AI products and systems using ethical and feminist principles, Josie argues that we should be using this technology to heal society’s problems instead of adding to them. With this in mind, she has created a practical tool for teams to use when building a bot, prompting developers to question their own bias and training them to address these issues themselves in the future.

Josie Young advocates for designing Artificial Intelligence (AI) products and systems using ethical and feminist principles. In 2017, she developed and tested a design process for building feminist chatbots. In 2018, Josie contributed to a research project with Charisma.AI and King’s College London to identify ways to deal with bias in large word libraries used for Natural Language Processing. Josie works in London at Methods, leading work to understand the most ethical and appropriate ways to deploy AI in the public sector.

This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at

Josie Young

Feminist AI Researcher
