Training AIs to spot sexism

We are excited to share the Stepford sexism detection AI project. Stepford AI is an app that aims to detect and describe sexism against women in genre stories. It has been generously funded by the Mozilla Technology Fund, and all the AI code and corpora created during the project will be made public when it concludes.

We need your help. In the first stage of development we need to test and evaluate the algorithm's outputs so that we can fine-tune and improve it. Can you help us test it?

Help us by taking part in our open beta test.

More info about the Algowritten project and the Mozilla Foundation

Algowritten has a close connection to the Mozilla Foundation and its annual Mozilla Festival. It began in 2020/21 as a Trustworthy AI working group project, has since been funded by the Mozilla Technology Fund, and has presented at MozFest for the past two years. Last year at MozFest, the Algowritten group presented and discussed Algowritten I, a collection of human- and AI-authored short stories. Through the collection, we explored the harmful biases that can arise in genre stories: we noticed sexist, racist and heteronormative language occurring during the creative writing process, and we encouraged festival-goers to leave comments and observations in the margins as the basis for further discussion. These biases are a real problem for creators of AI-based content.

At MozFest 2022, Stepford was presented as a way to automate the same kind of marginalia in texts submitted to it: it identifies sexist sections of text in a story segment and describes what is sexist about them. It uses GPT-3 to achieve this, the same AI system that produced our original Algowritten I collection. Our hope is that in future we can teach a version of GPT-3 to detect its own nuanced biases. Using this method to detect sexism against women is our first project; if successful, we hope the method can be used to identify other forms of harmful bias.

We hope you’ll get involved!

Learn more…