Mockup designed by Anagram Design

About the project

We are Mike Lehmann, Marina Rost, and Vera Schindler-Zins, strategic designers from the Hochschule für Gestaltung Schwäbisch Gmünd. The Coded Fairness toolkit was developed as part of our Master's thesis. Over the course of five months, we engaged intensively with the topic of "biases in machine learning", talking to various experts in the field and receiving feedback, encouragement, and validation from our great discussion partners. Thank you!

With this project, we hope to encourage discussion of biases within development teams working on machine learning systems, and to start a discourse about the measures needed to reduce them. We would also like to thank you for your interest in this topic and for downloading the toolkit!

About us