AI – smart but sexist? Pull the plug on gender bias
March 17 | 13:00 – 14:00 UTC+1
RICAIP and DFKI in Saarbrücken, Germany, are pleased to invite you to an online RICAIP session on gender bias in AI.
Discrimination, and gender bias in particular, is a social problem. But what does it mean in the era of AI? What impact might it have in the future?
This talk sheds light on the role of gender bias and diversity in AI, and on how AI systems become discriminatory. The types of bias are discussed because, if we know how they creep into AI and other software systems, we can also do something about them. A framework is then presented to help us all work towards having "fair by design" AI systems around us – both in our professional and personal lives. Last but not least, some advice is given on how you can promote fairness within your own sphere of influence.

Agenda
- AI, gender, and the future
- Where do we stand with AI and gender bias at present?
- Bias types
- Framework for "fairness by design"
- Key points for "fairness by you"
About the Speaker
Dr. Kinga Schumacher
Dr. Kinga Schumacher studied computer science in Mannheim and completed her PhD in the field of artificial intelligence at the University of Potsdam. She is a senior researcher and deputy head of the Cognitive Assistance Systems research group at the German Research Center for Artificial Intelligence (DFKI). She is also the head of the Diversity and Gender Equality working group at DFKI. Her research focuses on the areas of "diversity-aware AI", gender research, and human-machine interaction. In addition, she is involved in mapping the AI landscape with regard to AI methods and the capabilities of AI systems, supplemented by a risk assessment scheme based on the AI Act. This work feeds into the regulatory activities of Germany and the EU.
