BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//RICAIP - ECPv6.15.20//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:RICAIP
X-ORIGINAL-URL:https://ricaip.eu
X-WR-CALDESC:Events for RICAIP
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:Europe/Paris
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20250330T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20251026T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20260329T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20261025T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20270328T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20271031T010000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=Europe/Paris:20260317T130000
DTEND;TZID=Europe/Paris:20260317T140000
DTSTAMP:20260429T223244Z
CREATED:20260303T101616Z
LAST-MODIFIED:20260317T130634Z
UID:17771-1773752400-1773756000@ricaip.eu
SUMMARY:AI – smart but sexist? Pull the plug on gender bias
DESCRIPTION:RICAIP and DFKI in Saarbrücken\, Germany\, are pleased to invite you to an online RICAIP session on gender bias in AI. \n\nDiscrimination\, and gender bias in particular\, is a social problem. But what does it mean in the era of AI? What impact might it have in the future? \n\nThis talk sheds light on the role of gender bias and diversity in AI\, and on how AI systems become discriminatory. The types of bias are discussed because\, if we know how they creep into AI and other software systems\, we can also do something about them. A framework is then presented to help us all work towards having "fair by design" AI systems around us – both in our professional and personal lives. Last but not least\, some advice is given on how you can promote fairness within your sphere of influence.\n\nAgenda\n\n- AI\, gender\, and the future\n- Where do we stand with AI and gender bias at present?\n- Bias types\n- Framework for "fairness by design"\n- Key points for "fairness by you"\n\nAbout the Speaker\n\nDr. Kinga Schumacher studied computer science in Mannheim and completed her PhD in the field of artificial intelligence at the University of Potsdam. She is a senior researcher and deputy head of the Cognitive Assistance Systems research group at the German Research Center for Artificial Intelligence (DFKI)\, where she also heads the Diversity and Gender Equality working group. Her research focuses on "diversity-aware AI"\, gender research and human-machine interaction. In addition\, she is involved in mapping the AI landscape with regard to AI methods and the capabilities of AI systems\, supplemented with a risk assessment scheme based on the AI Act. This work flows into the regulatory activities of Germany and the EU.
URL:https://ricaip.eu/events/ai-smart-but-sexist-pull-the-plug-on-gender-bias/
ATTACH;FMTTYPE=image/png:https://ricaip.eu/wp-content/uploads/2026/03/ZeMA-Training-2025-1.png
END:VEVENT
END:VCALENDAR