February 27, 2020
Published February 12, 2021 — updated July 12, 2022

Guest Lecture: Vassilis Christophides


Data Ethics in Algorithmic Decision Making

Vassilis Christophides, professor of computer science at the University of Crete, currently fellow-in-residence at CY AS and guest of the ETIS laboratory, will give a lecture on "Data Ethics in Algorithmic Decision Making".

Machine Learning (ML) algorithms typically operate by learning patterns in available data and generalizing them to unseen data. There is growing recognition that even ML models developed with the best of intentions may exhibit discriminatory biases, perpetuate inequality, or perform less well for historically disadvantaged groups. Harms to particular individuals or groups are essentially caused by "biased data", a notion that encompasses many forms of "bugs" specific to data-driven decision systems. In this talk, we present statistical and causal analytics approaches to unveil discrimination practices in high-stakes applications such as criminal justice and predictive policing, credit-worthiness and loans, etc. We survey current research efforts on ethical ML that balance fairness and accuracy of predictions. We acknowledge that "biased data" are often due to various imperfections arising during data collection or data processing. We then advocate the need to detect, report, and prevent data ethics issues at the earliest possible stage of the pipelines used to build ML models. This calls for tools to diagnose whether a given fairness issue might be addressed by collecting more training data from a particular subpopulation or by better cleaning existing training data, and to predict how much more data needs to be gathered or repaired. We conclude with a few preliminary ideas on how we can actively guide upstream data cleaning to jointly optimize fairness and accuracy of downstream ML models.
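The fairness notions the abstract refers to are commonly quantified with group fairness metrics. As an illustrative sketch only (the metric choice and the data below are assumptions for illustration, not taken from the talk), demographic parity difference compares positive-prediction rates across demographic groups:

```python
def demographic_parity_difference(y_pred, groups):
    """Spread between the highest and lowest positive-prediction
    rates across groups (0.0 means equal rates for all groups).

    y_pred: iterable of 0/1 predictions.
    groups: iterable of group labels, aligned with y_pred.
    """
    rates = {}
    for g in set(groups):
        # Positive-prediction rate within group g.
        preds = [p for p, gr in zip(y_pred, groups) if gr == g]
        rates[g] = sum(preds) / len(preds)
    return max(rates.values()) - min(rates.values())

# Hypothetical example: group "a" receives positive predictions
# at rate 0.5, group "b" at rate 1.0, so the disparity is 0.5.
print(demographic_parity_difference([1, 0, 1, 1], ["a", "a", "b", "b"]))
```

A model can reduce this disparity while losing some predictive accuracy, which is the fairness–accuracy trade-off the talk surveys.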

Download his CV

Date: Thursday, February 27, 2020, from 12:30 pm to 2:00 pm

Location: Maison internationale de la recherche, Neuville-sur-Oise.