Analyzing the impact of missing values and selection bias on fairness
Categorical variable
Resampling
Real world data
DOI: 10.1007/s41060-021-00259-z
Publication Date: 2021-06-01
AUTHORS (2)
ABSTRACT
Algorithmic decision making is becoming more prevalent, increasingly impacting people's daily lives. Recently, discussions have been emerging about the fairness of decisions made by machines. Researchers have proposed different approaches for improving the fairness of these algorithms. While these approaches can help machines make fairer decisions, they have been developed and validated on fairly clean data sets. Unfortunately, most real-world data have complexities that make them dirty. This work considers two such issues and analyzes their impact on fairness for categorical data: missing values and selection bias. After formulating this problem and showing its existence, we propose fixing algorithms for data sets containing missing values and/or selection bias that use forms of reweighting and resampling based upon the missing value generation process. We conduct an extensive empirical evaluation on both synthetic and real-world data sets using various fairness metrics, and demonstrate how missing values and selection bias generated from different mechanisms affect prediction fairness, even when prediction accuracy remains constant.
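The abstract describes fixes that reweight or resample the data based on the mechanism that generated the missing values or the selection bias. The sketch below illustrates one such idea in Python, assuming a known selection mechanism on synthetic categorical data: inverse-probability reweighting of a biased sample, evaluated with a demographic parity gap. The column names ("sex", "credit", "approved"), the probabilities, and the data are illustrative assumptions, not the authors' implementation or data.

```python
# Minimal sketch (assumptions, not the paper's code): selection bias can distort
# a fairness metric, and reweighting by the inverse of the known selection
# probability can correct the estimate.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 100_000

# Synthetic categorical population: protected attribute "sex", feature "credit",
# binary outcome "approved" that depends only on "credit".
sex = rng.choice(["F", "M"], size=n)
p_high = np.where(sex == "F", 0.5, 0.7)                   # groups differ in the feature
credit = np.where(rng.random(n) < p_high, "high", "low")
p_approve = np.where(credit == "high", 0.8, 0.1)
approved = (rng.random(n) < p_approve).astype(int)
pop = pd.DataFrame({"sex": sex, "credit": credit, "approved": approved})

def dp_gap(frame, weights=None):
    """Demographic parity gap: |P(approved=1 | sex=F) - P(approved=1 | sex=M)|."""
    w = np.ones(len(frame)) if weights is None else np.asarray(weights)
    rates = {}
    for group in ["F", "M"]:
        mask = frame["sex"].to_numpy() == group
        rates[group] = np.average(frame["approved"].to_numpy()[mask], weights=w[mask])
    return abs(rates["F"] - rates["M"])

# Selection bias: approved records from one group are collected less often,
# so the observed data set is not a random sample of the population.
select_prob = np.where((pop["sex"] == "F") & (pop["approved"] == 1), 0.3, 0.8)
selected = pop[rng.random(n) < select_prob].copy()
selected["weight"] = 1.0 / select_prob[selected.index]    # inverse selection probability

print("population DP gap:        ", round(dp_gap(pop), 3))
print("biased sample DP gap:     ", round(dp_gap(selected), 3))
print("reweighted sample DP gap: ", round(dp_gap(selected, selected["weight"]), 3))
```

An equivalent resampling-style fix would draw rows from the biased sample with probability proportional to these weights instead of passing the weights into the metric or estimator.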