Testing software for non-discrimination: an updated and extended audit in the Italian car insurance domain
Software Engineering (cs.SE)
Artificial Intelligence (cs.AI)
Human-Computer Interaction (cs.HC)
Machine Learning (cs.LG)
FOS: Computer and information sciences
Keywords: algorithmic bias; fairness; software audit; empirical methods
DOI:
10.48550/arxiv.2502.06439
Publication Date:
2025-02-10
AUTHORS (8)
ABSTRACT
Context. As software systems become more integrated into society's infrastructure, the responsibility of software professionals to ensure compliance with various non-functional requirements increases. These include security, safety, privacy, and, increasingly, non-discrimination.
Motivation. Fairness in pricing algorithms grants equitable access to basic services without discriminating on the basis of protected attributes.
Method. We replicate a previous empirical study that used black-box testing to audit the pricing algorithms of Italian car insurance companies, accessible through a popular online comparison system. With respect to that study, we enlarged the number of tests and the demographic variables under analysis.
Results. Our work confirms and extends the previous findings, highlighting the problematic permanence of discrimination across time: demographic variables significantly impact pricing to this day, with birthplace remaining the main discriminatory factor against individuals not born in Italian cities. We also found that driver profiles can determine the number of quotes available to the user, denying equal opportunities to all.
Conclusion. The study underscores the importance of testing for non-discrimination in software systems that affect people's everyday lives. Performing algorithmic audits over time makes it possible to evaluate the evolution of such algorithms. It also demonstrates the role that software engineering can play in making software systems more accountable.
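The black-box testing approach described in the abstract can be illustrated with a minimal sketch: request quotes for driver profiles that are identical except for one protected attribute, then compare the results. Everything below is hypothetical: `get_quote` is a stub standing in for a real call to an online comparison system, and the injected surcharge only demonstrates what a detected disparity would look like; it does not reproduce the study's actual data or findings.

```python
# Hypothetical black-box fairness audit: pairwise comparison of quotes for
# driver profiles that differ only in a protected attribute (here, birthplace).
# `get_quote` is a stub with an artificially injected bias, used purely to
# show how a disparity would surface; a real audit would query a live system.
def get_quote(profile: dict) -> float:
    base = 500.0
    base += 5 * (50 - profile["age"])      # younger drivers pay more (toy rule)
    if profile["birthplace"] != "Italy":   # simulated discriminatory term
        base += 120.0
    return round(base, 2)

def audit_attribute(attribute: str, values: list, base_profile: dict) -> dict:
    """Return the quote for each value of `attribute`, all else held fixed."""
    quotes = {}
    for value in values:
        profile = {**base_profile, attribute: value}
        quotes[value] = get_quote(profile)
    return quotes

base = {"age": 35, "city": "Milan", "vehicle": "Fiat Panda"}
quotes = audit_attribute("birthplace", ["Italy", "Romania"], base)
gap = quotes["Romania"] - quotes["Italy"]
print(quotes, gap)  # a non-zero gap flags a candidate disparity to investigate
```

Because only one attribute varies between the paired requests, any difference in the returned quotes is attributable to that attribute, which is the core idea behind this kind of correspondence-style audit.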