AI Opacity and Explainability in Tort Litigation
Keywords: Redress; Lawsuit; Cause of action; Tort reform; Clarity; Product liability
DOI: 10.1145/3531146.3533084
Publication Date: 2022-06-20
AUTHORS (3)
ABSTRACT
A spate of recent accidents and a lawsuit involving Tesla's 'self-driving' cars highlights the growing need for meaningful accountability when harms are caused by AI systems. Tort (or civil liability) lawsuits are one important way for victims to obtain redress for such harms. The prospect of tort liability may also prompt developers to take better precautions against safety risks. However, claims of all kinds will be hindered by AI opacity: the difficulty of determining how and why complex AI systems make decisions. We address this problem by formulating and evaluating several options for mitigating opacity that combine expert evidence, legal argumentation, procedure, and Explainable AI approaches. We emphasise that explanations in litigation must be attuned to the elements of 'causes of action' – the specific facts that must be proven to succeed in a lawsuit. We take an Australian case involving explainable AI evidence as a starting point from which to map contemporary approaches onto tortious causes of action, focusing on misleading conduct, negligence, and product defects. Our work synthesizes law and computer science to provide greater clarity on the opportunities and challenges of explanations in litigation. It may prove helpful to potential litigants and courts, and illuminate key targets for regulatory intervention.