Influence based explainability of brain tumors segmentation in multimodal Magnetic Resonance Imaging
DOI:
10.48550/arxiv.2405.12222
Publication Date:
2024-04-05
AUTHORS (7)
ABSTRACT
In recent years, Artificial Intelligence has emerged as a fundamental tool in medical applications. Despite this rapid development, deep neural networks remain black boxes that are difficult to explain, and this represents a major limitation for their use in clinical practice. We focus on the image segmentation task, where most explainability methods proposed so far provide a visual explanation in terms of an input saliency map. The aim of this work is instead to extend, implement and test an influence-based algorithm, TracIn, originally proposed for classification tasks, on a challenging problem, i.e., multiclass segmentation of brain tumors in multimodal Magnetic Resonance Imaging. We verify the faithfulness of the algorithm by linking the similarities of the latent representation of the network to the TracIn output. We further test its capacity to provide local and global explanations, and we suggest that it can be adopted as a tool to select the most relevant features used in the decision process. The method is generalizable to all semantic segmentation tasks where classes are mutually exclusive, which is the standard framework in these cases.
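The TracIn influence score referenced in the abstract sums, over checkpoints saved during training, the learning-rate-weighted dot product between the loss gradients of a training example and a test example. A minimal sketch on a toy logistic model follows; the model, data, and checkpoint schedule are illustrative assumptions, not the paper's multimodal segmentation setup:

```python
import numpy as np

def grad_logistic(w, x, y):
    """Gradient of the binary cross-entropy loss of a logistic model at weights w."""
    p = 1.0 / (1.0 + np.exp(-x @ w))  # predicted probability
    return (p - y) * x

def tracin_score(checkpoints, lrs, x_a, y_a, x_b, y_b):
    """TracIn influence of example (x_a, y_a) on (x_b, y_b):
    sum over saved checkpoints of lr * <grad_a, grad_b>."""
    return sum(
        lr * grad_logistic(w, x_a, y_a) @ grad_logistic(w, x_b, y_b)
        for w, lr in zip(checkpoints, lrs)
    )

# Toy demo: train with SGD on synthetic data, saving checkpoints along the way.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = (X[:, 0] > 0).astype(float)
w, lr = np.zeros(3), 0.1
checkpoints, lrs = [], []
for step in range(30):
    i = step % len(X)
    w = w - lr * grad_logistic(w, X[i], y[i])
    if step % 10 == 0:  # keep every 10th checkpoint, as TracIn uses a sparse set
        checkpoints.append(w.copy())
        lrs.append(lr)

score = tracin_score(checkpoints, lrs, X[0], y[0], X[1], y[1])
print(f"influence of example 0 on example 1: {score:.4f}")
```

Note that self-influence (an example's influence on itself) is always non-negative, since each checkpoint contributes a learning-rate-scaled squared gradient norm; the paper extends this classification-style score to per-class segmentation outputs.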