Learning Invariant Representations with Missing Data.
FOS: Computer and information sciences
Computer Science - Machine Learning
Statistics - Machine Learning
Machine Learning (stat.ML)
0101 mathematics
01 natural sciences
Machine Learning (cs.LG)
DOI: 10.48550/arXiv.2112.00881
Publication Date: 2021-12 (arXiv preprint)
AUTHORS (7)
ABSTRACT
Spurious correlations allow flexible models to predict well during training but poorly on related test distributions. Recent work has shown that models that satisfy particular independencies involving correlation-inducing nuisance variables have guarantees on their test performance. Enforcing such independencies requires nuisances to be observed during training. However, nuisances, such as demographics or image background labels, are often missing. Enforcing independence on just the observed data does not imply independence on the entire population. Here we derive maximum mean discrepancy (MMD) estimators used for invariance objectives under missing nuisances. On simulations and clinical data, optimizing through these estimates achieves test performance similar to using estimators that make use of the full data.

Published at CLeaR (Causal Learning and Reasoning) 2022.
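The abstract describes penalizing dependence between a learned representation and a nuisance variable with an MMD term, estimated even when some nuisance labels are missing. The sketch below is an illustrative stand-in rather than the paper's derived estimator: it computes a weighted RBF-kernel MMD between the representations of the two nuisance groups using only samples whose nuisance label is observed, reweighted by an assumed-known observation probability (a simple inverse-probability-weighting heuristic). The function names, the binary nuisance coding, and the `obs_prob` argument are assumptions made for this example.

```python
# Hedged sketch: weighted MMD penalty under missing nuisance labels.
# This is NOT the paper's estimator; it illustrates the general idea of
# enforcing representation/nuisance independence with an MMD term when
# only some nuisance labels are observed.
import torch


def rbf_kernel(a: torch.Tensor, b: torch.Tensor, bandwidth: float = 1.0) -> torch.Tensor:
    # Pairwise RBF kernel matrix: k(a_i, b_j) = exp(-||a_i - b_j||^2 / (2 * bandwidth^2)).
    sq_dists = torch.cdist(a, b) ** 2
    return torch.exp(-sq_dists / (2.0 * bandwidth ** 2))


def weighted_mmd2(reps: torch.Tensor,
                  nuisance: torch.Tensor,
                  obs_prob: torch.Tensor,
                  bandwidth: float = 1.0) -> torch.Tensor:
    """Squared MMD between representations of the two nuisance groups.

    reps:     (n, d) representations produced by the model.
    nuisance: (n,) values in {0, 1}; -1 marks a missing nuisance label.
    obs_prob: (n,) assumed probability that the nuisance label is observed,
              used for inverse-probability weights (an assumption here).
    """
    observed = nuisance >= 0
    r, z, w = reps[observed], nuisance[observed], 1.0 / obs_prob[observed]
    x, wx = r[z == 0], w[z == 0]
    y, wy = r[z == 1], w[z == 1]
    wx, wy = wx / wx.sum(), wy / wy.sum()  # normalize weights within each group
    k_xx = wx @ rbf_kernel(x, x, bandwidth) @ wx
    k_yy = wy @ rbf_kernel(y, y, bandwidth) @ wy
    k_xy = wx @ rbf_kernel(x, y, bandwidth) @ wy
    return k_xx - 2.0 * k_xy + k_yy


if __name__ == "__main__":
    torch.manual_seed(0)
    reps = torch.randn(64, 8)                # toy representations
    nuisance = torch.randint(0, 2, (64,))    # binary nuisance labels
    nuisance[torch.rand(64) < 0.3] = -1      # ~30% of labels missing
    obs_prob = torch.full((64,), 0.7)        # assumed observation probability
    print(weighted_mmd2(reps, nuisance, obs_prob).item())
```

In a training loop, a penalty like this would be added to the task loss, e.g. `loss = task_loss + lam * weighted_mmd2(model_representation, z, p_obs)`, so that gradients flow through the MMD term into the representation; how the observation probabilities are obtained and how the estimator is corrected for missingness is exactly what the paper addresses.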