Scalable Vertical Federated Learning via Data Augmentation and Amortized Inference
Amortized inference
Federated Learning
DOI:
10.48550/arxiv.2405.04043
Publication Date:
2024-05-07
AUTHORS (4)
ABSTRACT
Vertical federated learning (VFL) has emerged as a paradigm for collaborative model estimation across multiple clients, each holding a distinct set of covariates. This paper introduces the first comprehensive framework for fitting Bayesian models in the VFL setting. We propose a novel approach that leverages data augmentation techniques to transform VFL problems into a form compatible with existing Bayesian federated learning algorithms. We present an innovative model formulation for specific scenarios where the joint likelihood factorizes into a product of client-specific likelihoods. To mitigate the dimensionality challenge posed by data augmentation, which scales with the number of observations and clients, we develop a factorized amortized variational approximation that achieves scalability independent of the number of observations. We showcase the efficacy of our approach through extensive numerical experiments on logistic regression, multilevel regression, and a hierarchical split neural net model. Our work paves the way for privacy-preserving, decentralized inference in vertically partitioned data scenarios, opening up new avenues for research and applications in various domains.
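To make the structure described in the abstract concrete, the following is a minimal sketch of the two ingredients it mentions: a joint likelihood that factorizes across clients, and a factorized, amortized variational approximation whose parameter count does not grow with the number of observations. The notation (global parameters $\theta$ with client-specific blocks $\theta_k$, covariate blocks $X^{(k)}$ held by client $k$, augmented local variables $\omega_i$, and an inference network $g_\phi$) is our own illustrative choice and is not taken from the paper.

\[
p(y_{1:n} \mid \theta, X) \;=\; \prod_{k=1}^{K} p_k\!\left(y_{1:n} \mid \theta_k, X^{(k)}\right),
\qquad
q_{\lambda,\phi}\!\left(\theta, \omega_{1:n}\right) \;=\; q_\lambda(\theta)\, \prod_{i=1}^{n} q\!\left(\omega_i \mid g_\phi(x_i, y_i)\right).
\]

Because every per-observation factor $q(\omega_i \mid \cdot)$ shares the single network $g_\phi$, the variational parameters $(\lambda, \phi)$ are fixed in size regardless of $n$, which is one way to read the abstract's claim of scalability independent of the number of observations.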