XONN: XNOR-based Oblivious Deep Neural Network Inference
XNOR gate
Deep Neural Networks
DOI:
10.48550/arxiv.1902.07342
Publication Date:
2019-01-01
AUTHORS (6)
M. Sadegh Riazi, Mohammad Samragh, Hao Chen, Kim Laine, Kristin Lauter, Farinaz Koushanfar
ABSTRACT
Advancements in deep learning enable cloud servers to provide inference-as-a-service for clients. In this scenario, clients send their raw data to the server to run the deep learning model and send back the results. One standing challenge in this setting is to ensure the privacy of the clients' sensitive data. Oblivious inference is the task of running the neural network on the client's input without disclosing the input or the result to the server. This paper introduces XONN, a novel end-to-end framework based on Yao's Garbled Circuits (GC) protocol, that provides a paradigm shift in the conceptual and practical realization of oblivious inference. In XONN, the costly matrix-multiplication operations of the deep learning model are replaced with XNOR operations that are essentially free in GC. We further provide a novel algorithm that customizes the neural network such that the runtime of the GC protocol is minimized without sacrificing the inference accuracy. We design a user-friendly high-level API that allows expression of the network architecture at an unprecedented level of abstraction. Extensive proof-of-concept evaluation on various neural network architectures demonstrates that XONN outperforms prior art such as Gazelle (USENIX Security'18) by up to 7x, MiniONN (ACM CCS'17) by 93x, and SecureML (IEEE S&P'17) by 37x. State-of-the-art frameworks require one round of interaction between the client and the server for each layer of the network, whereas XONN requires a constant number of interactions for any number of layers in the model. XONN is the first framework to perform oblivious inference on Fitnet architectures with up to 21 layers, suggesting a new level of scalability compared with the state-of-the-art. Moreover, we evaluate XONN on four datasets to perform privacy-preserving medical diagnosis.
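
The key idea behind replacing matrix multiplication with XNOR is that, in a binarized network, weights and activations take values in {-1, +1}; encoding +1 as bit 1 and -1 as bit 0, a vector dot product reduces to XNOR followed by a popcount, and in Yao's Garbled Circuits the XNOR/XOR gates are essentially free (the free-XOR optimization), so only the popcount adder tree contributes garbled gates. The following is a minimal plaintext sketch of this XNOR-popcount reduction, illustrative only and not XONN's GC implementation; the function name and bit-packing convention are assumptions for the example.

# Illustrative sketch (plaintext only, not XONN's garbled-circuit code):
# dot(w, x) = 2 * popcount(XNOR(w_bits, x_bits)) - n  for {-1, +1} vectors of length n.

def binary_dot(w_bits: int, x_bits: int, n: int) -> int:
    """Dot product of two {-1, +1} vectors of length n, packed as n-bit integers (bit 1 = +1)."""
    mask = (1 << n) - 1
    agree = ~(w_bits ^ x_bits) & mask    # XNOR: bit is 1 where the two signs agree
    matches = bin(agree).count("1")      # popcount of agreements
    return 2 * matches - n               # (#agreements) - (#disagreements)

# Example: w = [+1, -1, +1, +1], x = [+1, +1, -1, +1]  ->  1 - 1 - 1 + 1 = 0
assert binary_dot(0b1011, 0b1101, 4) == 0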