An Optimized Multi-teacher Knowledge Distillation Method: Application to Early Diagnosis of Otitis Media (Preprint)
DOI: 10.2196/preprints.22690
Publication Date: 2020-07-25
AUTHORS (3)
ABSTRACT
<sec> <title>BACKGROUND</title> Otitis media (OM) is a common ear disease that can induce hearing loss and can even be life-threatening. However, due to poor classification performance, insufficient data, and high computational costs, OM cannot yet be diagnosed accurately. </sec> <sec> <title>OBJECTIVE</title> An optimized multi-teacher knowledge distillation method is proposed to realize the early diagnosis of otitis media with insufficient data at a lower computational cost. </sec> <sec> <title>METHODS</title> Based on ensemble learning and the conventional knowledge distillation method, an optimized multi-teacher knowledge distillation framework is proposed. The framework consists of a teacher network and a student network. The teacher network is responsible for learning from raw data and exporting prior knowledge, while the student network executes the diagnosis task. The teacher network is composed of three components: VGG, ResNet, and Inception. Each component can be regarded as a teacher that learns knowledge. The student network is composed of identical lightweight CNNs (convolutional neural networks), and each CNN can be viewed as a student that obtains knowledge from a teacher and executes the task. First, the teachers are trained separately. Then, each student is trained based on the knowledge learned from its teacher. This transfer process compresses the model and reduces computational costs. Next, to improve accuracy, the predicted results of the well-trained students are fused with two contrastive methods: a voting-based fusion method and an average-based fusion method. Finally, the fused model forms the diagnostic model, and its validity is verified on a tympanic membrane data set. </sec> <sec> <title>RESULTS</title> The proposed method achieves good performance in the early diagnosis of OM. Its training accuracy reaches 99.02% and its testing accuracy reaches 97.38%, which exceeds that of any single teacher. Compared with using the teacher network for the diagnosis task directly, the computation time is reduced by 64.37%, which greatly shortens the calculation time. Three deep, large teacher models are compressed into one lightweight model. </sec> <sec> <title>CONCLUSIONS</title> The proposed method is suitable for diagnosis tasks with insufficient data. In addition, it realizes model compression and reduces computational costs. </sec>
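The pipeline described in the abstract can be sketched in a few stdlib-only Python functions: soft targets are formed from the teachers' outputs, a temperature-scaled distillation loss trains each student, and the students' predictions are fused by either averaging or voting. This is a minimal illustration under common KD conventions (temperature-softened softmax, averaged teacher logits); the preprint's exact loss terms, teacher-combination scheme, and hyperparameters are not given in the abstract, so all names and values below are assumptions.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; a higher T gives a softer distribution."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_targets(teacher_logits_list, T=4.0):
    """Soft targets for a student: average the teachers' logits (one simple
    multi-teacher combination; assumed, not taken from the paper), then
    soften with temperature T."""
    n, k = len(teacher_logits_list), len(teacher_logits_list[0])
    avg = [sum(t[i] for t in teacher_logits_list) / n for i in range(k)]
    return softmax(avg, T)

def kd_loss(student_logits, soft_targets, T=4.0):
    """Standard distillation objective: cross-entropy between the softened
    student outputs and the teachers' soft targets, scaled by T^2."""
    p = softmax(student_logits, T)
    return -T * T * sum(q * math.log(s) for q, s in zip(soft_targets, p))

def fuse_average(prob_lists):
    """Average-based fusion: mean of the students' class probabilities."""
    n, k = len(prob_lists), len(prob_lists[0])
    return [sum(p[i] for p in prob_lists) / n for i in range(k)]

def fuse_vote(prob_lists):
    """Voting-based fusion: majority vote over the students' argmax labels."""
    votes = [max(range(len(p)), key=p.__getitem__) for p in prob_lists]
    return max(set(votes), key=votes.count)
```

For example, with three teachers (standing in for VGG, ResNet, and Inception) agreeing on class 0, `distillation_targets` yields a distribution peaked at class 0, and `fuse_vote` over three students voting 0, 1, 0 returns class 0. Averaging preserves confidence information that hard voting discards, which is why the two fusion methods are compared as contrastive options.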