BOWLL: A Deceptively Simple Open World Lifelong Learner

FOS: Computer and information sciences; Machine Learning (cs.LG)
DOI: 10.48550/arxiv.2402.04814
Publication Date: 2024-02-07
ABSTRACT
The quest to improve scalar performance numbers on predetermined benchmarks seems to be deeply engraved in deep learning. However, the real world is seldom carefully curated, and applications are seldom limited to excelling on test sets. A practical system is generally required to recognize novel concepts, refrain from actively including uninformative data, and retain previously acquired knowledge throughout its lifetime. Despite these key elements being rigorously researched individually, the study of their conjunction, open world lifelong learning, is only a recent trend. To accelerate this multifaceted field's exploration, we introduce its first monolithic, much-needed baseline. Leveraging the ubiquitous use of batch normalization across deep neural networks, we propose a deceptively simple yet highly effective way to repurpose standard models for open world lifelong learning. Through extensive empirical evaluation, we highlight why our approach should serve as a future baseline for systems that are able to effectively maintain knowledge, selectively focus on informative data, and recognize novel concepts.
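The abstract's core idea is repurposing batch normalization statistics to tell informative, in-distribution data apart from novel inputs. As a simplified illustration of that general principle (not the paper's actual method or API), the sketch below keeps BatchNorm-style running statistics over feature channels and scores a new batch by how far its per-channel mean and variance deviate from them; all class and parameter names here are hypothetical.

```python
import numpy as np


class BNNoveltyScorer:
    """Hypothetical sketch: score novelty of a batch of features
    (shape: batch x channels) against BatchNorm-style running statistics."""

    def __init__(self, num_channels: int, momentum: float = 0.1, eps: float = 1e-5):
        self.running_mean = np.zeros(num_channels)
        self.running_var = np.ones(num_channels)
        self.momentum = momentum
        self.eps = eps

    def update(self, feats: np.ndarray) -> None:
        # Training-time update, mirroring how a BatchNorm layer tracks
        # running statistics with exponential moving averages.
        batch_mean = feats.mean(axis=0)
        batch_var = feats.var(axis=0)
        self.running_mean = (1 - self.momentum) * self.running_mean + self.momentum * batch_mean
        self.running_var = (1 - self.momentum) * self.running_var + self.momentum * batch_var

    def score(self, feats: np.ndarray) -> float:
        # Deviation of the batch's per-channel statistics from the stored
        # running statistics; larger values suggest a more novel batch.
        batch_mean = feats.mean(axis=0)
        batch_var = feats.var(axis=0)
        d_mean = (batch_mean - self.running_mean) ** 2 / (self.running_var + self.eps)
        d_var = (batch_var - self.running_var) ** 2 / (self.running_var + self.eps) ** 2
        return float((d_mean + d_var).mean())


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    scorer = BNNoveltyScorer(num_channels=4)
    # Accumulate statistics on "training" data drawn from N(0, 1).
    for _ in range(200):
        scorer.update(rng.normal(0.0, 1.0, size=(32, 4)))
    s_in = scorer.score(rng.normal(0.0, 1.0, size=(32, 4)))   # in-distribution
    s_ood = scorer.score(rng.normal(3.0, 2.0, size=(32, 4)))  # shifted, "novel"
    print(s_in < s_ood)
```

A full system would read these statistics directly from the BatchNorm layers of a trained network rather than tracking them separately; the point of the sketch is only that stored normalization statistics already summarize the training distribution well enough to flag distribution shift.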