NIR-Prompt: A Multi-task Generalized Neural Information Retrieval Training Framework
INDEX TERMS
Domain Adaptation
Schema (genetic algorithms)
Decoupling (probability)
DOI: 10.1145/3626092
Publication Date: 2023-10-02
AUTHORS (4)
ABSTRACT
Information retrieval (IR) aims to find information that meets users' needs from a corpus. Different needs correspond to different IR tasks such as document retrieval, open-domain question answering, retrieval-based dialogue, and so on, while these tasks share the same schema of estimating the relationship between texts. This suggests that a good IR model should generalize across tasks and domains. However, previous studies indicate that state-of-the-art neural information retrieval (NIR) models, e.g., pre-trained language models (PLMs), are hard to generalize. This is mainly because the end-to-end fine-tuning paradigm makes the model overemphasize task-specific signals and domain biases while losing the ability to capture generalized essential matching signals. To address this problem, we propose a novel NIR training framework named NIR-Prompt for the retrieval and reranking stages, based on the idea of decoupling signal capturing from signal combination. NIR-Prompt exploits an Essential Matching Module (EMM) to capture essential matching signals and obtains a description of each task from a Matching Description Module (MDM). The description is used as task-adaptation information to combine the essential matching signals so that the model adapts to different tasks. Experiments under in-domain multi-task, out-of-domain, and new-task adaptation settings show that NIR-Prompt improves the generalization of PLMs in NIR for both stages compared with baselines.
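To make the decoupling idea in the abstract concrete, below is a minimal, illustrative sketch (not the authors' code): an EMM-style component produces task-agnostic matching signals for a query-document pair, an MDM-style component turns a task-description embedding into adaptation weights, and the two are combined into a relevance score. The class names, dimensions, toy encoders, and the gating-style combination are all assumptions made for illustration only.

```python
# Illustrative sketch of the decoupling idea behind NIR-Prompt (assumed design, not the paper's code).
import torch
import torch.nn as nn


class EssentialMatchingModule(nn.Module):
    """Captures task-agnostic matching signals between a query and a document."""

    def __init__(self, hidden: int = 128):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU())

    def forward(self, query_emb: torch.Tensor, doc_emb: torch.Tensor) -> torch.Tensor:
        # Element-wise interaction stands in for generalized matching features.
        return self.encoder(query_emb * doc_emb)


class MatchingDescriptionModule(nn.Module):
    """Maps a task-description embedding to task-adaptation weights."""

    def __init__(self, hidden: int = 128):
        super().__init__()
        self.adapter = nn.Linear(hidden, hidden)

    def forward(self, task_desc_emb: torch.Tensor) -> torch.Tensor:
        # Task-adaptation information used to re-weight the essential signals.
        return torch.sigmoid(self.adapter(task_desc_emb))


class NIRPromptSketch(nn.Module):
    """Combines EMM signals with MDM task information into a relevance score."""

    def __init__(self, hidden: int = 128):
        super().__init__()
        self.emm = EssentialMatchingModule(hidden)
        self.mdm = MatchingDescriptionModule(hidden)
        self.scorer = nn.Linear(hidden, 1)

    def forward(self, query_emb, doc_emb, task_desc_emb):
        signals = self.emm(query_emb, doc_emb)     # generalized matching signals
        task_gate = self.mdm(task_desc_emb)        # per-task combination weights
        return self.scorer(signals * task_gate).squeeze(-1)


if __name__ == "__main__":
    model = NIRPromptSketch()
    q, d, t = (torch.randn(2, 128) for _ in range(3))  # toy embeddings for two pairs
    print(model(q, d, t))                               # relevance scores, shape (2,)
```

In this sketch, switching IR tasks only changes the task-description embedding fed to the MDM component, while the EMM component is shared, mirroring the abstract's separation of generalized signal capturing from task-specific combination.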