CRSLab: An Open-Source Toolkit for Building Conversational Recommender System
FOS: Computer and information sciences
Computer Science - Information Retrieval (cs.IR)
Computer Science - Computation and Language (cs.CL)
Recommender system
Natural Language Processing
Artificial Intelligence
Collaborative Filtering
Content-Based Recommendation
Context-Aware Recommender Systems
Open source
DOI: 10.18653/v1/2021.acl-demo.22
Publication Date: 2021-07-27
AUTHORS (8)
ABSTRACT
In recent years, conversational recommender systems (CRSs) have received much attention in the research community. However, existing studies on CRSs vary in scenarios, goals, and techniques, and lack a unified, standardized implementation or comparison. To tackle this challenge, we propose CRSLab, an open-source CRS toolkit that provides a unified and extensible framework with highly decoupled modules for developing CRSs. Based on this framework, we collect 6 commonly used human-annotated CRS datasets and implement 18 models, including recent techniques such as graph neural networks and pre-trained models. In addition, the toolkit provides a series of automatic evaluation protocols and a human-machine interaction interface to test and compare different CRS methods. The project and documentation are released at https://github.com/RUCAIBox/CRSLab.
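To make the decoupled framework concrete, the following is a minimal sketch of an end-to-end run, assuming the config-driven entry point exposed by the repository's run_crslab.py script (crslab.config.Config and crslab.quick_start.run_crslab); exact argument names and defaults are assumptions and should be checked against the project documentation.

    # Minimal sketch of an end-to-end CRSLab run (Python). Assumption: the
    # public API mirrors the run_crslab.py entry script in the repository;
    # verify exact signatures at https://github.com/RUCAIBox/CRSLab.
    from crslab.config import Config
    from crslab.quick_start import run_crslab

    # One YAML file describes each (model, dataset) pair, e.g. the KGSF
    # model on the ReDial dataset; the data, model, and system modules are
    # assembled from this config.
    config = Config('config/crs/kgsf/redial.yaml')

    # Train the recommendation and conversation components, then apply the
    # toolkit's automatic evaluation protocols on the test split.
    run_crslab(config)

The same config mechanism is how the 18 implemented models are paired with the 6 collected datasets; the repository README also documents command-line options layered on this entry point (e.g., for saving checkpoints or launching the human-machine interaction interface).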
CITATIONS (18)