TURL: Table Understanding through Representation Learning
Concepts: relation extraction, tables (database), benchmarks, representation learning
DOI: 10.48550/arxiv.2006.14806
Publication Date: 2020-01-01
AUTHORS (5)
ABSTRACT
Relational tables on the Web store a vast amount of knowledge. Owing to the wealth of such tables, there has been tremendous progress on a variety of tasks in the area of table understanding. However, existing work generally relies on heavily-engineered task-specific features and model architectures. In this paper, we present TURL, a novel framework that introduces the pre-training/fine-tuning paradigm to relational Web tables. During pre-training, our framework learns deep contextualized representations on relational tables in an unsupervised manner. Its universal model design with pre-trained representations can be applied to a wide range of tasks with minimal task-specific fine-tuning. Specifically, we propose a structure-aware Transformer encoder to model the row-column structure of relational tables, and present a new Masked Entity Recovery (MER) objective for pre-training to capture the semantics and knowledge in large-scale unlabeled data. We systematically evaluate TURL with a benchmark consisting of 6 different tasks for table understanding (e.g., relation extraction, cell filling). We show that TURL generalizes well to all tasks and substantially outperforms existing methods in almost all instances.
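The abstract names two technical ideas: a structure-aware Transformer encoder that respects the row-column layout of a table, and a Masked Entity Recovery (MER) objective that hides entity cells for the model to reconstruct. The minimal NumPy sketch below illustrates both under stated assumptions: `build_visibility_mask` encodes one plausible reading of "structure-aware" (a cell attends only to cells sharing its row or column), and `mer_corrupt` with its `mask_prob`, `mask_token=-1`, and `-100` ignore-index are placeholder conventions chosen for illustration, not the authors' implementation.

```python
import numpy as np

def build_visibility_mask(rows, cols):
    """Visibility mask for a structure-aware encoder (sketch):
    cell i may attend to cell j only if they share a row or a
    column. Illustrative only; not the paper's exact code."""
    rows = np.asarray(rows)
    cols = np.asarray(cols)
    same_row = rows[:, None] == rows[None, :]
    same_col = cols[:, None] == cols[None, :]
    return same_row | same_col

def mer_corrupt(entity_ids, mask_prob=0.6, mask_token=-1, seed=0):
    """MER-style corruption (sketch): hide a fraction of entity
    cells so the model must recover the originals. mask_token and
    the -100 ignore-index are assumed conventions, not from TURL."""
    rng = np.random.default_rng(seed)
    ids = np.asarray(entity_ids).copy()
    hide = rng.random(ids.shape) < mask_prob
    targets = np.where(hide, ids, -100)  # -100: position not scored
    ids[hide] = mask_token               # replace hidden entities
    return ids, targets

# 2x2 table, cells listed row-major; each cell gets (row, col) indices.
mask = build_visibility_mask([0, 0, 1, 1], [0, 1, 0, 1])
corrupted, targets = mer_corrupt([10, 20, 30, 40])
```

The attention mask would then be applied inside a Transformer's self-attention (disallowed pairs set to a large negative logit), and the recovery targets scored only at the hidden positions.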