Pretrained Knowledge Base Embeddings for improved Sentential Relation Extraction
| dc.contributor.author | Papaluca Andrea | |
| dc.contributor.author | Krefl Daniel | |
| dc.contributor.author | Suominen Hanna | |
| dc.contributor.author | Lenskiy Artem | |
| dc.contributor.organization | fi=tietotekniikan laitos|en=Department of Computing| | |
| dc.contributor.organization-code | 1.2.246.10.2458963.20.85312822902 | |
| dc.converis.publication-id | 176250533 | |
| dc.converis.url | https://research.utu.fi/converis/portal/Publication/176250533 | |
| dc.date.accessioned | 2022-10-28T13:55:20Z | |
| dc.date.available | 2022-10-28T13:55:20Z | |
| dc.description.abstract | In this work, we propose combining pre-trained knowledge base graph embeddings with transformer-based language models to improve performance on the sentential Relation Extraction task in natural language processing. Our proposed model is a simple variation of existing models that incorporates off-task pre-trained graph embeddings with an on-task fine-tuned BERT encoder. We perform a detailed statistical evaluation of the model on standard datasets. We provide evidence that the added graph embeddings improve performance, making this simple approach competitive with state-of-the-art models that perform explicit on-task training of the graph embeddings. Furthermore, for the underlying BERT model we observe an interesting power-law scaling behavior between the variance of the F1 score obtained for a relation class and its support in terms of training examples. | |
| dc.format.pagerange | 373 | |
| dc.format.pagerange | 382 | |
| dc.identifier.isbn | 978-1-955917-23-0 | |
| dc.identifier.olddbid | 185196 | |
| dc.identifier.oldhandle | 10024/168290 | |
| dc.identifier.uri | https://www.utupub.fi/handle/11111/42040 | |
| dc.identifier.url | https://aclanthology.org/2022.acl-srw.29/ | |
| dc.identifier.urn | URN:NBN:fi-fe2022102463141 | |
| dc.language.iso | en | |
| dc.okm.affiliatedauthor | Suominen, Hanna | |
| dc.okm.discipline | 113 Computer and information sciences | en_GB |
| dc.okm.discipline | 113 Tietojenkäsittely ja informaatiotieteet | fi_FI |
| dc.okm.internationalcopublication | international co-publication | |
| dc.okm.internationality | International publication | |
| dc.okm.type | A4 Conference Article | |
| dc.publisher.country | United States | en_GB |
| dc.publisher.country | Yhdysvallat (USA) | fi_FI |
| dc.publisher.country-code | US | |
| dc.relation.conference | Annual Meeting of the Association for Computational Linguistics | |
| dc.relation.doi | 10.18653/v1/2022.acl-srw.29 | |
| dc.source.identifier | https://www.utupub.fi/handle/10024/168290 | |
| dc.title | Pretrained Knowledge Base Embeddings for improved Sentential Relation Extraction | |
| dc.title.book | Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop | |
| dc.year.issued | 2022 |
Files
- Name: Pretrained Knowledge Base Embeddings for improved Sentential Relation Extraction.pdf
- Size: 617.38 KB
- Format: Adobe Portable Document Format