Pretrained Knowledge Base Embeddings for improved Sentential Relation Extraction

dc.contributor.author: Papaluca, Andrea
dc.contributor.author: Krefl, Daniel
dc.contributor.author: Suominen, Hanna
dc.contributor.author: Lenskiy, Artem
dc.contributor.organization: Department of Computing
dc.contributor.organization-code: 1.2.246.10.2458963.20.85312822902
dc.converis.publication-id: 176250533
dc.converis.url: https://research.utu.fi/converis/portal/Publication/176250533
dc.date.accessioned: 2022-10-28T13:55:20Z
dc.date.available: 2022-10-28T13:55:20Z
dc.description.abstract: In this work, we propose combining pre-trained knowledge base graph embeddings with transformer-based language models to improve performance on the sentential relation extraction task in natural language processing. Our proposed model is a simple variation of existing models that incorporates off-task pre-trained graph embeddings with an on-task fine-tuned BERT encoder. We perform a detailed statistical evaluation of the model on standard datasets. We provide evidence that the added graph embeddings improve performance, making this simple approach competitive with state-of-the-art models that perform explicit on-task training of the graph embeddings. Furthermore, for the underlying BERT model we observe an interesting power-law scaling behavior between the variance of the F1 score obtained for a relation class and its support in terms of training examples.
dc.format.pagerange: 373-382
dc.identifier.isbn: 978-1-955917-23-0
dc.identifier.olddbid: 185196
dc.identifier.oldhandle: 10024/168290
dc.identifier.uri: https://www.utupub.fi/handle/11111/42040
dc.identifier.url: https://aclanthology.org/2022.acl-srw.29/
dc.identifier.urn: URN:NBN:fi-fe2022102463141
dc.language.iso: en
dc.okm.affiliatedauthor: Suominen, Hanna
dc.okm.discipline: 113 Computer and information sciences
dc.okm.internationalcopublication: international co-publication
dc.okm.internationality: International publication
dc.okm.type: A4 Conference Article
dc.publisher.country: United States
dc.publisher.country-code: US
dc.relation.conference: Annual Meeting of the Association for Computational Linguistics
dc.relation.doi: 10.18653/v1/2022.acl-srw.29
dc.source.identifier: https://www.utupub.fi/handle/10024/168290
dc.title: Pretrained Knowledge Base Embeddings for improved Sentential Relation Extraction
dc.title.book: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop
dc.year.issued: 2022

Files

Name: Pretrained Knowledge Base Embeddings for improved Sentential Relation Extraction.pdf
Size: 617.38 KB
Format: Adobe Portable Document Format