FinGPT: Large Generative Models for a Small Language

dc.contributor.authorLuukkonen Risto
dc.contributor.authorKomulainen Ville
dc.contributor.authorLuoma Jouni
dc.contributor.authorEskelinen Anni
dc.contributor.authorKanerva Jenna
dc.contributor.authorKupari Hanna-Mari
dc.contributor.authorGinter Filip
dc.contributor.authorLaippala Veronika
dc.contributor.authorMuennighoff Niklas
dc.contributor.authorPiktus Aleksandra
dc.contributor.authorWang Thomas
dc.contributor.authorTazi Nouamane
dc.contributor.authorLe Scao Teven
dc.contributor.authorWolf Thomas
dc.contributor.authorSuominen Osma
dc.contributor.authorSairanen Samuli
dc.contributor.authorMerioksa Mikko
dc.contributor.authorHeinonen Jyrki
dc.contributor.authorVahtola Aija
dc.contributor.authorAntao Samuel
dc.contributor.authorPyysalo Sampo
dc.contributor.organizationfi=data-analytiikka|en=Data Analytics|
dc.contributor.organizationfi=kieli- ja käännöstieteiden laitos|en=School of Languages and Translation Studies|
dc.contributor.organization-code1.2.246.10.2458963.20.56461112866
dc.contributor.organization-code1.2.246.10.2458963.20.68940835793
dc.contributor.organization-code2602100
dc.contributor.organization-code2610301
dc.converis.publication-id182054173
dc.converis.urlhttps://research.utu.fi/converis/portal/Publication/182054173
dc.date.accessioned2025-08-28T02:58:55Z
dc.date.available2025-08-28T02:58:55Z
dc.description.abstract<p> Large language models (LLMs) excel in many tasks in NLP and beyond, but most open models have very limited coverage of smaller languages, and LLM work tends to focus on languages where nearly unlimited data is available for pretraining. In this work, we study the challenges of creating LLMs for Finnish, a language spoken by less than 0.1% of the world population. We compile an extensive dataset of Finnish combining web crawls, news, social media and eBooks. We pursue two approaches to pretrain models: 1) we train seven monolingual models from scratch (186M to 13B parameters) dubbed FinGPT, and 2) we continue the pretraining of the multilingual BLOOM model on a mix of its original training data and Finnish, resulting in a 176 billion parameter model we call BLUUMI. For model evaluation, we introduce FIN-bench, a version of BIG-bench with Finnish tasks. We also assess other model qualities such as toxicity and bias. Our models and tools are openly available at <a href="https://turkunlp.org/gpt3-finnish">https://turkunlp.org/gpt3-finnish</a>.</p>
dc.format.pagerange2710-2726
dc.identifier.isbn979-8-89176-060-8
dc.identifier.olddbid210018
dc.identifier.oldhandle10024/193045
dc.identifier.urihttps://www.utupub.fi/handle/11111/50104
dc.identifier.urlhttps://aclanthology.org/2023.emnlp-main.164
dc.identifier.urnURN:NBN:fi-fe2025082792575
dc.language.isoen
dc.okm.affiliatedauthorLuukkonen, Risto
dc.okm.affiliatedauthorKomulainen, Ville
dc.okm.affiliatedauthorLuoma, Jouni
dc.okm.affiliatedauthorEskelinen, Anni
dc.okm.affiliatedauthorKanerva, Jenna
dc.okm.affiliatedauthorKupari, Hanna-Mari
dc.okm.affiliatedauthorGinter, Filip
dc.okm.affiliatedauthorLaippala, Veronika
dc.okm.affiliatedauthorPyysalo, Sampo
dc.okm.discipline113 Computer and information sciencesen_GB
dc.okm.discipline113 Tietojenkäsittely ja informaatiotieteetfi_FI
dc.okm.internationalcopublicationnot an international co-publication
dc.okm.internationalityInternational publication
dc.okm.typeA4 Conference Article
dc.publisher.countryUnited Statesen_GB
dc.publisher.countryYhdysvallat (USA)fi_FI
dc.publisher.country-codeUS
dc.relation.conferenceConference on Empirical Methods in Natural Language Processing
dc.relation.doi10.18653/v1/2023.emnlp-main.164
dc.source.identifierhttps://www.utupub.fi/handle/10024/193045
dc.titleFinGPT: Large Generative Models for a Small Language
dc.title.bookProceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
dc.year.issued2023

Files

Name: 2023.emnlp-main.164.pdf
Size: 499.23 KB
Format: Adobe Portable Document Format