A two-step learning approach for solving full and almost full cold start problems in dyadic prediction

dc.contributor.authorTapio Pahikkala
dc.contributor.authorMichiel Stock
dc.contributor.authorAntti Airola
dc.contributor.authorTero Aittokallio
dc.contributor.authorBernard De Baets
dc.contributor.authorWillem Waegeman
dc.contributor.organizationfi=matematiikan ja tilastotieteen laitos|en=Department of Mathematics and Statistics|
dc.contributor.organizationfi=tietojenkäsittelytiede|en=Computer Science|
dc.contributor.organizationfi=tietotekniikan laitos|en=Department of Computing|
dc.contributor.organization-code1.2.246.10.2458963.20.23479734818
dc.contributor.organization-code1.2.246.10.2458963.20.46717060993
dc.contributor.organization-code1.2.246.10.2458963.20.85312822902
dc.converis.publication-id3746444
dc.converis.urlhttps://research.utu.fi/converis/portal/Publication/3746444
dc.date.accessioned2022-10-28T12:37:20Z
dc.date.available2022-10-28T12:37:20Z
dc.description.abstract<p> Dyadic prediction methods operate on pairs of objects (dyads), aiming to infer labels for out-of-sample dyads. We consider the full and almost full cold start problem in dyadic prediction, a setting that occurs when neither object in an out-of-sample dyad has been observed during training, or when one of them has been observed only very few times. A popular approach for addressing this problem is to train a model that makes predictions based on a pairwise feature representation of the dyads or, in the case of kernel methods, based on a tensor product pairwise kernel. As an alternative to such a kernel approach, we introduce a novel two-step learning algorithm that borrows ideas from the fields of pairwise learning and spectral filtering. We show theoretically that the two-step method is very closely related to the tensor product kernel approach, and experimentally that it yields slightly better predictive performance. Moreover, unlike existing tensor product kernel methods, the two-step method allows closed-form solutions for training and for parameter selection via cross-validation estimates in both the full and almost full cold start settings, making the approach much more efficient and straightforward to implement.</p>
dc.format.pagerange517-532
dc.identifier.eisbn978-3-662-44851-9
dc.identifier.isbn978-3-662-44850-2
dc.identifier.jour-issn0302-9743
dc.identifier.olddbid177746
dc.identifier.oldhandle10024/160840
dc.identifier.urihttps://www.utupub.fi/handle/11111/34489
dc.identifier.urnURN:NBN:fi-fe2021042715307
dc.okm.affiliatedauthorAirola, Antti
dc.okm.affiliatedauthorAittokallio, Tero
dc.okm.affiliatedauthorPahikkala, Tapio
dc.okm.discipline113 Computer and information sciencesen_GB
dc.okm.discipline113 Tietojenkäsittely ja informaatiotieteetfi_FI
dc.okm.internationalcopublicationinternational co-publication
dc.okm.internationalityInternational publication
dc.okm.typeA4 Conference Article
dc.relation.conferenceThe European Conferences on Machine Learning (ECML) and on Principles and Practice of Knowledge Discovery in Databases (PKDD)
dc.relation.doi10.1007/978-3-662-44851-9_33
dc.relation.ispartofjournalLecture Notes in Computer Science
dc.relation.ispartofseriesLecture Notes in Computer Science
dc.relation.volume8725
dc.source.identifierhttps://www.utupub.fi/handle/10024/160840
dc.titleA two-step learning approach for solving full and almost full cold start problems in dyadic prediction
dc.title.bookMachine Learning and Knowledge Discovery in Databases (ECML PKDD 2014)
dc.year.issued2014
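
The abstract describes a two-step learner that chains two kernel ridge regressions, one over each side of the dyad, yielding a closed-form solution even when neither test object appears in training. The following is a minimal sketch of that idea under assumed notation (kernel matrices `K_r`, `K_c`, label matrix `Y`, ridge parameters `lam_r`, `lam_c`); it is an illustration of the general technique, not the authors' exact implementation.

```python
import numpy as np

def two_step_fit(K_r, K_c, Y, lam_r=1.0, lam_c=1.0):
    """Two-step kernel ridge regression for dyadic prediction (sketch).

    K_r: (n x n) kernel matrix over the row objects seen in training.
    K_c: (m x m) kernel matrix over the column objects seen in training.
    Y:   (n x m) matrix of observed dyad labels.
    Returns the dual coefficient matrix A so that a new dyad with kernel
    evaluation vectors k_r, k_c is scored as k_r @ A @ k_c.
    """
    n, m = Y.shape
    # Step 1: ridge regression over the row objects.
    B = np.linalg.solve(K_r + lam_r * np.eye(n), Y)          # (n x m)
    # Step 2: ridge regression over the column objects
    # (K_c is symmetric, so solving on B.T and transposing back
    #  gives A = (K_r + lam_r I)^-1 Y (K_c + lam_c I)^-1).
    A = np.linalg.solve(K_c + lam_c * np.eye(m), B.T).T      # (n x m)
    return A

def two_step_predict(A, k_r, k_c):
    # Full cold start: neither test object was seen during training;
    # k_r and k_c hold its kernel evaluations against the training objects.
    return k_r @ A @ k_c

# Toy example with linear kernels on random features.
rng = np.random.default_rng(0)
X_r = rng.normal(size=(5, 3))   # 5 training row objects
X_c = rng.normal(size=(4, 3))   # 4 training column objects
Y = rng.normal(size=(5, 4))     # observed dyad labels
A = two_step_fit(X_r @ X_r.T, X_c @ X_c.T, Y)
score = two_step_predict(A, X_r @ rng.normal(size=3), X_c @ rng.normal(size=3))
```

Because both steps are plain ridge regressions, each admits the usual closed-form leave-one-out shortcuts, which is what makes cross-validation over the cold start settings cheap compared with a single tensor product kernel model.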

Files

Name:
paperBis.pdf
Size:
439.63 KB
Format:
Adobe Portable Document Format