
Matrix Factorization and Relational Learning


If you have a question about this talk, please contact Zoubin Ghahramani.

Matrix factorization is one of the workhorse methods in data mining, machine learning, and information retrieval. We present a unified view of matrix factorization models, which includes weighted singular value decompositions, non-negative matrix factorization, probabilistic latent semantic indexing, max-margin matrix factorization, matrix co-clustering, and generalizations of these models to exponential family distributions. This unified view leads to a class of optimization algorithms, based on alternating projections and stochastic approximations, which are well-suited to models of large, sparse matrices.
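To make the alternating-projection idea concrete, here is a minimal sketch (not code from the talk) of alternating least squares on a partially observed matrix under squared loss; the toy data, sizes, and names are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem (invented for illustration): observe ~50% of the entries
# of a rank-2 matrix and fit a low-rank model to the observed cells.
m, n, k = 30, 20, 2
X_true = rng.normal(size=(m, k)) @ rng.normal(size=(k, n))
M = rng.random((m, n)) < 0.5            # mask: True where observed
X = np.where(M, X_true, 0.0)

U = rng.normal(scale=0.1, size=(m, k))
V = rng.normal(scale=0.1, size=(n, k))
lam = 1e-3                              # small ridge term keeps solves stable

def update_rows(A, B, mask, Xmat, lam):
    # For each row i of A, solve the ridge-regularized least-squares
    # problem min ||X[i, obs] - B[obs] @ A[i]||^2 + lam ||A[i]||^2.
    r = B.shape[1]
    for i in range(A.shape[0]):
        Bi = B[mask[i]]                 # factor rows for observed columns
        xi = Xmat[i, mask[i]]
        A[i] = np.linalg.solve(Bi.T @ Bi + lam * np.eye(r), Bi.T @ xi)

for _ in range(50):                     # alternate until the fit settles
    update_rows(U, V, M, X, lam)
    update_rows(V, U, M.T, X.T, lam)

err = np.abs(M * (X - U @ V.T)).max()   # residual on observed entries only
```

Each half-step is a convex least-squares solve in one factor with the other held fixed; swapping squared loss for another Bregman divergence changes only the per-row subproblem.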

Extending this unified view, many types of relational data can be represented as a set of related matrices, where shared dimensions correspond to shared factors in a low-rank representation. We extend Bregman matrix factorization to sets of related matrices, illustrating the use of relational learning on a collaborative filtering problem.
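The shared-factor idea can be sketched as follows (a squared-loss toy, not the paper's implementation): two fully observed matrices share one dimension, so the factor for that dimension must explain both. All sizes and names are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy relational schema (invented for illustration): a ratings matrix
# X (users x items) and a side-information matrix Y (items x features)
# share the item factor V, i.e. X ~ U V^T and Y ~ V Z^T.
users, items, feats, k = 25, 15, 10, 2
U0 = rng.normal(size=(users, k))
V0 = rng.normal(size=(items, k))
Z0 = rng.normal(size=(feats, k))
X = U0 @ V0.T
Y = V0 @ Z0.T

U = rng.normal(scale=0.1, size=(users, k))
V = rng.normal(scale=0.1, size=(items, k))
Z = rng.normal(scale=0.1, size=(feats, k))
I = 1e-3 * np.eye(k)                    # small ridge term

for _ in range(50):
    # U and Z each touch only one matrix: plain ridge least squares.
    U = np.linalg.solve(V.T @ V + I, V.T @ X.T).T
    Z = np.linalg.solve(V.T @ V + I, V.T @ Y).T
    # V couples both losses: row j of V must explain column j of X
    # (through U) and row j of Y (through Z), so stack the two systems.
    A = np.vstack([U, Z])               # (users + feats) x k
    C = np.vstack([X, Y.T])             # (users + feats) x items
    V = np.linalg.solve(A.T @ A + I, A.T @ C).T

err_x = np.abs(X - U @ V.T).max()
err_y = np.abs(Y - V @ Z.T).max()
```

The stacked solve for V is what ties the two factorizations together: information in the side matrix Y flows into the item representation used to reconstruct the ratings X.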

This talk is based primarily on two publications: Relational Learning via Collective Matrix Factorization (Singh & Gordon, KDD 2008), and A Unified View of Matrix Factorization Models (Singh & Gordon, ECML/PKDD 2008).

This talk is part of the Machine Learning @ CUED series.


© 2006-2019 Talks.cam, University of Cambridge.