Logistic and Softmax Regression, and their Relation to the Neural Network World
- Speaker: Adrian Scoica (University of Cambridge)
- Date & Time: Monday 19 May 2014, 14:00-15:00
- Venue: LT2, Computer Laboratory, William Gates Building
Abstract
Logistic Regression, along with its generalized counterpart Softmax Regression, is one of the most popular and best-performing generalized linear classification algorithms currently used in Machine Learning. In this lecture, I will explain some of the intuitions behind using and training these classifiers, and I will show how they are related to Neural Networks.
Using cleverly assembled examples from the harsh and unforgiving world of dating, I will reveal the probabilistic concepts that lie behind the logistic function. I will demonstrate how to train a binary logistic regression classifier using gradient descent, and I will show how those intuitions generalize naturally to the multi-class problem. Last but not least, we will see how these classifiers can be thought of as very simple Artificial Neural Networks, and thus can be used as layer components in more complicated Neural Network architectures.
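The training procedure described above can be sketched as batch gradient descent on the cross-entropy loss. This is a minimal illustrative sketch, not the speaker's actual material; the function names, learning rate, and toy data are assumptions for demonstration only.

```python
import numpy as np

def sigmoid(z):
    # The logistic function: maps any real-valued score to a probability in (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic_regression(X, y, lr=0.1, epochs=1000):
    """Fit weights w and bias b by batch gradient descent on the
    mean cross-entropy (negative log-likelihood) loss."""
    w = np.zeros(X.shape[1])
    b = 0.0
    n = len(y)
    for _ in range(epochs):
        p = sigmoid(X @ w + b)       # predicted probabilities for each example
        grad_w = X.T @ (p - y) / n   # gradient of the loss w.r.t. the weights
        grad_b = np.mean(p - y)      # gradient of the loss w.r.t. the bias
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy linearly separable data: one feature, boundary near x = 1.5.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])
w, b = train_logistic_regression(X, y)
preds = (sigmoid(X @ w + b) > 0.5).astype(int)
```

Viewed as a neural network, this is a single neuron with a sigmoid activation; replacing the sigmoid with a softmax over several output units yields the multi-class (softmax regression) case.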
Series: This talk is part of the Computer Laboratory Research Students' Lectures 2014 series.