Internal Seminar (NLIP Seminar Series, University of Cambridge)


If you have a question about this talk, please contact Tamara Polajnar.

This seminar is open to members of the NLIP group and to MPhil students doing NLP projects in the Computer Lab. It features four talks:

An Exploration of Discourse-Based Sentence Spaces for Compositional Distributional Semantics

Tamara Polajnar and Laura Rimell

We investigate whether the wider context in which a sentence is located can contribute to a distributional representation of sentence meaning. We compare a vector space for sentences in which the features are words occurring within the sentence, with two new vector spaces that only make use of surrounding context. Experiments on simple subject-verb-object similarity tasks show that all sentence spaces produce results that are comparable with previous work. However, qualitative analysis and user experiments indicate that extra-sentential contexts capture more diverse, yet topically coherent information.
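As a rough illustration of the contrast the abstract draws, a sentence can be represented either by the words it contains or by the words in its neighbouring sentences. The sketch below is invented for illustration (toy corpus, bag-of-words features, cosine similarity) and is not the authors' implementation:

```python
from collections import Counter

# Toy document: a list of tokenised sentences (invented example data).
doc = [
    ["dogs", "chase", "cats"],
    ["the", "cat", "ran", "up", "a", "tree"],
    ["pets", "often", "play", "outside"],
]

def internal_space(i):
    """Features are the words occurring within sentence i itself."""
    return Counter(doc[i])

def context_space(i, window=1):
    """Features are words from the surrounding sentences only
    (extra-sentential context), ignoring sentence i's own words."""
    feats = Counter()
    for j in range(max(0, i - window), min(len(doc), i + window + 1)):
        if j != i:
            feats.update(doc[j])
    return feats

def cosine(a, b):
    """Cosine similarity between two sparse bag-of-words vectors."""
    keys = set(a) | set(b)
    dot = sum(a[k] * b[k] for k in keys)
    na = sum(v * v for v in a.values()) ** 0.5
    nb = sum(v * v for v in b.values()) ** 0.5
    return dot / (na * nb) if na and nb else 0.0
```

With representations of both kinds in hand, sentence pairs can be compared in either space; the abstract's finding is that the context-only spaces remain competitive on similarity tasks while capturing more diverse, topically coherent features.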


What Happens Next? Suggesting Events in Narratives using Neural Event Representations

Mark Granroth-Wilding


Online Representation Learning in Recurrent Neural Language Models

Marek Rei

This work extends recurrent neural network language models with continuous online learning. The model keeps a separate vector representation of the current unit of text being processed and adaptively adjusts it after each prediction. Initial experiments give promising results, indicating that the method is able to increase language modelling accuracy while also decreasing the number of parameters needed to store the model and the computation required at each step.


What happens when you encode a dictionary or encyclopedia in a neural net

Felix Hill

This talk is part of the NLIP Seminar Series.


