
Handling obsolete information in classification: is there a one-size-fits-all strategy?


If you have a question about this talk, please contact Ekaterina Kochmar.

Classifier performance is known to deteriorate over time as the training data become obsolete. A particularly common instance is the phenomenon of “population drift”, whereby gradual changes in the characteristics of the population have a cumulative, detrimental effect on the classifier’s accuracy. In online learning settings, where the classifier is updated incrementally as new data arrive, this raises the question of whether new information should complement older data or replace it altogether. In this talk, we illustrate how such decisions hinge not only on the precise nature of the evolution exhibited by the data, but also on the methodology that underlies the classifier: different classifiers are affected by drift in different ways. This observation has been studied by various authors in the streaming-data literature, and in this talk we review and extend that work. We conclude by commenting on the suitability of one-size-fits-all strategies for handling drift, which monitor performance alone and are otherwise indifferent to the underlying methodology.
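To make the accumulate-versus-replace trade-off concrete, the sketch below is our own minimal illustration (not code from the talk): a one-dimensional two-class problem whose class means drift gradually over time, classified by a simple nearest-centroid rule. It compares fitting on the full history of batches against fitting on only the most recent few — under this simulated gradual drift, the windowed model tracks the current population more closely. The drift rate, batch size, and window length are arbitrary choices for the demonstration.

```python
import random

random.seed(0)

def make_batch(t, n=200):
    """One batch at time t: class 0 ~ N(0.05*t, 1), class 1 ~ N(0.05*t + 2, 1).
    Both class means drift upward with t, i.e. gradual population drift."""
    batch = []
    for _ in range(n):
        y = random.randint(0, 1)
        x = random.gauss(0.05 * t + 2 * y, 1.0)
        batch.append((x, y))
    return batch

def centroid_fit(batches):
    """'Train' a nearest-centroid classifier: store the per-class mean."""
    xs = {0: [], 1: []}
    for batch in batches:
        for x, y in batch:
            xs[y].append(x)
    return {c: sum(v) / len(v) for c, v in xs.items()}

def accuracy(model, batch):
    """Predict the class with the nearest centroid; return accuracy."""
    correct = sum(
        1 for x, y in batch
        if min(model, key=lambda c: abs(x - model[c])) == y
    )
    return correct / len(batch)

window = 3
history = [make_batch(t) for t in range(40)]
test = make_batch(40)  # the "current" population, after 40 steps of drift

full_model = centroid_fit(history)            # complement: keep all old data
win_model = centroid_fit(history[-window:])   # replace: recent window only

print(f"full history  : {accuracy(full_model, test):.3f}")
print(f"last {window} batches: {accuracy(win_model, test):.3f}")
```

The centroid rule is deliberately simple so the effect of stale data is easy to see; as the abstract notes, different classifier families respond to the same drift in different ways, so the size of this gap (and even its sign) depends on the underlying methodology, not just on the drift itself.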

This talk is part of the NLIP Seminar Series.



© 2006-2023 Talks.cam, University of Cambridge.