Scaling laws for large time-series models: More data, more parameters
- Speaker: James Alvey (KICC)
- Date & Time: Monday 28 October 2024, 16:00 - 17:00
- Venue: Martin Ryle Seminar Room, KICC
Abstract
Scaling laws for large language models (LLMs) offer valuable insights into how increasing model size and training data leads to predictable performance improvements. Time series forecasting, which shares a sequential structure similar to language, is well-suited to large-scale transformer architectures. In this talk, I will demonstrate that foundational decoder-only time series transformer models exhibit scaling behaviour analogous to LLMs. I will begin with a general introduction to scaling laws and how they can inform efficient, optimised model training. I will then focus on their specific application to time series data, highlighting the emergence of power law behaviour. Finally, I will discuss the broader implications of these findings, and potential scientific applications.
Related papers: https://arxiv.org/abs/2001.08361 https://arxiv.org/abs/2405.13867
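The power-law behaviour mentioned in the abstract can be illustrated with a short sketch. The constants below are invented for demonstration (not taken from the talk or the linked papers); the point is that a loss of the form L(N) = a·N^(−α) + b becomes a straight line in log-log space once the irreducible term b is subtracted, so the exponent α can be recovered with a simple linear fit:

```python
import numpy as np

# Hypothetical scaling-law curve: loss as a function of parameter count N.
# All constants here are illustrative, not fitted to any real model family.
a, alpha, b = 2.0, 0.076, 1.7
N = np.logspace(6, 10, 20)          # model sizes from 1e6 to 1e10 parameters
loss = a * N ** (-alpha) + b        # power law plus irreducible loss floor

# Subtracting the floor b linearises the relation:
#   log(loss - b) = log(a) - alpha * log(N)
slope, intercept = np.polyfit(np.log(N), np.log(loss - b), 1)
print(f"recovered exponent: {-slope:.3f}")  # recovers alpha = 0.076
```

In practice the floor b is unknown and must be fitted jointly with a and α (e.g. via nonlinear least squares), but the log-log picture above is how the power-law regime is usually diagnosed.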
Series: This talk is part of the Astro Data Science Discussion Group series.
Included in Lists
- Astro Data Science Discussion Group
- Cambridge Astronomy Talks
- Combined External Astrophysics Talks DAMTP
- Cosmology, Astrophysics and General Relativity
- Institute of Astronomy Extra Talks
- Institute of Astronomy Talk Lists
- Martin Ryle Seminar Room, KICC