Lightweight foundation model for time series classification

From ISLAB/CAISR
Latest revision as of 09:25, 13 October 2025

Title: Lightweight foundation model for time series classification
Summary: Lightweight foundation model for time series classification
Keywords:
TimeFrame: Fall 2025
References:
Prerequisites:
Author:
Supervisor: Aurora Esteban Toscano, Slawomir Nowaczyk
Level: Master
Status: Open


Time series data is everywhere, from industrial sensors and financial markets to healthcare monitoring and environmental systems. Classifying time series patterns is key to many applications, such as detecting equipment failures, predicting stock movements, or diagnosing medical conditions. However, training accurate models for time series classification (TSC) often requires large labeled datasets, which are expensive and time-consuming to obtain.

Foundation models have transformed fields like natural language processing and computer vision by enabling powerful generalization from large-scale pretraining. In time series, similar models have started to emerge, mainly for forecasting tasks. Such models can be adapted efficiently to new datasets and domains, which is a major advantage when data is scarce or labeling is difficult.

Existing time series foundation models primarily target forecasting (e.g., TimeGPT, Chronos), whereas classification-focused foundation models remain underexplored. Recent efforts like "Moment" demonstrate the potential of pretraining on large, diverse time series collections, but most current approaches rely on transformer architectures, which are computationally heavy and memory-intensive.

This project proposes to develop a fast, lightweight foundation model for time series classification, following principles introduced by efficient architectures such as "Tiny Time Mixers". The objective is to explore whether compact, mixer-style models can capture rich temporal representations while maintaining strong generalization and computational efficiency.
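To give a concrete feel for what "mixer-style" means here, the following minimal NumPy sketch shows the two core operations of such architectures: mixing information across time steps, then across channels, followed by pooling for classification. All names, dimensions, and the use of single dense mixing matrices are illustrative simplifications, not the actual Tiny Time Mixers design.

```python
import numpy as np

def mixer_block(x, w_time, w_feat):
    """One mixer-style block for a (time, channels) series:
    first mix information across time steps, then across channels."""
    x = x + np.tanh(w_time @ x)   # time mixing: (T, T) @ (T, C) -> (T, C)
    x = x + np.tanh(x @ w_feat)   # channel mixing: (T, C) @ (C, C) -> (T, C)
    return x

def classify(x, w_time, w_feat, w_out):
    """Pool the mixed representation over time and project to class logits."""
    h = mixer_block(x, w_time, w_feat)
    return h.mean(axis=0) @ w_out  # (C,) @ (C, K) -> (K,) logits

rng = np.random.default_rng(0)
T, C, K = 32, 8, 3                       # time steps, channels, classes
x = rng.normal(size=(T, C))              # one multivariate time series
w_time = rng.normal(scale=0.1, size=(T, T))
w_feat = rng.normal(scale=0.1, size=(C, C))
w_out = rng.normal(scale=0.1, size=(C, K))

logits = classify(x, w_time, w_feat, w_out)
print(logits.shape)  # (3,)
```

Unlike self-attention, these mixing steps involve only fixed matrix multiplications with no pairwise attention weights, which is one reason mixer-style models tend to be lighter in compute and memory.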

The work will be done in connection with the KEEPER project, using data from industrial partners such as Volvo, HMS, and Toyota: https://www.hh.se/english/research/our-research/research-at-the-school-of-information-technology/technology-area-aware-intelligent-systems/research-projects-within-aware-intelligent-systems/keeper---knowledge-creation-for-efficient-and-predictable-industrial-operations-.html