Internship in Foundation Models for Time Series Data


Facing the Challenges of our time

Help us grow and be more impactful!

The “Predictive Analytics” group, based in Alpnach, Switzerland, is currently looking for an intern in Foundation Models for Time Series Data.

Foundation models have revolutionized natural language processing (NLP) and computer vision [1,2]. These models leverage vast amounts of data and computational resources to train large-scale networks, demonstrating remarkable generalizability and enabling applications such as ChatGPT and Gemini. However, time series analysis has not yet fully benefited from this revolution. In this field, the standard approach still involves retraining a single model for each specific use case and task. One of the primary reasons for this is the nature of the data. While time series data is ubiquitous, it is often difficult to interpret and highly heterogeneous.

Nonetheless, recent studies have shown the potential of the self-supervised pre-training paradigm for univariate time series [3,4]. Additionally, other studies have demonstrated that large language models (LLMs) can also handle time series data with minimal adaptation [5,6].
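One minimal adaptation in the spirit of these text-based approaches is to serialize the series as a string of fixed-precision numbers that an off-the-shelf LLM can complete, then parse the continuation back into values. The helper names and formatting choices below are our own illustrative sketch, not code from the cited works:

```python
import numpy as np

def serialize_series(values, decimals=2, sep=", "):
    """Render a univariate series as fixed-precision text, so an
    off-the-shelf LLM can ingest it as part of a prompt."""
    return sep.join(f"{v:.{decimals}f}" for v in values)

def deserialize_series(text, sep=", "):
    """Parse a textual (model-generated) continuation back into floats."""
    return np.array([float(tok) for tok in text.split(sep) if tok.strip()])

history = np.sin(np.linspace(0, 3, 8))
prompt = serialize_series(history)      # e.g. "0.00, 0.42, ..."
recovered = deserialize_series(prompt)  # floats again, at 2-decimal precision
```

How values are rounded, separated, and chunked into tokens by the LLM's tokenizer is exactly the kind of design decision the internship would examine.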

While these studies are still preliminary, they represent promising steps toward the utilization of foundation models for time series analysis. At CSEM, where we work daily on projects involving time series data, a universal model for this modality would be highly beneficial. Consequently, we aim to explore this topic further in this Master's thesis. Specifically, we are interested in addressing the following open points:

1. Tokenization and Embedding Representation: What is the best tokenization and embedding representation for time series data? Should numbers be treated as text, or is a continuous representation more effective?

2. In-Context Learning Capabilities of Foundation Models for Time Series: What are the few-shot learning limits of foundation models for time series? What is the optimal way to condition the model with examples?

3. General Handling of Multivariate Time Series: What pre-training strategies can we adopt to create a highly generalizable model that does not require a predefined number of input channels?

By investigating these questions, we hope to advance the application of foundation models in time series analysis, ultimately contributing to more efficient and effective modeling techniques in this domain.
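To make the first question concrete, the "continuous representation" alternative can be sketched as patch-based embedding in the spirit of the self-supervised pre-training work cited above: the series is cut into fixed-length patches, and each patch is linearly projected to a token embedding that a Transformer could consume. The shapes and function names below are illustrative assumptions, not a prescribed design:

```python
import numpy as np

rng = np.random.default_rng(0)

def patch_embed(series, patch_len, d_model):
    """Split a univariate series into non-overlapping patches and linearly
    project each patch to a d_model-dimensional token embedding."""
    n_patches = len(series) // patch_len
    patches = series[: n_patches * patch_len].reshape(n_patches, patch_len)
    W = rng.normal(scale=patch_len ** -0.5, size=(patch_len, d_model))
    b = np.zeros(d_model)
    return patches @ W + b  # (n_patches, d_model) "token" sequence

tokens = patch_embed(np.sin(np.linspace(0, 10, 96)), patch_len=16, d_model=64)
print(tokens.shape)  # (6, 64)
```

Comparing such continuous embeddings against text-based serialization, across patch lengths and normalization schemes, is one natural starting experiment for point 1.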



We are seeking a Master's student with a background in Machine Learning and Deep Learning. Proficiency in Python is essential. The ideal candidate is an independent and eager learner. Previous experience publishing scientific papers would be highly advantageous; publications at venues such as ICML, NeurIPS, AAAI, or ICLR are of particular interest. The candidate must be willing to work on-site at CSEM (Alpnach) at least three days per week.



The position starts in September/October 2024 and has a duration of six months.

CSEM mission and values

Our mission is to develop innovative technologies and transfer them to Swiss industry. Our objective is to make an impact on our customers and on society at large in the fields of precision manufacturing, digital technologies, and sustainable energy. Our strength is the excellence of our people: about 550 passionate specialists dedicated to innovation and technology transfer. We believe that strong values support the successful development of our organization as well as the harmonious and balanced development of all our employees.

We are

  • A unique place between research and industry at the cutting edge of new technologies
  • An innovative, non-profit, and employee-driven company
  • A dynamic, multidisciplinary, and multicultural environment
  • A team focused on enabling solutions to energy challenges for a sustainable world

Working@CSEM means

  • being part of a passionate community
  • enjoying incredible flexibility, attractive working conditions, and great opportunities for development
  • benefiting from a management style based on trust and feedback that favors work-life balance

We are committed to equal employment opportunity regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, or gender identity.

We look forward to receiving your complete application file (CV, cover letter, certificates & diplomas) via our job page.

Preference will be given to professionals applying directly.