
[TMLR 2024] NuTime

NuTime: Numerically Multi-Scaled Embedding for Large-Scale Time-Series Pretraining

Chenguo Lin, Xumeng Wen, Wei Cao, Congrui Huang, Jiang Bian, Stephen Lin, Zhirong Wu

OpenReview | arXiv | License: MIT

Figure: the overall pipeline of NuTime.

This repository contains the official implementation of the paper NuTime: Numerically Multi-Scaled Embedding for Large-Scale Time-Series Pretraining, accepted to TMLR 2024. In this work, we propose the NuTime model for large-scale time-series pretraining. The model is based on the Transformer architecture and takes as input a set of tokens derived from non-overlapping windows of the series. Each window is represented by its normalized shape, its mean, and its standard deviation. We develop a numerically multi-scaled embedding (NME) method for representing the scalar values of the window mean and std, so the model can take raw time-series values as input without any data normalization or transformation. A minimal sketch of these two components is shown below.
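For concreteness, here is a minimal PyTorch sketch of the two ideas described above: splitting a raw series into non-overlapping window tokens (normalized shape, mean, std), and embedding the scalar statistics at multiple numerical scales. The helper names (`window_tokens`, `MultiScaledScalarEmbedding`), the base-2 scale set, the tanh squashing, and the weighted-sum aggregation are illustrative assumptions, not the exact formulation from the paper; please refer to the released embedding code for the authoritative version.

```python
import torch
import torch.nn as nn


class MultiScaledScalarEmbedding(nn.Module):
    """Illustrative multi-scaled embedding for a raw scalar (e.g. a window
    mean or std). Each numerical scale has a learnable embedding vector;
    tanh saturates per scale, so each scale responds to a magnitude band
    and the scalar needs no prior normalization."""

    def __init__(self, dim: int = 64, num_scales: int = 16, base: float = 2.0):
        super().__init__()
        # One learnable embedding vector per numerical scale.
        self.scale_embeds = nn.Parameter(torch.randn(num_scales, dim) * 0.02)
        # Scales spanning several orders of magnitude, e.g. base**-8 .. base**7
        # (the choice of base and range is an assumption for illustration).
        exponents = torch.arange(num_scales).float() - num_scales // 2
        self.register_buffer("scales", base ** exponents)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (..., ) raw scalar values; returns (..., dim) embeddings.
        weights = torch.tanh(x.unsqueeze(-1) / self.scales)  # (..., num_scales)
        return weights @ self.scale_embeds                   # (..., dim)


def window_tokens(series: torch.Tensor, window: int):
    """Split a raw series (batch, length) into non-overlapping windows and
    return each window's normalized shape, mean, and std."""
    b, t = series.shape
    n = t // window
    w = series[:, : n * window].reshape(b, n, window)
    mean = w.mean(dim=-1, keepdim=True)
    std = w.std(dim=-1, keepdim=True)
    shape = (w - mean) / (std + 1e-8)  # zero-mean, unit-std window shape
    return shape, mean.squeeze(-1), std.squeeze(-1)  # (b,n,window), (b,n), (b,n)
```

A quick usage example under the same assumptions: for a raw series of shape `(4, 512)` and `window=32`, `window_tokens` yields 16 window tokens per series, and `MultiScaledScalarEmbedding` maps the `(4, 16)` means and stds to `(4, 16, 64)` embeddings that can be combined with the shape tokens and fed to the Transformer.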

Feel free to contact me (chenguolin@stu.pku.edu.cn) or open an issue if you have any questions or suggestions.

📢 News

  • 2024-07-15: It may take some time to clean up the entire codebase for release (the code is under review by Microsoft), so we first provide the code for the window, mean and std embeddings, the essential part of the proposed NuTime, here.
  • 2024-07-10: NuTime is accepted to TMLR 2024.

📋 TODO

  • Release the training and evaluation code
  • Release the self-supervised pretrained NuTime
  • Release the large-scale merged datasets for pretraining

📚 Citation

If you find our work helpful, please consider citing:

@article{lin2024nutime,
  title={NuTime: Numerically Multi-Scaled Embedding for Large-Scale Time-Series Pretraining},
  author={Chenguo Lin and Xumeng Wen and Wei Cao and Congrui Huang and Jiang Bian and Stephen Lin and Zhirong Wu},
  journal={Transactions on Machine Learning Research (TMLR)},
  year={2024}
}
