This repo is a preview version and has not been fully tested yet. Feel free to open an issue if you run into problems!
🚩 (2025.09) Our paper has been accepted at NeurIPS 2025!
🚩 (2025.05) Our preprint is available on arXiv.
First, all datasets can be downloaded from Google Drive, and the dataset path can be specified with --root_path ./dataset/.
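Once downloaded, place the files under ./dataset/. A minimal sketch of the assumed layout (file names follow the standard benchmark CSVs and are illustrative, not confirmed by this repo):

dataset/
├── ETTh1.csv
├── ETTh2.csv
├── ETTm1.csv
├── ETTm2.csv
└── weather.csv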
Full Finetune:
sh example/post-training/finetune/Chronos.sh "ETTh1 ETTh2 ETTm1 ETTm2 weather"
sh example/post-training/finetune/test/Chronos.sh "ETTh1 ETTh2 ETTm1 ETTm2 weather"
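The quoted argument is a space-separated dataset list, so a subset should also work. For example, to finetune and test on ETTh1 only (a minimal sketch, assuming the scripts accept any subset of the datasets above):

sh example/post-training/finetune/Chronos.sh "ETTh1"
sh example/post-training/finetune/test/Chronos.sh "ETTh1"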
Prune-then-Finetune:
sh example/post-training/prune-then-finetune/prune/Chronos.sh "ETTh1 ETTh2 ETTm1 ETTm2 weather"
sh example/post-training/prune-then-finetune/finetune/Chronos.sh "ETTh1 ETTh2 ETTm1 ETTm2 weather"
sh example/post-training/prune-then-finetune/test/Chronos.sh "ETTh1 ETTh2 ETTm1 ETTm2 weather"
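The prune, finetune, and test steps are presumably meant to run in this order, with each stage consuming the previous stage's output. For example, the full pipeline on ETTh1 only (again assuming single-dataset arguments are accepted):

sh example/post-training/prune-then-finetune/prune/Chronos.sh "ETTh1"
sh example/post-training/prune-then-finetune/finetune/Chronos.sh "ETTh1"
sh example/post-training/prune-then-finetune/test/Chronos.sh "ETTh1"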
All scripts can be found in example/benchmark/.

If you find our code helpful, please consider citing our paper:
@inproceedings{less-is-more-prune-then-finetune,
  title={Less is More: Unlocking Specialization of Time Series Foundation Models via Structured Pruning},
  author={Lifan Zhao and Yanyan Shen and Zhaoyang Liu and Xue Wang and Jiaji Deng},
  booktitle={The Thirty-ninth Annual Conference on Neural Information Processing Systems},
  year={2025},
  url={https://openreview.net/forum?id=jy4bBsr1Jc}
}