A comprehensive summary of recent LLM-based time series forecasting models published at top venues (NeurIPS, ICLR, ICML, AAAI, etc.) that have released their source code.
・All models are LLM-based or LLM-empowered for time series tasks.
・All models have released official GitHub code (at least training/inference code).
・Most models focus on forecasting; a few also cover classification, anomaly detection, or multi-modal retrieval.
・The latest trend is foundation modeling and cross-modal learning for time series using LLMs.
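Several of the zero-shot entries below (e.g. LLMTime) rest on one simple idea: serialize a numeric series as plain text so a frozen LLM can continue it, then parse the continuation back into numbers. A minimal round-trip sketch of that serialization step (function names and formatting choices are illustrative, not taken from any listed repo):

```python
def serialize(values, precision=2, sep=", "):
    """Encode a numeric series as a plain-text string an LLM can consume."""
    return sep.join(f"{v:.{precision}f}" for v in values)

def deserialize(text, sep=", "):
    """Parse a textual continuation back into a list of floats."""
    return [float(tok) for tok in text.split(sep) if tok.strip()]

series = [0.1, 0.25, 0.4]
prompt = serialize(series)            # "0.10, 0.25, 0.40"
assert deserialize(prompt) == series  # lossless at this precision
```

In practice the listed papers add careful tokenization tricks (digit spacing, rescaling) on top of this idea, since LLM tokenizers split numbers inconsistently.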
・FSTLLM FSTLLM: Spatio-Temporal LLM for Few Shot Time Series Forecasting [ICML 2025] [Code]
・LLM4TS LLM4TS: Aligning Pre-Trained LLMs as Data-Efficient Time-Series Forecasters [ACM TIST 2025] [Code]
・TimeCMA TimeCMA: Towards LLM-Empowered Multivariate Time Series Forecasting via Cross-Modality Alignment [AAAI 2025] [Code]
・ChatTime ChatTime: A Unified Multimodal Time Series Foundation Model Bridging Numerical and Textual Data [AAAI 2025] [Code]
・FSCA Context-Alignment: Activating and Enhancing LLMs Capabilities in Time Series [ICLR 2025] [Code]
・Timer-XL Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [ICLR 2025] [Code]
・Time-MoE Time-MoE: Billion-Scale Time Series Foundation Models With Mixture Of Experts [ICLR 2025] [Code]
・ICTSP In-context Time Series Predictor [ICLR 2025] [Code]
・UniTime UniTime: A Language-Empowered Unified Model for Cross-Domain Time Series Forecasting [WWW 2024] [Code]
・Chronos Chronos: Learning the Language of Time Series [TMLR 2024] [Code]
・LMTraj Can Language Beat Numerical Regression? Language-Based Multimodal Trajectory Prediction [CVPR 2024] [Code]
・TEST TEST: Text Prototype Aligned Embedding to Activate LLM's Ability for Time Series [ICLR 2024] [Code]
・S²IP-LLM S²IP-LLM: Semantic Space Informed Prompt Learning with LLM for Time Series Forecasting [ICML 2024] [Code]
・aLLM4TS Multi-Patch Prediction: Adapting LLMs for Time Series Representation Learning [ICML 2024] [Code]
・ST-LLM ST-LLM: Spatial-Temporal Large Language Model for Traffic Prediction [MDM 2024] [Code]
・Moment MOMENT: A Family of Open Time-series Foundation Models [ICML 2024] [Code]
・Timer Timer: Generative Pre-trained Transformers Are Large Time Series Models [ICML 2024] [Code]
・Moirai Unified Training of Universal Time Series Forecasting Transformers [ICML 2024] [Code]
・GPT4MTS Prompt-based Large Language Model for Multimodal Time-Series Forecasting [AAAI 2024] [Code]
・TimesFM A decoder-only foundation model for time-series forecasting [ICML 2024] [Code]
・From_News_to_Forecast From News to Forecast: Integrating Event Analysis in LLM-Based Time Series Forecasting with Reflection [NeurIPS 2024] [Code]
・TEMPO TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting [ICLR 2024] [Code]
・AutoTimes AutoTimes: Autoregressive Time Series Forecasters via Large Language Models [NeurIPS 2024] [Code]
・Time-LLM Time-LLM: Time Series Forecasting by Reprogramming Large Language Models [ICLR 2024] [Code]
・TimeGPT-1 TimeGPT-1 [Nixtla 2024] [Code]
・OFA One Fits All: Power General Time Series Analysis by Pretrained LM [NeurIPS 2023] [Code]
・LLMTime Large Language Models Are Zero-Shot Time Series Forecasters [NeurIPS 2023] [Code]
・Lag-Llama Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting [NeurIPS 2023 Workshop] [Code]
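The foundation models above (Timer, Moirai, Time-MoE, MOMENT, etc.) generally tokenize a raw series into fixed-length patches before feeding it to a Transformer, so each "token" is a short window of values rather than a single point. A hedged NumPy sketch of that patching step (the function name, patch length, and stride are illustrative; real models differ in normalization and padding):

```python
import numpy as np

def patchify(series: np.ndarray, patch_len: int, stride: int) -> np.ndarray:
    """Split a 1-D series into (possibly overlapping) fixed-length patches,
    which serve as the input 'tokens' of a time series Transformer."""
    n_patches = (len(series) - patch_len) // stride + 1
    return np.stack([series[i * stride : i * stride + patch_len]
                     for i in range(n_patches)])

x = np.arange(12, dtype=float)      # toy series of length 12
tokens = patchify(x, patch_len=4, stride=2)
print(tokens.shape)                 # (5, 4): five patch tokens of length 4
```

Patching shortens the effective sequence length seen by attention, which is what makes long-context pretraining (e.g. Timer-XL) tractable.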