---
title: FinText-TSFM
emoji: 📈
colorFrom: gray
colorTo: blue
sdk: static
pinned: false
---
[![SSRN](https://img.shields.io/badge/SSRN-5770562-1a5dab?logo=ssrn&logoColor=white)](https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5770562)
[![arXiv](https://img.shields.io/badge/arXiv-2511.18578-b31b1b?logo=arxiv&logoColor=white)](https://www.arxiv.org/abs/2511.18578)
[![ResearchGate](https://img.shields.io/badge/ResearchGate-Paper-00CCBB?logo=researchgate&logoColor=white)](https://www.researchgate.net/publication/397872068_ReVisiting_Time_Series_Foundation_Models_in_Finance)
[![Website - FinText.ai](https://img.shields.io/badge/Website-FinText.ai-0A66C2?logo=google-chrome&logoColor=white)](https://fintext.ai)
[![GitHub - FinText.ai](https://img.shields.io/badge/GitHub-FinText.ai-181717?logo=github&logoColor=white)](https://github.com/DeepIntoStreams/TSFM_Finance)
## 🎤 Podcast
You can now listen to the accompanying podcast here: https://soundcloud.com/eghbal-rahimikia/revisiting-time-series-foundation-models-in-finance
## 🆕 GitHub Model Loading Support
All models can now be loaded directly from GitHub. The repository below includes loading utilities and setup instructions.
🔗 **https://github.com/DeepIntoStreams/TSFM_Finance**
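As a starting point, here is a minimal zero-shot forecasting sketch. It assumes the downloaded checkpoints can be read with the open-source `chronos-forecasting` package; the checkpoint path is a placeholder, and the repository's own utilities and setup instructions take precedence.

```python
import torch
from chronos import ChronosPipeline  # pip install chronos-forecasting

# Hypothetical local path to a FinText-TSFM Chronos checkpoint, downloaded
# per the repository's setup instructions; substitute the real path.
CHECKPOINT = "path/to/fintext-chronos-tiny-2023"

pipeline = ChronosPipeline.from_pretrained(
    CHECKPOINT,
    device_map="cpu",          # or "cuda" if a GPU is available
    torch_dtype=torch.float32,
)

# Context window of daily excess returns (random placeholders here).
context = torch.randn(252)     # roughly one trading year of observations

# Zero-shot forecast of the next 5 days; output has shape
# [num_series, num_samples, prediction_length].
forecast = pipeline.predict(context, prediction_length=5)
print(forecast.shape)
```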
## 🚀 TSFMs Release
We are pleased to introduce **FinText-TSFM**, a comprehensive suite of **time series foundation models (TSFMs)** comprising 613 models pre-trained for quantitative finance. This release accompanies the paper:
**[*Re(Visiting) Time Series Foundation Models in Finance*](https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5770562)** by *Eghbal Rahimikia, Hao Ni, and Weiguan Wang (2025)*.
### 💡 Key Highlights
- **Finance-Native Pre-training:**
Models are pre-trained **from scratch** on large-scale financial time series datasets, including daily excess returns across **89 markets** and **over 2 billion observations**, to ensure full temporal and domain alignment.
- **Bias-Free Design:**
Pre-training strictly follows a **chronological expanding-window setup**, avoiding any **look-ahead bias** or **information leakage** (see the sketch after this list).<br>
Each variant includes 23 separately pre-trained models, corresponding to each year from **2000** to **2023**, with training data starting in 1990.
- **Model Families:**
This release includes variants of **Chronos** and **TimesFM** architectures adapted for financial time series:
- Chronos-Tiny (8M) / Mini (20M) / Small (46M)
- TimesFM-8M / 20M
- **Model Collections:**
  - **U.S.:** Covers U.S. market-wide excess returns from 2000 to 2023, with one pre-trained model per year.
  - **Global:** Covers excess returns across **94 global markets** from 2000 to 2023, with one pre-trained model per year.
  - **Augmented:** Extends the global data with **augmented factors** from 2000 to 2023, with one pre-trained model per year.
  - The remaining **253 pre-trained models** are available for download via the [**FinText.ai Portal**](https://fintext.ai). These include models pre-trained with varying **hyperparameter configurations** for extended experimentation and performance comparison.
- **Performance Insights:**
Our findings show that **off-the-shelf TSFMs** underperform in zero-shot forecasting, while **finance-pretrained models** achieve large gains in both predictive accuracy and portfolio performance.
- **Evaluation Scope:**
Models are benchmarked across the **U.S. and seven international markets**, using rolling windows of **5, 21, 252, and 512 days**, with over **18 million out-of-sample forecasts** spanning **22 years (2001–2023)** of daily excess returns, evaluated at both the **statistical** and **economic** levels.
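To make the expanding-window protocol above concrete, the sketch below generates the yearly pre-training cutoffs: each window starts in 1990 and grows by one year, so a model pre-trained at cutoff year Y never sees data after Y. The function name and exact calendar boundaries are illustrative assumptions, not the repository's implementation.

```python
from typing import Iterator, Tuple

import pandas as pd


def expanding_window_cutoffs(
    start_year: int = 1990,
    first_cutoff: int = 2000,
    last_cutoff: int = 2023,
) -> Iterator[Tuple[pd.Timestamp, pd.Timestamp]]:
    """Yield (train_start, train_end) for each yearly pre-training cutoff.

    The training window always begins in `start_year` and extends by one
    calendar year per cutoff, so no model ever sees data from after its
    cutoff: no look-ahead bias, no information leakage.
    """
    train_start = pd.Timestamp(f"{start_year}-01-01")
    for year in range(first_cutoff, last_cutoff + 1):
        yield train_start, pd.Timestamp(f"{year}-12-31")


for train_start, train_end in expanding_window_cutoffs():
    print(f"cutoff {train_end.year}: train {train_start.date()} -> {train_end.date()}")
```

At evaluation time, each cutoff-year model is then applied only to data after its training window, using rolling context windows of 5, 21, 252, or 512 trading days.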
### 🧠 Technical Overview
- **Architecture:** Transformer-based TSFMs (Chronos & TimesFM)
- **Compute:** 50,000 GPU hours on NVIDIA GH200 Grace Hopper clusters
### 📚 Citation
Please cite the accompanying paper if you use these models:
> **Re(Visiting) Time Series Foundation Models in Finance.**
> **Rahimikia, Eghbal; Ni, Hao; Wang, Weiguan (2025).**
> SSRN: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5770562
### 🔋 Acknowledgments
This project was made possible through computational and institutional support from:
- **UK Research and Innovation (UKRI)**
- **Isambard-AI National AI Research Resource (AIRR)**
- **Alliance Manchester Business School (AMBS), University of Manchester**
- **N8 Centre of Excellence in Computationally Intensive Research (N8 CIR)**
- **The University of Manchester** (Research IT & Computational Shared Facility)
- **University College London (UCL)**
- **The Alan Turing Institute**
- **Shanghai University**
---
<div style="text-align:center; margin:auto; max-width:800px;">
<!-- Developed by -->
<div style="margin-bottom:12px;">
<p style="font-weight:bold; font-size:1.1em; margin:4px 0;">Developed by:</p>
<div style="display:flex; justify-content:center; align-items:center; gap:20px; flex-wrap:wrap; margin-bottom:15px;">
<img src="https://fintext.ai/UoM-logo.svg" alt="University of Manchester Logo" width="210" style="display:block; margin:0;">
<img src="https://fintext.ai/UCL-logo.jpg" alt="UCL Logo" width="100" style="display:block; margin:0;">
</div>
<p style="font-size:0.8em; margin-top:0; line-height:1.3;">
Alliance Manchester Business School, University of Manchester<br>
Department of Mathematics, University College London (UCL)
</p>
</div>
<!-- Powered by -->
<div>
<p style="font-weight:bold; font-size:1.1em; margin:4px 0;">Powered by:</p>
<div style="display:flex; justify-content:center; align-items:center; gap:20px; flex-wrap:wrap; margin-bottom:10px;">
<img src="https://fintext.ai/BriCS-logo.png" alt="BriCS Logo" width="180" style="display:block; margin:0;">
<img src="https://fintext.ai/N8_bede_logo.webp" alt="N8 Bede Logo" width="140" style="display:block; margin:0;">
</div>
<p style="font-size:0.8em; margin-top:0; line-height:1.3;">
Isambard-AI, Bristol Centre for Supercomputing (BriCS)<br>
The Bede Supercomputer
</p>
</div>
</div>