Model card
iTransformer.
THUML · open-source · Transformer (inverted attention)
Inverts the standard Transformer: attention is applied across variate (channel) tokens to model multivariate correlations, rather than across time steps.
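A minimal sketch of the inversion, assuming a PyTorch-style encoder block (module names and dimensions are illustrative, not the official THUML implementation): each variate's full series is embedded as one token, so self-attention mixes channels while the feed-forward network models each channel's dynamics.

```python
import torch
import torch.nn as nn

class InvertedAttentionBlock(nn.Module):
    """Sketch of the inverted-token idea: each variate's whole series
    becomes one token, so self-attention mixes channels, not time steps."""

    def __init__(self, seq_len: int, d_model: int, n_heads: int = 8):
        super().__init__()
        self.embed = nn.Linear(seq_len, d_model)          # whole series -> one variate token
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ffn = nn.Sequential(
            nn.Linear(d_model, 4 * d_model), nn.GELU(), nn.Linear(4 * d_model, d_model)
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_variates) -> tokens: (batch, n_variates, d_model)
        tokens = self.embed(x.transpose(1, 2))
        attn_out, _ = self.attn(tokens, tokens, tokens)   # attention across variates
        tokens = self.norm1(tokens + attn_out)
        tokens = self.norm2(tokens + self.ffn(tokens))    # FFN encodes per-variate dynamics
        return tokens

# Example: a 96-step lookback window with 21 weather variables
block = InvertedAttentionBlock(seq_len=96, d_model=128)
out = block(torch.randn(4, 96, 21))                       # -> (4, 21, 128)
```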
§ 01 · Benchmarks
Every benchmark iTransformer has a recorded score for.
| # | Benchmark | Area · Task | Metric | Value | Rank | Date | Source |
|---|---|---|---|---|---|---|---|
| 01 | Weather | Time Series · Time Series Forecasting | MAE | 0.3 | #5 | 2024-05-07 | source ↗ |
| 02 | Weather | Time Series · Time Series Forecasting | MSE | 0.3 | #5 | 2024-05-07 | source ↗ |
| 03 | M4 Competition | Time Series · Time Series Forecasting | MASE | 1.8 | #5 | 2023-10-10 | source ↗ |
| 04 | M4 Competition | Time Series · Time Series Forecasting | OWA | 0.9 | #5 | 2023-10-10 | source ↗ |
| 05 | M4 Competition | Time Series · Time Series Forecasting | sMAPE | 12.7% | #6 | 2023-10-10 | source ↗ |
The Rank column shows this model's position among all models scored on the same benchmark and metric; rank #1 marks the current SOTA. Rows are sorted by rank, then by newest result.
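For reference, a sketch of the M4 metrics named above using their standard definitions (illustrative NumPy functions, not the exact scoring script behind the table):

```python
import numpy as np

def smape(y_true, y_pred):
    """Symmetric MAPE (M4 convention, reported as a percentage)."""
    return 200.0 * np.mean(np.abs(y_true - y_pred) / (np.abs(y_true) + np.abs(y_pred)))

def mase(y_true, y_pred, y_train, season=1):
    """Mean Absolute Scaled Error: forecast MAE scaled by the in-sample
    seasonal-naive MAE (season=1 reduces to the plain naive forecast)."""
    naive_mae = np.mean(np.abs(y_train[season:] - y_train[:-season]))
    return np.mean(np.abs(y_true - y_pred)) / naive_mae

# OWA averages sMAPE and MASE after normalising each by the Naive2 baseline:
# owa = 0.5 * (smape_model / smape_naive2 + mase_model / mase_naive2)
```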
§ 03 · Papers
2 papers with results for iTransformer.
- 2024-05-07 · Time Series · 2 results
  iTransformer: Inverted Transformers Are Effective for Time Series Forecasting
  Yong Liu, Tengge Hu, Haoran Zhang, Ling Jin et al.
- 2023-10-10 · Time Series · 3 results
  iTransformer: Inverted Transformers Are Effective for Time Series Forecasting
  Yong Liu, Tengge Hu, Haoran Zhang, Ling Jin et al.
§ 04 · Related models
Other THUML models scored on Codesota.
§ 05 · Sources & freshness
Where these numbers come from.
- TimeMixer++ Table 2 · 3 results
- iTransformer Table 1 · 2 results
5 of 5 rows marked verified · first result 2023-10-10, latest result 2024-05-07.