Seventy3: using NotebookLM to turn papers into podcasts, so everyone can learn alongside AI.
Today's topic:
iTransformer: Inverted Transformers Are Effective for Time Series Forecasting
Summary
The paper introduces iTransformer, a novel architecture for time series forecasting that inverts the standard Transformer structure. Instead of embedding multiple variates at each timestamp, iTransformer embeds each time series individually as a token, applying attention to capture multivariate correlations and feed-forward networks to learn series-specific representations. This approach achieves state-of-the-art results on several real-world datasets, showcasing improved performance and generalization compared to existing Transformer-based and linear models, particularly with longer lookback windows. The authors provide extensive experimental results and analysis to support their claims.
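The inverted design described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: random matrices stand in for learned parameters, the layer sizes are arbitrary, and normalization layers are omitted. It only shows the shape flow, where each variate's whole lookback series becomes one token, attention runs across variate tokens, and a feed-forward network refines each token before projection to the horizon.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def inverted_forecast(x, horizon, d_model=16):
    # x: (T, N) lookback window with T timestamps and N variates.
    T, N = x.shape
    # Random weights stand in for learned parameters (illustration only).
    W_embed = rng.normal(0, T ** -0.5, (T, d_model))
    Wq = rng.normal(0, d_model ** -0.5, (d_model, d_model))
    Wk = rng.normal(0, d_model ** -0.5, (d_model, d_model))
    Wv = rng.normal(0, d_model ** -0.5, (d_model, d_model))
    W_ffn1 = rng.normal(0, d_model ** -0.5, (d_model, 4 * d_model))
    W_ffn2 = rng.normal(0, (4 * d_model) ** -0.5, (4 * d_model, d_model))
    W_proj = rng.normal(0, d_model ** -0.5, (d_model, horizon))

    # Inverted embedding: each variate's entire series becomes one token.
    tokens = x.T @ W_embed                       # (N, d_model)
    # Attention across variate tokens captures multivariate correlations.
    q, k, v = tokens @ Wq, tokens @ Wk, tokens @ Wv
    attn = softmax(q @ k.T / np.sqrt(d_model))   # (N, N)
    tokens = tokens + attn @ v
    # Feed-forward per token learns series-specific representations.
    tokens = tokens + np.maximum(tokens @ W_ffn1, 0) @ W_ffn2
    # Project each variate token onto the forecast horizon.
    return (tokens @ W_proj).T                   # (horizon, N)

y = inverted_forecast(rng.normal(size=(96, 7)), horizon=24)
print(y.shape)  # (24, 7)
```

Note the contrast with a standard Transformer forecaster, which would embed the N variates at each of the T timestamps into T temporal tokens; here there are only N tokens, one per series, so attention cost scales with the number of variates rather than the lookback length.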
Original paper: arxiv.org