Long-term time series forecasting is a long-standing problem that arises in a wide range of applied settings.
Transformer-based models show promising results. However, their high computational complexity and memory requirements make it difficult to apply the Transformer to long sequences. This has given rise to numerous studies devoted to reducing the computational cost of the Transformer architecture.
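To make this cost concrete (a standard back-of-the-envelope estimate, not tied to any specific model discussed here), vanilla self-attention scores every pair of time steps, so for an input window of length $L$ and model dimension $d$ its per-layer time and memory requirements scale as
\[
\underbrace{\mathcal{O}(L^{2} d)}_{\text{time}} \qquad \text{and} \qquad \underbrace{\mathcal{O}(L^{2})}_{\text{attention-matrix memory}},
\]
which quickly becomes prohibitive as the input window and forecasting horizon grow.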
Despite the progress made by Transformer-based time series forecasting methods, in some cases