Time series forecasting is an important research topic in machine learning due to its prevalence in social and scientific applications. The multi-model forecasting paradigm, including model hybridization and model combination, proved more effective than single-model forecasting in the M4 competition. In this study, we hybridize exponential smoothing with a Transformer architecture to capture level and seasonal patterns while exploiting the complex non-linear trend in time series data. We show that our model captures complex trends and seasonal patterns with a moderate improvement over the state-of-the-art results from the M4 competition.
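At a high level, the hybridization works by letting per-series exponential smoothing estimate the level and seasonal components, normalizing the series by them, and letting a Transformer model the remaining non-linear structure before the forecast is re-seasonalized. The sketch below is a minimal illustration of this idea only; the class name, its parameters, and the way the trend multiplier is applied are assumptions for illustration, not the repository's actual ESTransformer implementation.

```python
import torch
import torch.nn as nn


class HybridESTransformerSketch(nn.Module):
    """Toy hybrid: Holt-Winters-style normalization + Transformer encoder."""

    def __init__(self, seasonality: int, d_model: int = 32, nhead: int = 4, num_layers: int = 2):
        super().__init__()
        self.seasonality = seasonality
        # Learnable smoothing coefficients, squashed into (0, 1) by a sigmoid.
        self.level_smoothing = nn.Parameter(torch.tensor(0.0))
        self.seasonal_smoothing = nn.Parameter(torch.tensor(0.0))
        self.input_proj = nn.Linear(1, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.output_proj = nn.Linear(d_model, 1)

    def forward(self, y: torch.Tensor) -> torch.Tensor:
        """y: (batch, time) strictly positive series; returns a one-step-ahead forecast."""
        alpha = torch.sigmoid(self.level_smoothing)
        gamma = torch.sigmoid(self.seasonal_smoothing)
        level = y[:, 0]
        seasonals = [torch.ones_like(level) for _ in range(self.seasonality)]
        normalized = []
        for t in range(y.shape[1]):
            season = seasonals[t % self.seasonality]
            normalized.append(y[:, t] / (level * season))          # strip level and seasonality
            new_level = alpha * y[:, t] / season + (1 - alpha) * level
            seasonals[t % self.seasonality] = gamma * y[:, t] / new_level + (1 - gamma) * season
            level = new_level
        x = torch.stack(normalized, dim=1).unsqueeze(-1)            # (batch, time, 1)
        h = self.encoder(self.input_proj(x))                        # (batch, time, d_model)
        trend = self.output_proj(h[:, -1, :]).squeeze(-1)           # non-linear trend multiplier
        next_season = seasonals[y.shape[1] % self.seasonality]
        return level * next_season * torch.exp(trend)               # re-seasonalize the forecast
```

For example, `HybridESTransformerSketch(seasonality=7)(torch.rand(8, 28) + 1.0)` returns one forecast per series in the batch.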
Below is a list of currently supported components:
- Naive: This benchmark model produces a forecast that is equal to the last observed value for a given time series.
- Seasonal Naive: This benchmark model produces a forecast that is equal to the last observed value of the same season for a given time series.
- Naive2: A popular benchmark model for time series forecasting that automatically adapts to the potential seasonality of a series based on an autocorrelation test. If the series is seasonal, the model composes the Naive and Seasonal Naive predictions; otherwise, it falls back to the simple Naive forecast (see the sketch after this list).
- Exponential smoothing: Uses multiplicative Holt-Winters exponential smoothing to capture the error, trend, and seasonal components.
- Transformer: Uses a time-series Transformer to model the complex non-linear trend.
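For concreteness, the benchmark forecasters above can be sketched as follows. This is a minimal illustration assuming 1-D NumPy input; the autocorrelation threshold and the way Naive2 composes the two naive forecasts are assumptions for illustration, not necessarily the package's exact implementation.

```python
import numpy as np

def naive(y: np.ndarray, h: int) -> np.ndarray:
    """Repeat the last observed value for h future steps."""
    return np.full(h, y[-1])

def seasonal_naive(y: np.ndarray, h: int, m: int) -> np.ndarray:
    """Repeat the last observed value of the same season (period m)."""
    last_season = y[-m:]
    return np.array([last_season[i % m] for i in range(h)])

def is_seasonal(y: np.ndarray, m: int) -> bool:
    """Autocorrelation test at the seasonal lag (assumed ~90% critical value)."""
    if m <= 1 or len(y) <= m:
        return False
    centered = y - y.mean()
    acf_m = np.sum(centered[m:] * centered[:-m]) / np.sum(centered ** 2)
    return abs(acf_m) > 1.645 / np.sqrt(len(y))

def naive2(y: np.ndarray, h: int, m: int) -> np.ndarray:
    """Use seasonal information only when the series tests as seasonal."""
    if is_seasonal(y, m):
        # Illustrative composition: a simple average of the two naive forecasts.
        return 0.5 * (naive(y, h) + seasonal_naive(y, h, m))
    return naive(y, h)
```

The exponential smoothing component corresponds to multiplicative Holt-Winters, for which `statsmodels.tsa.holtwinters.ExponentialSmoothing` (with `trend="mul"` and `seasonal="mul"`) is one off-the-shelf reference implementation.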
To reproduce an experiment, run the following commands:
source shell/add_pwd_to_pythonpath.sh
python examples/main.py --cfg examples/daily.yml
As the project continues to evolve, please direct any questions, feedback, or comments to [email protected].
This research was conducted as part of Sang Truong's internships at Cummins Inc. (Fall 2019, Winter 2020) and Community Health Network Inc. (Fall 2020, Spring 2021), and as independent research at DePauw University under the mentorship of Professor Jeff Gropp (Spring 2018, Fall 2019, Spring 2019, Fall 2020, Spring 2021). We thank Shuto Araki for his collaboration during Spring 2018 on the theory and implementation of the ARIMA and KNN models. We thank Bu Tran for his work on testing the ESTransformer architectures during Spring 2021.
@inproceedings{truong2021hybcast,
  title={Time-series Forecasting},
  author={Sang Truong and Jeffrey Gropp},
  booktitle={},
  year={2021},
  url={}
}