Approaching the error of neural network models applied to time series forecasting
Published 2011-06-15
Keywords
- nonlinear models
- multilayer perceptrons
- ARIMA models
- exponential smoothing
- forecasting
Abstract
Artificial neural networks are an important technique in nonlinear time series forecasting. However, training neural networks is a difficult task because of the presence of many local optima and the irregularity of the error surface. In this context, it is very easy to obtain under-fitted or over-fitted models with no forecasting power. Thus, researchers and practitioners need criteria for detecting this class of problems. In this paper, we demonstrate that well-known methodologies in linear time series forecasting, such as the Box-Jenkins methodology or exponential smoothing models, are valuable tools for detecting badly specified neural network models.
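The benchmarking idea described in the abstract can be sketched in a few lines: fit a multilayer perceptron to a series and compare its out-of-sample error against ARIMA and exponential smoothing benchmarks; if the network cannot beat these simple linear models, it is likely misspecified. The sketch below is not the authors' code, and the model orders, lag count, and network size are illustrative assumptions only.

```python
# Minimal sketch: compare an MLP forecast against linear benchmarks.
# All hyperparameters below are assumed, not taken from the paper.
import numpy as np
from sklearn.neural_network import MLPRegressor
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(size=200)) + 10           # synthetic series
train, test = y[:160], y[160:]

def rmse(actual, forecast):
    return float(np.sqrt(np.mean((actual - forecast) ** 2)))

# Linear benchmarks (orders chosen arbitrarily for the sketch)
arima_fc = ARIMA(train, order=(1, 1, 1)).fit().forecast(len(test))
es_fc = ExponentialSmoothing(train, trend="add").fit().forecast(len(test))

# Multilayer perceptron on lagged values, forecasting recursively
p = 4                                               # number of lags (assumed)
X = np.column_stack([y[i:len(train) - p + i] for i in range(p)])
t = train[p:]
mlp = MLPRegressor(hidden_layer_sizes=(5,), max_iter=5000,
                   random_state=0).fit(X, t)
history = list(train[-p:])
mlp_fc = []
for _ in range(len(test)):
    nxt = mlp.predict(np.array(history[-p:]).reshape(1, -1))[0]
    mlp_fc.append(nxt)
    history.append(nxt)

# A network that does much worse than these simple linear models is a
# warning sign of under- or over-fitting (a badly specified model).
print("RMSE  ARIMA:", rmse(test, arima_fc))
print("RMSE  Exp. smoothing:", rmse(test, es_fc))
print("RMSE  MLP:", rmse(test, np.array(mlp_fc)))
```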