Financial time series forecasting is an important tool for supporting both individual and organizational decisions. Periodic phenomena are common in econometrics, and many models have been built to capture such periodic trends, both to improve forecasts of future events and to guide business and social activities. Real-world systems, however, are characterized by many uncertain fluctuations, which makes prediction difficult; when randomness is mixed with periodicity, prediction becomes even harder. We therefore constructed an ANN time-varying GARCH model with both linear and non-linear attributes, designed specifically for processes with fixed and random periodicity. To eliminate the need to filter the linear component of the time series, we incorporated Artificial Neural Networks (ANN) and constructed a time-varying GARCH model on the ANN disturbances. We then developed an estimation procedure for the ANN time-varying GARCH model parameters using nonparametric techniques.
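To make the structure described above concrete, the sketch below writes out one plausible form of the model: a single-hidden-layer feedforward network for the conditional mean and a GARCH(1,1)-type variance with smoothly time-varying coefficients estimated by kernel-weighted quasi-likelihood. The lag order p, the hidden-layer size q, the GARCH(1,1) orders, the rescaled-time parameterisation t/T, and all symbols are illustrative assumptions, not specifications taken from this abstract.

% Hedged sketch (not the authors' exact specification): ANN conditional mean
% with a time-varying GARCH(1,1) process on the disturbances, rescaled time u = t/T.
\begin{align}
  y_t &= \alpha_0 + \sum_{j=1}^{q} \alpha_j \, g\!\Big(\beta_{0j} + \sum_{i=1}^{p} \beta_{ij}\, y_{t-i}\Big) + \varepsilon_t,
      && \text{ANN mean equation (linear and non-linear terms)} \\
  \varepsilon_t &= \sigma_t z_t, \qquad z_t \sim \mathrm{iid}(0,1),
      && \text{ANN disturbances} \\
  \sigma_t^2 &= \omega\!\big(\tfrac{t}{T}\big) + a\!\big(\tfrac{t}{T}\big)\,\varepsilon_{t-1}^2 + b\!\big(\tfrac{t}{T}\big)\,\sigma_{t-1}^2.
      && \text{time-varying GARCH on the disturbances}
\end{align}
% One standard nonparametric route to the coefficient curves is kernel-weighted
% (local) Gaussian quasi-likelihood at each rescaled time point u, with kernel K
% and bandwidth h; this is an assumed illustration of "nonparametric techniques".
\begin{equation}
  \hat{\theta}(u) = \big(\hat{\omega}(u), \hat{a}(u), \hat{b}(u)\big)
  = \arg\max_{\theta} \sum_{t=2}^{T} K_h\!\Big(\tfrac{t}{T} - u\Big)
    \Big[-\tfrac{1}{2}\log \sigma_t^2(\theta) - \tfrac{\varepsilon_t^2}{2\,\sigma_t^2(\theta)}\Big].
\end{equation}

Here g(·) is the hidden-layer activation; fitting the network first and then estimating the curves ω(·), a(·), b(·) on its residuals mirrors the two-stage ANN-plus-time-varying-GARCH construction described above.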