GARCH Parameter Estimation by Machine Learning
(Volume 5, Issue 8, August 2018) Open Access
Author(s): Tetsuya Takaishi
Abstract:
Estimating the volatility of asset returns is of great importance for risk management in empirical finance, and the GARCH model is often used for this purpose. To use the GARCH model, its parameters must be estimated so that the model matches the underlying return time series; usually this is done by the maximum likelihood method or the Bayesian method. In this study we apply a machine learning technique to the parameter estimation: we minimize a loss function defined by the likelihood function of the GARCH model, with the minimization performed by the Adam optimizer of TensorFlow. We find that machine learning estimates the model parameters correctly. We also investigate the convergence properties of the Adam optimizer and show that the convergence rate increases with the learning rate, up to a certain maximum learning rate; beyond that value, the minimization fails.
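The abstract describes minimizing a likelihood-based loss for a GARCH model with TensorFlow's Adam optimizer. Below is a minimal sketch of that approach for a GARCH(1,1) model under Gaussian innovations. The simulated data, the "true" parameter values, the softplus/sigmoid reparameterization, the learning rate, and the iteration count are all illustrative assumptions, not the paper's actual setup.

```python
import numpy as np
import tensorflow as tf

# Simulate a GARCH(1,1) series: r_t = sigma_t * eps_t,
# sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}.
# The "true" parameters here are illustrative, not the paper's.
def simulate_garch(n, omega=0.1, alpha=0.1, beta=0.8, seed=0):
    rng = np.random.default_rng(seed)
    r = np.zeros(n)
    sigma2 = omega / (1.0 - alpha - beta)  # unconditional variance as start
    for t in range(n):
        r[t] = np.sqrt(sigma2) * rng.standard_normal()
        sigma2 = omega + alpha * r[t] ** 2 + beta * sigma2
    return r.astype(np.float32)

returns = tf.constant(simulate_garch(5000))

# Unconstrained variables; softplus keeps omega > 0 and sigmoids keep
# 0 < alpha, beta < 1 (an assumed reparameterization, for stability).
raw = tf.Variable([-2.0, -2.0, 1.0])

def neg_log_likelihood(raw, r):
    omega = tf.nn.softplus(raw[0])
    alpha = tf.sigmoid(raw[1])
    beta = tf.sigmoid(raw[2])
    sigma2_0 = tf.math.reduce_variance(r)  # start recursion at sample variance
    # Volatility recursion via tf.scan so gradients flow through all steps.
    step = lambda s2_prev, r_prev: omega + alpha * tf.square(r_prev) + beta * s2_prev
    sigma2 = tf.concat([[sigma2_0], tf.scan(step, r[:-1], initializer=sigma2_0)], axis=0)
    # Gaussian negative log-likelihood used as the loss function.
    return 0.5 * tf.reduce_sum(tf.math.log(2.0 * np.pi * sigma2) + tf.square(r) / sigma2)

optimizer = tf.keras.optimizers.Adam(learning_rate=0.05)  # learning rate is a tunable choice
for _ in range(2000):
    with tf.GradientTape() as tape:
        loss = neg_log_likelihood(raw, returns)
    grads = tape.gradient(loss, [raw])
    optimizer.apply_gradients(zip(grads, [raw]))

print("omega = %.4f, alpha = %.4f, beta = %.4f"
      % (tf.nn.softplus(raw[0]).numpy(), tf.sigmoid(raw[1]).numpy(), tf.sigmoid(raw[2]).numpy()))
```

The convergence behavior reported in the abstract can be probed with such a sketch by sweeping learning_rate upward and observing the step at which the loss stops decreasing and begins to diverge.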
DOI: