I'm working on an R project, trying to calibrate a GARCH model (a (1,1) so far) to the returns of the STOXX50 index over the last 2 years.
I've tried the garch function of the tseries package, but it gave me a "false convergence" result. I then tried the rugarch package, with no false convergence so far, but I would like to know whether my model is a good fit for the data.
How can I do that? And how can I interpret the results of the various diagnostic tests (Ljung-Box, etc.)?
Answer
To test for model misspecification:
First, ensure that the autocorrelations of the standardized residuals from the ARMA-GARCH model are not significant. You can use the Ljung-Box test for this: it tests the joint significance of the autocorrelations up to lag $K$ (see the sketch below).
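A minimal sketch of this first check, using rugarch (as in the question). The data here are simulated as a placeholder for the actual STOXX50 return series, and the lag $K = 10$ and the GARCH(1,1)-with-constant-mean specification are illustrative assumptions, not the only reasonable choices.

```r
library(rugarch)

# Placeholder for the STOXX50 log-return series
set.seed(42)
returns <- rnorm(500, mean = 0, sd = 0.01)

# Specify and fit a standard GARCH(1,1) with a constant mean
spec <- ugarchspec(
  variance.model     = list(model = "sGARCH", garchOrder = c(1, 1)),
  mean.model         = list(armaOrder = c(0, 0), include.mean = TRUE),
  distribution.model = "norm"
)
fit <- ugarchfit(spec, data = returns)

# Ljung-Box test on the standardized residuals up to lag K = 10
z <- residuals(fit, standardize = TRUE)
Box.test(as.numeric(z), lag = 10, type = "Ljung-Box")
# A large p-value: no evidence of remaining autocorrelation
```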
The leverage effect is tested with the sign bias test. If the $p$-value is below .05 (the assumed significance level), it indicates the presence of a leverage effect in the data. In that case, try models that capture leverage effects, such as TGARCH or EGARCH (see the sketch below).
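Continuing from the `fit` object in the previous sketch: rugarch exposes the Engle-Ng sign bias test directly via `signbias()`. The EGARCH respecification below is just one illustrative alternative.

```r
# Engle-Ng sign bias test on the fitted model from the sketch above
signbias(fit)

# If the test is significant, respecify with an asymmetric model,
# e.g. an EGARCH(1,1) (gjrGARCH is another common choice)
spec_e <- ugarchspec(
  variance.model = list(model = "eGARCH", garchOrder = c(1, 1)),
  mean.model     = list(armaOrder = c(0, 0), include.mean = TRUE)
)
fit_e <- ugarchfit(spec_e, data = returns)
```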
The chi-squared goodness-of-fit test compares the empirical distribution of the standardized residuals with the theoretical one implied by the chosen density.
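rugarch implements this as the adjusted Pearson goodness-of-fit test via `gof()`. This again reuses `fit` from the first sketch, and the bin counts in `groups` are arbitrary choices.

```r
# Adjusted Pearson chi-squared goodness-of-fit test: bins the
# standardized residuals and compares observed with expected counts
# under the fitted distribution (here the normal)
gof(fit, groups = c(20, 30, 40, 50))
# Small p-values suggest the chosen density is inadequate; a
# heavier-tailed alternative (distribution.model = "std") may fit better
```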
But before fitting a GARCH model at all, check for ARCH effects in your data. If there is no ARCH effect, a GARCH model is not needed in the first place.
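A simple pre-fit check, sketched here with base R's `Box.test()` applied to the squared demeaned returns (Engle's LM test, e.g. `FinTS::ArchTest()`, is an alternative). The lag of 12 is an arbitrary choice, and `returns` is assumed to be the raw return series.

```r
# Test for ARCH effects: autocorrelation in squared returns
r <- returns - mean(returns)            # demean the return series
Box.test(r^2, lag = 12, type = "Ljung-Box")
# A small p-value indicates volatility clustering (ARCH effects),
# which is what justifies a GARCH-type model
```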