# Simulating an ARCH process in R to analyze the stationarity and volatility of a time series

Date: 2022-05-16

As phenomena evolve, they often exhibit complex fluctuations, alternating between calm periods and sharp movements, and volatility clustering frequently appears, something often encountered in risk research. Engle (1982) proposed the autoregressive conditional heteroskedasticity (ARCH) model to describe such time-varying variance. Bollerslev (1986) extended it to the generalized ARCH (GARCH) model, and many special forms have been developed since.

In the context of the AR(1) process $X_t = \phi X_{t-1} + \varepsilon_t$, we spent some time explaining what happens when $\phi$ is close to 1.

• If $|\phi| < 1$, the process is stationary;
• If $\phi = 1$, the process is a random walk;
• If $|\phi| > 1$, the process explodes.
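These three regimes can be illustrated with a short simulation (a minimal sketch; the values of $\phi$, the sample size and the seed are arbitrary choices for illustration):

```r
# Simulate X_t = phi * X_{t-1} + eps_t for three regimes of phi
simulate_ar1 <- function(phi, n) {
  x <- numeric(n)
  eps <- rnorm(n)
  for (t in 2:n) x[t] <- phi * x[t - 1] + eps[t]
  x
}
set.seed(1)
n <- 200
x_stationary <- simulate_ar1(0.5,  n)  # |phi| < 1: mean-reverting
x_randomwalk <- simulate_ar1(1.0,  n)  # phi = 1 : random walk
x_explosive  <- simulate_ar1(1.05, n)  # |phi| > 1: explodes geometrically
```

Plotting the three series side by side makes the qualitative difference obvious: the explosive series quickly dwarfs the other two.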

Similarly, the random walk is a very interesting process with puzzling properties: for example, the process will cross the $x$-axis infinitely many times.

Let us now carefully study the properties of the ARCH(1) process, especially when $\alpha_1 > 1$; the results we get may be puzzling.

Consider an ARCH(1) process $\varepsilon_t = \sigma_t \eta_t$ with $\sigma_t^2 = \alpha_0 + \alpha_1 \varepsilon_{t-1}^2$, with Gaussian noise, i.e. $(\eta_t)$ is an i.i.d. sequence of $\mathcal{N}(0,1)$ variables. Here $\alpha_0$ and $\alpha_1$ must be positive.

Recall that since $\eta_t$ is independent of $\sigma_t$, $\mathbb{E}[\varepsilon_t^2] = \mathbb{E}[\sigma_t^2] = \alpha_0 + \alpha_1 \mathbb{E}[\varepsilon_{t-1}^2]$. Therefore the variance exists, and is constant in time, if and only if $\alpha_1 < 1$, in which case $\operatorname{Var}(\varepsilon_t) = \alpha_0/(1 - \alpha_1)$. In addition, if $3\alpha_1^2 < 1$, the fourth moment is also finite. Now, if we go back to the property obtained when studying the variance: what happens if $\alpha_1 \geq 1$, say $\alpha_1 = 2$?
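The variance formula can be checked by simulation (a sketch; $\alpha_0 = 1$ and $\alpha_1 = 0.5$ are arbitrary values satisfying $\alpha_1 < 1$, so the theoretical variance is $\alpha_0/(1-\alpha_1) = 2$):

```r
# Empirical check of Var(eps_t) = alpha0 / (1 - alpha1) when alpha1 < 1
set.seed(1)
n <- 1e6; alpha0 <- 1; alpha1 <- 0.5
eta <- rnorm(n)
sigma2 <- rep(alpha0 / (1 - alpha1), n)  # start at the stationary variance
eps <- numeric(n); eps[1] <- eta[1] * sqrt(sigma2[1])
for (t in 2:n) {
  sigma2[t] <- alpha0 + alpha1 * eps[t - 1]^2
  eps[t] <- eta[t] * sqrt(sigma2[t])
}
mean(eps^2)  # should be close to 2
```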

If we look at simulations, we can generate an ARCH(1) process, for example:

```
> n=1000; alpha0=0.2; alpha1=2
> eta=rnorm(n)
> epsilon=rnorm(n)
> sigma2=rep(alpha0,n)
> for(t in 2:n){
+   sigma2[t]=alpha0+alpha1*epsilon[t-1]^2
+   epsilon[t]=eta[t]*sqrt(sigma2[t])
+ }
> plot(epsilon,type="l")
```

In order to understand what is happening, recall that $\sigma_t^2$ must be positive, since it is a (conditional) second moment, and that it satisfies $\sigma_t^2 = \alpha_0 + \alpha_1 \eta_{t-1}^2 \sigma_{t-1}^2$. However, there may exist a stationary process with infinite variance. Iterating this recursion over and over again yields $$\sigma_t^2 = \alpha_0 \sum_{k=0}^{\infty} \alpha_1^k \,\eta_{t-1}^2 \cdots \eta_{t-k}^2.$$ Here we have a sum of positive terms, and we can use the so-called Cauchy root rule: defining $a_k = \alpha_1^k \eta_{t-1}^2 \cdots \eta_{t-k}^2$, the series $\sum_k a_k$ converges if $\lim_k a_k^{1/k} < 1$. Here, $a_k^{1/k}$ can also be written $$a_k^{1/k} = \alpha_1 \exp\!\left(\frac{1}{k}\sum_{j=1}^{k} \log \eta_{t-j}^2\right),$$ and according to the law of large numbers, since we have an average of independent and identically distributed terms, $$\frac{1}{k}\sum_{j=1}^{k} \log \eta_{t-j}^2 \longrightarrow \mathbb{E}\left[\log \eta^2\right].$$ Therefore, if $\alpha_1 \exp(\mathbb{E}[\log \eta^2]) < 1$, the partial sums converge and $\sigma_t^2$ is finite almost surely.

The above condition can be written $\gamma = \mathbb{E}[\log(\alpha_1 \eta_t^2)] < 0$, where $\gamma$ is called the Lyapunov coefficient.
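The Lyapunov coefficient $\gamma = \log \alpha_1 + \mathbb{E}[\log \eta_t^2]$ can be estimated by Monte Carlo (a sketch; $\alpha_1 = 2$ and $\alpha_1 = 4$ are arbitrary values on either side of the bound discussed below):

```r
# Monte Carlo estimate of E[log(eta^2)] for Gaussian eta
# (the exact value is -(log 2 + Euler's gamma), about -1.27)
set.seed(1)
m <- mean(log(rnorm(1e6)^2))
gamma_lyap <- function(alpha1) log(alpha1) + m
gamma_lyap(2)  # negative: strictly stationary (despite infinite variance)
gamma_lyap(4)  # positive: sigma_t^2 diverges almost surely
```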

The equation $\gamma < 0$ is a condition on $\alpha_1$, namely $\alpha_1 < \exp(-\mathbb{E}[\log \eta_t^2])$.

In the Gaussian case, the value of this upper bound is $2e^{\gamma_E} \approx 3.56$, where $\gamma_E$ is the Euler–Mascheroni constant.

``> 1/exp(mean(log(rnorm(1e7)^2)))``

In the case $1 < \alpha_1 < 3.56$, the variance may be infinite, but the sequence is stationary. On the other hand, if $\alpha_1 > 3.56$, then $\varepsilon_t^2$ almost surely goes to infinity, because $\sigma_t^2$ goes to infinity.
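The divergence above the bound can be seen by tracking $\log \sigma_t^2$, which drifts upward at rate $\gamma$ per step (a sketch; $\alpha_1 = 4$ is an arbitrary value above 3.56, giving $\gamma \approx 0.12$):

```r
# When gamma > 0, log(sigma_t^2) grows roughly linearly in t
set.seed(1)
n <- 2000; alpha0 <- 0.2; alpha1 <- 4
eta <- rnorm(n)
sigma2 <- rep(alpha0, n)
eps <- numeric(n); eps[1] <- eta[1] * sqrt(alpha0)
for (t in 2:n) {
  sigma2[t] <- alpha0 + alpha1 * eps[t - 1]^2
  eps[t] <- eta[t] * sqrt(sigma2[t])
}
plot(log(sigma2), type = "l")  # upward drift, slope about gamma
```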

But in order to observe this difference, we need a lot of observations. For example, simulating series with $\alpha_1$ on either side of this bound, we can easily see the difference. I am not saying that it is easy to see that the distribution above has infinite variance, but it is nevertheless the case.

In fact, if we consider the Hill plot of the above series, on the positive tail of $\varepsilon_t$,

```
> library(evir)
> hill(epsilon)
```

or on the negative tail,

```
> hill(-epsilon)
```

we can see that the tail index is (strictly) less than 2, which means that the second moment does not exist.
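The `hill` function used above comes from the `evir` package; the estimator itself is simple enough to sketch in base R (an assumed minimal implementation, shown here on hypothetical Pareto-distributed data with true tail index 2 rather than on the ARCH series):

```r
# Hill estimator of the tail index, based on the k largest observations:
# alpha_hat = 1 / mean( log(X_(i) / X_(k+1)), i = 1..k )
hill_estimate <- function(x, k) {
  xs <- sort(x[x > 0], decreasing = TRUE)  # upper order statistics
  1 / mean(log(xs[1:k] / xs[k + 1]))
}
set.seed(1)
x <- runif(1e5)^(-1/2)      # Pareto sample: P(X > t) = t^-2, tail index 2
hill_estimate(x, k = 1000)  # should be close to 2
```

A Hill plot simply draws this estimate against a range of values of $k$; a tail index below 2 indicates an infinite second moment.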

Why is it puzzling? Perhaps because $\varepsilon_t$ here is not weakly stationary (in the $L^2$ sense), but it is strictly stationary. This is not the usual relationship between strong and weak properties, which may be why we speak of strict stationarity rather than strong stationarity.
