Abstract
We analyze the properties of the Maximum Likelihood (ML) estimator when the underlying log-likelihood function is numerically maximized with the so-called zig-zag algorithm. By splitting the parameter vector into sub-vectors, the algorithm alternately maximizes the log-likelihood function with respect to one sub-vector while keeping the others fixed. When the algorithm is initialized with a consistent estimator and iterated sufficiently often, we establish the asymptotic equivalence of the zig-zag estimator and the “infeasible” ML estimator that it numerically approximates. This result gives guidance for practical implementations. We illustrate how to employ the algorithm in different estimation problems, such as a vine copula model and a vector autoregressive moving average model. The accuracy of the estimator is illustrated through simulations. Finally, we demonstrate the usefulness of our results in an application in which the Bitcoin heating of 2017 is analyzed with a dynamic conditional correlation model.
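To make the block-wise maximization idea concrete, the following is a minimal sketch of a zig-zag (block coordinate-wise) maximizer. The toy Gaussian log-likelihood, the two-block split, and the function names are illustrative assumptions and not the models or implementation used in the paper.

```python
# Minimal sketch: alternating maximization of a log-likelihood over
# parameter sub-vectors, starting from a preliminary (consistent) estimate.
import numpy as np
from scipy.optimize import minimize

def zigzag_mle(neg_loglik, theta0, blocks, sweeps=5):
    """Maximize a log-likelihood by cycling over parameter sub-vectors.

    neg_loglik : callable on the full parameter vector, returns -loglik
    theta0     : starting value (ideally a consistent preliminary estimator)
    blocks     : list of index arrays splitting the parameter vector
    sweeps     : number of full passes over all blocks
    """
    theta = np.asarray(theta0, dtype=float).copy()
    for _ in range(sweeps):
        for idx in blocks:
            # Optimize only the current sub-vector; the others stay fixed.
            def partial_obj(sub, idx=idx):
                full = theta.copy()
                full[idx] = sub
                return neg_loglik(full)
            res = minimize(partial_obj, theta[idx], method="BFGS")
            theta[idx] = res.x
    return theta

# Toy example: i.i.d. N(mu, sigma^2) data, blocks {mu} and {log sigma}.
rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=2.0, size=500)

def neg_loglik(theta):
    mu, log_sigma = theta
    sigma2 = np.exp(2.0 * log_sigma)
    return 0.5 * np.sum((x - mu) ** 2) / sigma2 + x.size * log_sigma

theta_hat = zigzag_mle(neg_loglik,
                       theta0=[0.0, 0.0],  # deliberately crude start
                       blocks=[np.array([0]), np.array([1])])
print(theta_hat)  # close to the full ML estimates (sample mean, log sample sd)
```

In this sketch each block update is itself a numerical optimization; in models where a sub-vector has a closed-form conditional maximizer, that update can replace the inner `minimize` call.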