Alternating Least Squares in Generalized Linear Models
- Authors: Minasyan A.G.
- Affiliations:
- Yerevan State University
- Issue: Vol 54, No 5 (2019)
- Pages: 302-312
- Section: Probability Theory and Mathematical Statistics
- URL: https://journals.rcsi.science/1068-3623/article/view/228379
- DOI: https://doi.org/10.3103/S1068362319050078
- ID: 228379
Abstract
We derive a convergence result for the sequential procedure known as alternating maximization (minimization) to the maximum likelihood estimator for a broad family of models, namely Generalized Linear Models. For linear regression, the alternating procedure reduces to the well-known Alternating Least Squares algorithm, owing to the quadraticity of the log-likelihood function L(υ). In the Generalized Linear Models framework we lose the quadraticity of L(υ), but concavity is retained because the error distribution belongs to the exponential family. A concentration property makes the second-order Taylor approximation of L(υ) accurate and enables the use of the alternating minimization (maximization) technique. Examples and experiments confirm the convergence result, followed by a discussion of the importance of the initial guess.
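The alternating scheme the abstract describes can be illustrated with a minimal sketch of Alternating Least Squares for a low-rank factorization, the classical quadratic case in which each block update is an exact least-squares solve. This is an illustrative example, not the paper's own implementation; the function name and setup are assumptions.

```python
import numpy as np

def als_low_rank(Y, rank=2, n_iters=50, seed=0):
    """Alternating Least Squares: fit Y ~ U @ V.T by alternately
    solving the least-squares subproblem for U with V fixed, then
    for V with U fixed. Each subproblem is quadratic, so it is
    solved exactly in closed form."""
    rng = np.random.default_rng(seed)
    n, m = Y.shape
    U = rng.standard_normal((n, rank))
    V = rng.standard_normal((m, rank))
    for _ in range(n_iters):
        # Fix V: Y^T ~ V U^T, so solve for U^T and transpose.
        U = np.linalg.lstsq(V, Y.T, rcond=None)[0].T
        # Fix U: Y ~ U V^T, so solve for V^T and transpose.
        V = np.linalg.lstsq(U, Y, rcond=None)[0].T
    return U, V

# Example: recover an exactly rank-2 matrix (initial guess is random,
# which for this quadratic problem is typically sufficient).
rng = np.random.default_rng(1)
Y = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 15))
U, V = als_low_rank(Y, rank=2)
err = np.linalg.norm(Y - U @ V.T) / np.linalg.norm(Y)
print(err)
```

In the GLM setting described above, the quadratic subproblems would be replaced by concave log-likelihood maximizations over each block, with the second-order Taylor approximation justifying the same alternating structure.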
About the authors
A. G. Minasyan
Yerevan State University
Author for correspondence.
Email: arsh.minasyan@gmail.com
Armenia, Yerevan