
Berndt–Hall–Hall–Hausman algorithm

The Berndt–Hall–Hall–Hausman (BHHH) algorithm is a numerical optimization algorithm similar to the Newton–Raphson algorithm, but it replaces the observed negative Hessian matrix with the outer product of the gradient. This approximation is based on the information matrix equality and is therefore valid only when maximizing a likelihood function.[1] The BHHH algorithm is named after its four originators: Ernst R. Berndt, Bronwyn Hall, Robert Hall, and Jerry Hausman.[2]

Usage

If a nonlinear model is fitted to the data, one often needs to estimate its coefficients through optimization. A number of optimization algorithms have the following general structure. Suppose that the function to be optimized is Q(β). The algorithms are then iterative, defining a sequence of approximations βk given by

\[ \beta_{k+1} = \beta_k + \lambda_k A_k \left. \frac{\partial Q}{\partial \beta} \right|_{\beta_k} , \]

where \( \beta_k \) is the parameter estimate at step k, and \( \lambda_k \) is a parameter (called step size) which partly determines the particular algorithm. For the BHHH algorithm, \( \lambda_k \) is determined by calculations within a given iterative step, involving a line search until a point \( \beta_{k+1} \) is found satisfying certain criteria. In addition, for the BHHH algorithm, Q has the form

\[ Q = \sum_{i=1}^{N} Q_i \]

and \( A_k \) is calculated using

\[ A_k = \left[ \sum_{i=1}^{N} \left. \frac{\partial \ln Q_i}{\partial \beta} \right|_{\beta_k} \left( \left. \frac{\partial \ln Q_i}{\partial \beta} \right|_{\beta_k} \right)^{\top} \right]^{-1} . \]

In other cases, e.g. Newton–Raphson, \( A_k \) can have other forms. The BHHH algorithm has the advantage that, if certain conditions apply, convergence of the iterative procedure is guaranteed.[citation needed]
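As an illustrative sketch (not from the source), the update above can be applied to a one-parameter problem. Assume, hypothetically, an exponential likelihood with rate β, so each per-observation log-likelihood is Q_i(β) = ln β − βx_i with score 1/β − x_i; the matrix A_k is then the inverse of the sum of squared scores. For simplicity this sketch uses a fixed step size λ_k = 1 in place of the line search the article describes.

```python
import numpy as np

# BHHH sketch for ML estimation of an exponential rate parameter.
# Per-observation log-likelihood: Q_i(beta) = ln(beta) - beta * x_i
# Per-observation score:          s_i(beta) = 1/beta - x_i
# (Hypothetical example data; the true rate here is 0.5.)

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=500)

def scores(beta, x):
    """Return the N x 1 matrix of per-observation scores."""
    return (1.0 / beta - x).reshape(-1, 1)

beta = 1.0   # starting value
step = 1.0   # fixed step size; a full implementation would line-search here
for _ in range(100):
    s = scores(beta, x)
    A = np.linalg.inv(s.T @ s)            # inverse outer product of gradients
    direction = (A @ s.sum(axis=0)).item()  # A_k times the total gradient
    beta_new = beta + step * direction
    if abs(beta_new - beta) < 1e-10:
        beta = beta_new
        break
    beta = beta_new

# The exact MLE for the exponential rate is 1/mean(x); the BHHH
# iterations should converge to it.
```

Because the outer product of the per-observation scores approximates the information matrix near the maximum, the iteration behaves much like Newton–Raphson there without requiring second derivatives.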

See also

  • Davidon–Fletcher–Powell (DFP) algorithm
  • Broyden–Fletcher–Goldfarb–Shanno (BFGS) algorithm

References

  1. ^ Henningsen, A.; Toomet, O. (2011). "maxLik: A package for maximum likelihood estimation in R". Computational Statistics. 26 (3): 443–458 [p. 450]. doi:10.1007/s00180-010-0217-1.
  2. ^ Berndt, E.; Hall, B.; Hall, R.; Hausman, J. (1974). "Estimation and Inference in Nonlinear Structural Models" (PDF). Annals of Economic and Social Measurement. 3 (4): 653–665.

Further reading

  • V. Martin, S. Hurn, and D. Harris, Econometric Modelling with Time Series, Chapter 3 'Numerical Estimation Methods'. Cambridge University Press, 2015.
  • Amemiya, Takeshi (1985). Advanced Econometrics. Cambridge: Harvard University Press. pp. 137–138. ISBN 0-674-00560-0.
  • Gill, P.; Murray, W.; Wright, M. (1981). Practical Optimization. London: Harcourt Brace.
  • Gourieroux, Christian; Monfort, Alain (1995). "Gradient Methods and ML Estimation". Statistics and Econometric Models. New York: Cambridge University Press. pp. 452–458. ISBN 0-521-40551-3.
  • Harvey, A. C. (1990). The Econometric Analysis of Time Series (Second ed.). Cambridge: MIT Press. pp. 137–138. ISBN 0-262-08189-X.
