Wednesday, October 12, 2011

QR-decomposition and least squares

http://www.itl.nist.gov/div898/handbook/pmd/section1/pmd141.htm

Least Squares Process

Linear least squares regression is by far the most widely used modeling method. It is what most people mean when they say they have used "regression", "linear regression" or "least squares" to fit a model to their data.

Linear least squares regression also gets its name from the way the estimates of the unknown parameters are computed: they are the values that minimize the sum of squared deviations between the data and the model. The "linear" refers to linearity in the parameters, not in x. For example, the polynomial model

f(x;\vec{\beta}) = \beta_0 + \beta_1x + \beta_{11}x^2

is linear in the parameters \beta_0, \beta_1, \beta_{11} even though it is quadratic in x.
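
As a minimal sketch in R (with simulated data invented purely for illustration), this quadratic model can be fit with lm(); because it is linear in the betas, ordinary least squares applies directly:

# simulated data for illustration only
set.seed(1)
x <- seq(0, 10, length.out = 50)
y <- 2 + 0.5 * x - 0.3 * x^2 + rnorm(50)

# fit f(x) = beta0 + beta1*x + beta11*x^2 by least squares;
# I(x^2) makes lm() treat x^2 as a second regressor
fit <- lm(y ~ x + I(x^2))
coef(fit)  # estimates of beta0, beta1, beta11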

Least Squares Problem Given an inconsistent system of equations, Ax = b, we want to find a vector \bar{x} from \mathbb{R}^m so that the error \|b - A\bar{x}\| is the smallest possible error. The vector \bar{x} is called the least squares solution.
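
As a concrete sketch (A and b chosen arbitrarily for illustration), an overdetermined system usually has no exact solution, and the least squares solution can be computed from the normal equations t(A) A x = t(A) b:

# a 4 x 2 inconsistent system: more equations than unknowns
A <- matrix(c(1, 1,
              1, 2,
              1, 3,
              1, 4), nrow = 4, byrow = TRUE)
b <- c(1, 2, 2, 5)

# least squares solution of the normal equations
xbar <- solve(crossprod(A), crossprod(A, b))  # crossprod(A) is t(A) %*% A

# the residual norm ||b - A xbar|| is the smallest achievable
sqrt(sum((b - A %*% xbar)^2))

Forming t(A) A explicitly can be numerically ill-conditioned, which is one reason the QR-based approach below is preferred in practice.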

---------

See lm() and lsfit() in R for least squares fitting procedures.
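
A quick check that the two agree (toy data invented for illustration):

x <- c(1, 2, 3, 4)
y <- c(1, 2, 2, 5)

coef(lm(y ~ x))           # intercept and slope via lm()
lsfit(x, y)$coefficients  # the same least squares fit via lsfit()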

---------

http://tutorial.math.lamar.edu/Classes/LinAlg/QRDecomposition.aspx

QR-Decomposition (see qr() in R)

There is a nice application of the QR-Decomposition to the Least Squares Process.

Theorem 1 Suppose that A is an n x m matrix with linearly independent columns. Then A can be factored as

A = QR

where Q is an n x m matrix with orthonormal columns and R is an invertible m x m upper triangular matrix.

Theorem 3 Suppose that A has linearly independent columns. Then the normal system associated with Ax = b can be written as

Rx = Q^T b

This follows because substituting A = QR into the normal equations A^T A x = A^T b gives R^T Q^T Q R x = R^T Q^T b; since Q has orthonormal columns, Q^T Q = I, and since R is invertible, multiplying both sides by the inverse of R^T leaves Rx = Q^T b.
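
Both theorems can be sketched in R (reusing an arbitrary A and b; qr.Q() and qr.R() recover the factors, and backsolve() exploits the triangular structure of R):

A <- matrix(c(1, 1,
              1, 2,
              1, 3,
              1, 4), nrow = 4, byrow = TRUE)
b <- c(1, 2, 2, 5)

d <- qr(A)    # QR-decomposition of A
Q <- qr.Q(d)  # n x m matrix with orthonormal columns
R <- qr.R(d)  # invertible m x m upper triangular matrix

all.equal(A, Q %*% R)  # Theorem 1: A = QR

# Theorem 3: solve Rx = Q^T b by back substitution
xbar <- backsolve(R, crossprod(Q, b))
xbar  # matches qr.coef(d, b) and the normal-equations solution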
