Least Squares

Curve fitting, least squares, optimization

Consider a series of N given data points in (d + 1)-dimensional space, (x_1, y_1), (x_2, y_2), \ldots, (x_N, y_N), where x_i := (x_i^1, x_i^2, \ldots, x_i^d) is a vector, for all i = 1, 2, \ldots, N. Define a function f : \mathbb{R}^{k + d} \to \mathbb{R} through

    f(\alpha_1, \alpha_2, \ldots, \alpha_k; x) = \sum_{j = 1}^{k} \psi_j(\alpha_j) \, \theta_j(x),

where \psi_j : \mathbb{R} \to \mathbb{R} and \theta_j : \mathbb{R}^d \to \mathbb{R} are given functions, for all j = 1, 2, \ldots, k. Also, define the function S_f : \mathbb{R}^k \to \mathbb{R} as the sum of squared residuals:

    S_f(\alpha_1, \alpha_2, \ldots, \alpha_k) = \sum_{i = 1}^{N} \left( y_i - f(\alpha_1, \alpha_2, \ldots, \alpha_k; x_i) \right)^2

The method of least squares fitting refers to finding the parameters (\alpha_1^*, \alpha_2^*, \ldots, \alpha_k^*) which are the solution to the following unconstrained optimization problem:

    (\alpha_1^*, \alpha_2^*, \ldots, \alpha_k^*) = \operatorname*{arg\,min}_{(\alpha_1, \alpha_2, \ldots, \alpha_k) \in \mathbb{R}^k} S_f(\alpha_1, \alpha_2, \ldots, \alpha_k)

After solving this problem, the function f^* : \mathbb{R}^d \to \mathbb{R} which provides the best fit, in the least-squares sense, is given by:

    f^*(x) = \sum_{j = 1}^{k} \psi_j(\alpha_j^*) \, \theta_j(x)

This general framework allows us to classify the different types of regression, as follows:

  • When d = 1, the method is known as univariate regression, while if d > 1 we have multivariate regression.
  • If all the functions \psi_1, \psi_2, \ldots, \psi_k are linear, the method is called linear regression; otherwise it is known as nonlinear regression.
  • Also, based on the type of the functions \theta_1, \theta_2, \ldots, \theta_k, we may have polynomial regression, regression by orthogonal polynomials, and so on.
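
As a small illustration of the framework above, the following sketch instantiates f and S_f for univariate linear regression with monomial basis functions, i.e. \psi_j(\alpha_j) = \alpha_j (identity) and \theta_j(x) = x^{j - 1}. The data points are hypothetical, chosen to lie exactly on y = 1 + 2x:

```python
# Sketch: instantiating the general least-squares framework for the
# univariate (d = 1), linear case with psi_j(a) = a and theta_j(x) = x**(j-1).

def f(alpha, x):
    """f(alpha; x) = sum_j psi_j(alpha_j) * theta_j(x), with psi_j = identity."""
    return sum(a * x**j for j, a in enumerate(alpha))

def S_f(alpha, xs, ys):
    """Sum of squared residuals S_f(alpha) = sum_i (y_i - f(alpha; x_i))**2."""
    return sum((y - f(alpha, x))**2 for x, y in zip(xs, ys))

# Toy data lying exactly on y = 1 + 2x, so alpha = (1, 2) makes S_f vanish.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]

print(S_f([1.0, 2.0], xs, ys))  # 0.0 at the exact parameters
print(S_f([0.0, 2.0], xs, ys))  # 4.0 for a wrong intercept
```

Any other parameter tuple yields a strictly positive S_f, which is exactly what the optimization problem below exploits.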

Solving The Problem

Provided that the sum-of-squared-residuals function S_f is convex on its entire domain (as is the case in linear regression), a necessary and sufficient condition for a tuple of parameters (\alpha_1^*, \alpha_2^*, \ldots, \alpha_k^*) to be a solution to the above optimization problem is that

    \nabla S_f(\alpha_1^*, \alpha_2^*, \ldots, \alpha_k^*) = 0,

In the general nonlinear case this stationarity condition remains necessary, though not always sufficient.

The condition can also be written as the system of (possibly nonlinear) equations:

    \frac{\partial S_f}{\partial \alpha_j}(\alpha_1^*, \alpha_2^*, \ldots, \alpha_k^*) = 0, \quad \text{for all } j = 1, 2, \ldots, k.

Let us fix some value of j \in \{1, 2, \ldots, k\} and calculate \displaystyle \frac{\partial S_f}{\partial \alpha_j}. By the chain rule, we have:

    \frac{\partial S_f}{\partial \alpha_j}(\alpha_1, \alpha_2, \ldots, \alpha_k) = -2 \sum_{i = 1}^{N} \left( y_i - \sum_{l = 1}^{k} \psi_l(\alpha_l) \, \theta_l(x_i) \right) \psi_j'(\alpha_j) \, \theta_j(x_i)

Therefore, the system of equations whose solution is the tuple of optimal parameters (\alpha_1^*, \alpha_2^*, \ldots, \alpha_k^*) can explicitly be written as:

    \sum_{i = 1}^{N} \left( y_i - \sum_{l = 1}^{k} \psi_l(\alpha_l^*) \, \theta_l(x_i) \right) \psi_j'(\alpha_j^*) \, \theta_j(x_i) = 0, \quad \text{for all } j = 1, 2, \ldots, k.

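
In the linear case, where every \psi_j is the identity and hence \psi_j' = 1, this system is linear in the parameters: it reduces to the normal equations (A^T A) \alpha = A^T y with the design matrix A_{ij} = \theta_j(x_i). The following sketch, assuming NumPy is available and using hypothetical data, solves it directly:

```python
import numpy as np

# Sketch: with psi_j(a) = a (linear regression), psi_j' = 1 and the system
# above becomes the linear normal equations (A^T A) alpha = A^T y,
# where A[i, j] = theta_j(x_i) is the design matrix.

def fit_linear_least_squares(thetas, xs, ys):
    """Solve the normal equations for the basis functions in `thetas`."""
    A = np.array([[theta(x) for theta in thetas] for x in xs])
    y = np.asarray(ys)
    return np.linalg.solve(A.T @ A, A.T @ y)

# Hypothetical example: fit y ~ alpha_1 + alpha_2 * x to four noisy points.
thetas = [lambda x: 1.0, lambda x: x]
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.1, 2.9, 5.1, 6.9]

alpha = fit_linear_least_squares(thetas, xs, ys)
print(alpha)  # approximately [1.06, 1.96]
```

For ill-conditioned design matrices, a QR or SVD based solver is numerically preferable to forming A^T A explicitly, but the normal equations mirror the derivation above most directly.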
In the case of linear regression in small dimensions, this system can be solved directly using linear algebra, since it is linear in the parameters. In general, however, numerical root-finding or optimization techniques must be used to find the optimal parameters.
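
As a sketch of one such numerical technique (plain gradient descent on S_f, using the gradient derived above; Newton's method on the stationarity system would also work), consider the hypothetical nonlinear model \psi_1(a) = e^a, \theta_1(x) = 1 and \psi_2(a) = a, \theta_2(x) = x, i.e. f(\alpha; x) = e^{\alpha_1} + \alpha_2 x:

```python
import math

# Sketch: gradient descent on S_f for a nonlinear psi_1, using the derived
# formula dS_f/da_j = -2 * sum_i (y_i - f(alpha; x_i)) * psi_j'(a_j) * theta_j(x_i).
# Model (hypothetical): f(alpha; x) = exp(alpha_1) + alpha_2 * x.

def grad_S(alpha, xs, ys):
    a1, a2 = alpha
    g1 = g2 = 0.0
    for x, y in zip(xs, ys):
        r = y - (math.exp(a1) + a2 * x)  # residual y_i - f(alpha; x_i)
        g1 += -2.0 * r * math.exp(a1)    # psi_1'(a1) * theta_1(x) = e^a1 * 1
        g2 += -2.0 * r * x               # psi_2'(a2) * theta_2(x) = 1 * x
    return g1, g2

# Data generated from alpha = (0, 2), i.e. y = exp(0) + 2x = 1 + 2x.
xs = [0.0, 0.5, 1.0, 1.5]
ys = [1.0, 2.0, 3.0, 4.0]

alpha = [0.0, 1.0]                       # initial guess
for _ in range(10000):
    g1, g2 = grad_S(alpha, xs, ys)
    alpha[0] -= 0.01 * g1                # fixed step size, chosen for this toy
    alpha[1] -= 0.01 * g2                # problem; not tuned automatically

print(alpha)  # approaches (0, 2)
```

A fixed step size suffices here only because the toy problem is well conditioned; production code would use a line search or a dedicated nonlinear least-squares solver.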
Lucian Bentea (September 2008)