
Univariate Regression


Overview

Univariate regression is an area of curve fitting which, given a function \inline f depending on some parameters, finds the parameter values for which \inline f provides the best fit, in a certain sense, to a series of two-dimensional data points. It is called univariate because the data points are assumed to be sampled from a function of one variable. Compare this to multivariate regression, which aims at fitting data points sampled from a function of several variables.

Formally speaking, consider a series of \inline N data points \inline (x_1, y_1), (x_2, y_2), \ldots, (x_N, y_N) and, for the sake of simplicity, assume that \inline x_1 < x_2 < \ldots < x_N, i.e. the points are distinct and in increasing order with respect to \inline x. By doing least squares fitting on these data points we mean finding the parameters \inline \alpha_1, \alpha_2, \ldots, \alpha_k of a function \inline f_{\displaystyle \alpha_1, \alpha_2, \ldots, \alpha_k} : [x_1, x_N] \to \mathbb{R} such that the sum of squared residuals

S(\alpha_1, \alpha_2, \ldots, \alpha_k) = \sum_{i=1}^{N} \left( y_i - f_{\displaystyle \alpha_1, \alpha_2, \ldots, \alpha_k}(x_i) \right)^2
is minimized. Provided that \inline f depends linearly on its parameters, the method is called linear regression; otherwise it is called nonlinear regression. For example, straight line regression, parabolic regression and polynomial regression are all linear regression models, since the function \inline f is of the form

f(x) = \alpha_0 + \alpha_1 x + \alpha_2 x^2 + \ldots + \alpha_k x^k
which clearly depends linearly on its parameters. As opposed to this, logistic regression, for example, is a nonlinear regression model, since the fitting function is of the form

f(x) = \frac{1}{1 + e^{-\alpha_1 - \alpha_2 x}}
which is a nonlinear function of \inline \alpha_1 and \inline \alpha_2.
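
To make this concrete, here is a minimal Python sketch (the data, function names, and parameter values are illustrative, not taken from the library) that evaluates the sum of squared residuals for a candidate parameter vector under a linear and a nonlinear model; regression then amounts to minimizing this quantity over the parameters.

```python
import numpy as np

def sum_squared_residuals(f, params, x, y):
    """S(alpha) = sum_i (y_i - f(x_i; alpha))^2, the quantity regression minimizes."""
    residuals = y - f(x, *params)
    return float(np.sum(residuals ** 2))

def line(x, a0, a1):
    # Linear model: f depends linearly on a0 and a1.
    return a0 + a1 * x

def logistic(x, a1, a2):
    # Nonlinear model: f depends nonlinearly on a1 and a2.
    return 1.0 / (1.0 + np.exp(-a1 - a2 * x))

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([0.9, 2.1, 2.9, 4.2])
print(sum_squared_residuals(line, (1.0, 1.0), x, y))      # line with a0 = 1, a1 = 1
print(sum_squared_residuals(logistic, (0.0, 1.0), x, y))  # logistic with a1 = 0, a2 = 1
```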

In the following, let us consider each case and briefly explain how the corresponding optimal parameters can be derived.

Polynomial Regression

One linear regression method is polynomial regression, which refers to finding the polynomial of a given degree that provides the least squares fit to a series of data points. More precisely, if the polynomial function \inline f of degree \inline k is given by

f(x) = \alpha_0 + \alpha_1 x + \alpha_2 x^2 + \ldots + \alpha_k x^k
then the optimal parameters \inline \alpha_0, \alpha_1, \ldots, \alpha_k can be found by solving the following system of linear equations (the normal equations, obtained by setting the partial derivative of the sum of squared residuals with respect to each parameter to zero):

\sum_{j=0}^{k} \left( \sum_{i=1}^{N} x_i^{\,j+m} \right) \alpha_j = \sum_{i=1}^{N} y_i\, x_i^{\,m}, \qquad m = 0, 1, \ldots, k
For \inline k = 1, the function \inline f only depends on two parameters,

f(x) = \alpha_0 + \alpha_1 x
and, in this case, the method is called straight line regression. The previous linear system becomes

\begin{align*} N\,\alpha_0 + \left( \sum_{i=1}^{N} x_i \right) \alpha_1 &= \sum_{i=1}^{N} y_i \\ \left( \sum_{i=1}^{N} x_i \right) \alpha_0 + \left( \sum_{i=1}^{N} x_i^2 \right) \alpha_1 &= \sum_{i=1}^{N} x_i\, y_i \end{align*}
which gives the optimal parameters as its solution:

\alpha_1 = \frac{\displaystyle \sum_{i=1}^{N} (x_i - \overline{x})(y_i - \overline{y})}{\displaystyle \sum_{i=1}^{N} (x_i - \overline{x})^2}, \qquad \alpha_0 = \overline{y} - \alpha_1\, \overline{x}

where \inline \overline{x} and \inline \overline{y} are the averages

\overline{x} = \frac{1}{N} \sum_{i=1}^{N} x_i, \qquad \overline{y} = \frac{1}{N} \sum_{i=1}^{N} y_i
For example, consider the set of 7 points:

Then the optimal values of the parameters are

\alpha_0 = 3.46212, \qquad \alpha_1 = 0.904273
which gives the fitting line \inline f(x) = 3.46212 + 0.904273\ x. The following graph shows the data points and the regression line.

[Figure reglinear-378.png: the data points and the fitted regression line]

Straight line regression is also implemented as the component Regression/Linear.
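
Regression/Linear is the library component; purely as an illustration, the closed-form solution above can also be transcribed directly into Python (the data here are made up):

```python
import numpy as np

def straight_line_regression(x, y):
    """Return (alpha0, alpha1) from the closed-form solution above."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    x_bar, y_bar = x.mean(), y.mean()
    alpha1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
    alpha0 = y_bar - alpha1 * x_bar
    return alpha0, alpha1

alpha0, alpha1 = straight_line_regression([0, 1, 2, 3], [0.9, 2.1, 2.9, 4.2])
print(f"f(x) = {alpha0:.4f} + {alpha1:.4f} x")
```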

The optimal parameters for the case \inline k = 2 can be found in a similar manner. Instead of going into further technical details, let us look at the following graph which shows the quadratic regression polynomial in red for the series of data points shown in blue, sampled from a sine-like function.

[Figure parabolic-378.png: the quadratic regression polynomial (red) for data points (blue) sampled from a sine-like function]

Notice that the fitting curve (necessarily a parabola, since the degree is \inline k = 2) does not follow the sine-like data closely. This is because we may have chosen the wrong family of functions to provide the fit (polynomials), or perhaps because the degree of the regression polynomial is too small. Parabolic regression is available as Regression/Parabolic.

For the general case of regression polynomials of arbitrary degree, the component Regression/Discrete is available.
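
For illustration, the normal equations from the previous subsection can be assembled and solved directly for an arbitrary degree \inline k; the following Python sketch (names and data are ours, not the component's API) does exactly that. Note that this system becomes ill-conditioned as \inline k grows, which is one motivation for the orthogonal polynomial methods described next; production code would typically use a numerically stabler routine such as numpy.polyfit.

```python
import numpy as np

def polyfit_normal_equations(x, y, k):
    """Solve the normal equations for the degree-k polynomial coefficients."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    A = np.empty((k + 1, k + 1))
    b = np.empty(k + 1)
    for m in range(k + 1):
        for j in range(k + 1):
            A[m, j] = np.sum(x ** (j + m))  # sum_i x_i^(j+m)
        b[m] = np.sum(y * x ** m)           # sum_i y_i x_i^m
    return np.linalg.solve(A, b)            # alpha_0, ..., alpha_k

x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [1.1, 2.1, 4.9, 10.2, 16.9]
print(polyfit_normal_equations(x, y, k=2))  # roughly the coefficients of 1 + x^2
```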

Orthogonal Polynomial Regression

This is an extension of polynomial regression in the sense that, instead of using \inline x^0, x^1, x^2, \ldots, x^k as the basis functions in the fitting function \inline f, some special kind of functions \inline \phi_0(x), \phi_1(x), \ldots, \phi_k(x) are used, and thus the expression for \inline f becomes:

f(x) = \alpha_0\, \phi_0(x) + \alpha_1\, \phi_1(x) + \ldots + \alpha_k\, \phi_k(x)
More precisely, the functions \inline \phi_0, \phi_1, \ldots\ should be polynomials, listed in increasing order of their degree, with the property that

\int_{x_1}^{x_N} w(x)\, \phi_i(x)\, \phi_j(x)\, dx = 0

for any \inline i \neq j, where \inline w : [x_1, x_N] \to \mathbb{R} is some weight function. In other words, the sequence of polynomial functions \inline \phi_0, \phi_1, \ldots\ is orthogonal with respect to the weight \inline w.

Apart from this, the regression using orthogonal polynomials uses the same idea of minimizing the sum of squared residuals. It is also implemented as the component Regression/Orthogonal.
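
The library component constructs its own orthogonal polynomial basis; as an illustrative stand-in (not the component's algorithm), one can fit in a fixed orthogonal family such as the Legendre polynomials using NumPy:

```python
import numpy as np
from numpy.polynomial import Legendre

# Illustrative data: noisy samples from a sine-like function.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 6.0, 40)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)

# Least squares fit in a basis of Legendre polynomials, which are orthogonal
# on [-1, 1]; Legendre.fit maps the data interval onto [-1, 1] internally.
fit = Legendre.fit(x, y, deg=5)
print(fit.coef)        # alpha_0, ..., alpha_5 in the Legendre basis
print(fit(np.pi / 2))  # evaluate the fitted curve; close to sin(pi/2) = 1
```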

Below is a graph of a series of data points, shown in blue, and their corresponding fitting function, determined by the method of orthogonal polynomial regression.

[Figure orthogonal-378.png: data points (blue) and their orthogonal polynomial regression curve]

We may also use a specific family of orthogonal polynomials, instead of the general construction in the above method. For instance, we can use the Forsythe orthogonal polynomials, leading to the method of Forsythe polynomial regression, available as Regression/Forsythe.

In the image below, we have used Forsythe regression on the same series of data points as in the previous graph.

[Figure forsythe-378.png: Forsythe polynomial regression on the same series of data points]

Nonlinear Regression

As mentioned at the beginning of this reference page, nonlinear regression refers to the case when the fitting function depends nonlinearly on its parameters. Therefore \inline f can be virtually any function with this property. Depending on the experiment from which the data points were obtained, and on the statistical properties of the phenomenon under study, we may choose between various families of nonlinear regression functions.

To name a few nonlinear regression models: logistic regression is given by the fitting function

f(x) = \frac{1}{1 + e^{-\alpha_1 - \alpha_2 x}}
and is also implemented as Regression/Logistic. In the following image you can see the logistic regression curve in red, for the series of data points shown in blue.

[Figure reglogistic-378.png: the logistic regression curve (red) for the data points (blue)]
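
Since no closed-form solution exists in the nonlinear case, the parameters are found iteratively from a starting guess. Here is a minimal sketch using SciPy's curve_fit with the logistic form assumed above (the data and starting guess are illustrative):

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, a1, a2):
    # The two-parameter logistic fitting function written above.
    return 1.0 / (1.0 + np.exp(-a1 - a2 * x))

# Illustrative data: noisy samples of a logistic curve.
rng = np.random.default_rng(1)
x = np.linspace(-4.0, 4.0, 50)
y = logistic(x, 0.5, 1.5) + 0.05 * rng.standard_normal(x.size)

# Nonlinear least squares is iterative, so an initial guess p0 is needed;
# curve_fit refines it (by default with the Levenberg-Marquardt algorithm).
params, _ = curve_fit(logistic, x, y, p0=(0.0, 1.0))
print(params)  # estimates of (alpha1, alpha2), close to (0.5, 1.5)
```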

Other nonlinear regression models include the Gompertz model, given by the fitting function

f(x) = \alpha_1\, e^{-\alpha_2 e^{-\alpha_3 x}}

or the Richards model, given by:

f(x) = \frac{\alpha_1}{\left( 1 + \alpha_2\, e^{-\alpha_3 x} \right)^{1/\alpha_4}}
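
Assuming the parameterizations written above, both models plug into the same iterative nonlinear least squares machinery; a short illustrative sketch (made-up data and starting guesses):

```python
import numpy as np
from scipy.optimize import curve_fit

def gompertz(x, a1, a2, a3):
    # Gompertz curve, in the parameterization assumed above.
    return a1 * np.exp(-a2 * np.exp(-a3 * x))

def richards(x, a1, a2, a3, a4):
    # Richards (generalised logistic) curve, as parameterized above.
    return a1 / (1.0 + a2 * np.exp(-a3 * x)) ** (1.0 / a4)

# Illustrative data and fit for the Gompertz model.
rng = np.random.default_rng(2)
x = np.linspace(0.0, 10.0, 60)
y = gompertz(x, 2.0, 5.0, 0.8) + 0.02 * rng.standard_normal(x.size)
params, _ = curve_fit(gompertz, x, y, p0=(1.0, 1.0, 1.0))
print(params)  # estimates close to (2.0, 5.0, 0.8)
```
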
Lucian Bentea (September 2008)
