Probstat/notes/regression

This is part of probstat.

In this section, we shall discuss linear regression.  We shall focus on one-variable linear regression.

== Model ==

We consider two variables <math>x</math> and <math>y</math>, where <math>y</math> is a function of <math>x</math>.  We refer to <math>x</math> as the independent (or input) variable and to <math>y</math> as the dependent variable.  We consider a linear relationship between the two variables; that is, we assume that there exist hidden parameters <math>\alpha</math> and <math>\beta</math> such that

<center>
<math>y = \alpha + \beta x + e,</math>
</center>

where <math>e</math> is a random error.  We further assume that the error is unbiased, i.e., <math>\mathrm{E}[e] = 0</math>, and that <math>e</math> is independent of <math>x</math>.

'''Input:''' As an input to the regression process, we are given a set of data points <math>(x_1,y_1),(x_2,y_2),\ldots,(x_n,y_n)</math> generated from the previous equation.

'''Goal:''' We want to estimate <math>\alpha</math> and <math>\beta</math>.

== The least squares estimators ==

Denote our estimate for <math>\alpha</math> by <math>A</math> and our estimate for <math>\beta</math> by <math>B</math>.  Using these estimates, the error at data point <math>(x_i,y_i)</math> is

<center>
<math>y_i - (A + B x_i) = y_i - A - B x_i</math>.
</center>

We shall work with the sum of squared errors, i.e.,

<center>
<math>SS = \sum_{i=1}^n (y_i - A - Bx_i)^2</math>.
</center>

'''The method of least squares''' uses the parameters that minimize the sum of squared errors as estimators.  Therefore, we want to find the <math>A</math> and <math>B</math> that minimize <math>SS</math>.  To do so, we partially differentiate <math>SS</math> with respect to <math>A</math> and <math>B</math>:

<center>
<math>\frac{\partial}{\partial A}SS = -2\sum_{i=1}^n (y_i - A - Bx_i)</math> &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; (Eq1)
</center>

<center>
<math>\frac{\partial}{\partial B}SS = -2\sum_{i=1}^n x_i(y_i - A - Bx_i)</math> &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; (Eq2)
</center>

We set these two derivatives to zero to find the minimum, which gives the following two equations to solve:

<center>
<math>\sum_{i=1}^n y_i = nA + B\sum_{i=1}^n x_i</math>
</center>

<center>
<math>\sum_{i=1}^n x_iy_i = A\sum_{i=1}^n x_i + B\sum_{i=1}^n x_i^2</math>
</center>

Before solving these two equations, let's define

<center>
<math>\bar{y} = \sum_{i=1}^n y_i/n, \ \ \ \  \bar{x} = \sum_{i=1}^n x_i/n.</math>
</center>

We start by rewriting the first equation (Eq1) as

<center>
<math>A = \bar{y} - B\bar{x},</math> &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; (Eq3)
</center>

and substitute it into (Eq2) to get

<center>
<math>\sum_{i=1}^n x_iy_i = (\bar{y} - B\bar{x})\sum_{i=1}^n x_i + B\sum_{i=1}^n x_i^2</math>.
</center>

Since <math>\sum_{i=1}^n x_i = n\bar{x}</math>, the right-hand side equals <math>n\bar{x}\bar{y} - nB\bar{x}^2 + B\sum_{i=1}^n x_i^2</math>; solving for <math>B</math>, we get

<center>
<math>B = \frac{\sum_{i=1}^n x_iy_i -n\bar{x}\bar{y}}{\sum_{i=1}^n x_i^2 - n\bar{x}^2}</math>.
</center>

To find <math>A</math>, we can just use equation (Eq3).

{{กล่องเทา|'''Estimated regression parameters.'''  Using the least squares method, we obtain the following estimates:

<center>
<math>B = \frac{\sum_{i=1}^n x_iy_i -n\bar{x}\bar{y}}{\sum_{i=1}^n x_i^2 - n\bar{x}^2}</math>,
</center>

and

<center>
<math>A = \bar{y} - B\bar{x}</math>.
</center>
}}
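
As a quick illustration, here is a minimal Python sketch (our own addition, not part of the derivation above) that computes <math>A</math> and <math>B</math> directly from these formulas; the data points are made up so that they lie near the line <math>y = 1 + 2x</math>.

<syntaxhighlight lang="python">
# Minimal sketch of the least squares estimates in the box above.

def least_squares(xs, ys):
    """Return (A, B) minimizing SS = sum((y_i - A - B*x_i)^2)."""
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    # B = (sum x_i y_i - n x_bar y_bar) / (sum x_i^2 - n x_bar^2)
    b = (sum(x * y for x, y in zip(xs, ys)) - n * x_bar * y_bar) \
        / (sum(x * x for x in xs) - n * x_bar ** 2)
    a = y_bar - b * x_bar  # Eq3: A = y_bar - B x_bar
    return a, b

# Made-up data lying near y = 1 + 2x.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [3.1, 4.9, 7.2, 8.8, 11.1]
A, B = least_squares(xs, ys)
print(A, B)  # approximately 1.05 and 1.99
</syntaxhighlight>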
 
== Distribution of regression parameters ==

Although <math>A</math> and <math>B</math> are the least-squares estimators, it is not yet clear whether they are good estimators.  In this section, we shall discuss various properties of these estimators.

We make an additional assumption on the error: it is normally distributed with mean 0 and variance <math>\sigma^2</math>.  Therefore,

<center>
<math>y_i \sim Normal(\alpha + \beta x_i, \sigma^2)</math>.
</center>

We shall start with <math>B</math>.  First, note that the <math>x_i</math>'s are inputs and are not random.  Therefore, if we look at the formula for <math>B</math>, we see that <math>B</math> is a linear combination of the independent normal random variables <math>y_1,\ldots,y_n</math>.  This implies that <math>B</math> is itself a normal random variable.  If we can find its mean and its variance, we have complete information about the distribution of <math>B</math>.

We can calculate the expectation and variance of <math>B</math> as follows.

* <math>\mathrm{E}[B] = \beta</math>.
* <math>Var(B) = \frac{\sigma^2}{\sum_{i=1}^n x_i^2 - n\bar{x}^2}</math>.

We can also calculate the expectation and variance of <math>A</math>.

* <math>\mathrm{E}[A] = \alpha</math>.
* <math>Var(A) = \frac{\sigma^2\sum_{i=1}^n x_i^2}{n(\sum_{i=1}^n x_i^2 - n\bar{x}^2)}</math>.
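
These facts can be checked empirically.  The following simulation sketch (our own addition; the values of <math>\alpha</math>, <math>\beta</math>, <math>\sigma</math>, and the inputs <math>x_i</math> are made up) generates many data sets from the model and compares the empirical mean and variance of <math>B</math> with the formulas above.

<syntaxhighlight lang="python">
# Check E[B] = beta and Var(B) = sigma^2 / (sum x_i^2 - n x_bar^2) by simulation.
import random

alpha, beta, sigma = 1.0, 2.0, 0.5
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
n = len(xs)
x_bar = sum(xs) / n
s_xx = sum(x * x for x in xs) - n * x_bar ** 2

bs = []
for _ in range(20000):
    # Draw y_i ~ Normal(alpha + beta * x_i, sigma^2) and estimate B.
    ys = [alpha + beta * x + random.gauss(0, sigma) for x in xs]
    y_bar = sum(ys) / n
    bs.append((sum(x * y for x, y in zip(xs, ys)) - n * x_bar * y_bar) / s_xx)

mean_b = sum(bs) / len(bs)
var_b = sum((b - mean_b) ** 2 for b in bs) / (len(bs) - 1)
print(mean_b)  # should be close to beta = 2.0
print(var_b)   # should be close to sigma**2 / s_xx = 0.025
</syntaxhighlight>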
 
== Statistical tests on regression parameters ==

We focus on how to test the null hypothesis

<center>
<math>H_0: \ \ \ \beta = 0.</math>
</center>

Since <math>B</math> is normal with mean <math>\beta</math> and variance <math>\sigma^2/(\sum_{i=1}^n x_i^2 - n\bar{x}^2)</math>, we know that the statistic

<center>
<math>\frac{B - \beta}{\sigma/\sqrt{\sum_{i=1}^n x_i^2 - n\bar{x}^2}}</math>
</center>

is a unit normal random variable, so it would be possible to perform various statistical tests on <math>\beta</math> based on the estimated value <math>B</math> if we knew the parameter <math>\sigma^2</math>.  However, usually, we do not.

We end up in a situation similar to sampling from a population with unknown variance.  Another key quantity in this case is the sum of squares of the residuals:

<center>
<math>SS_R = \sum_{i=1}^n (y_i - A - Bx_i)^2.</math>
</center>

Note that if we replace <math>A</math> with <math>\alpha</math> and <math>B</math> with <math>\beta</math>, the term <math>y_i - \alpha - \beta x_i</math> is exactly the error at data point <math>i</math> (which is normally distributed with mean 0 and variance <math>\sigma^2</math>).  This motivates the fact that <math>SS_R</math> can be used to estimate <math>\sigma^2</math>:

<center>
<math>\mathrm{E}\left[\frac{SS_R}{n-2}\right] = \sigma^2</math>.
</center>

Moreover, it can be shown that

<center>
<math>\frac{SS_R}{\sigma^2} \sim \chi^2_{n-2}</math>,
</center>

and that <math>SS_R</math> is independent of <math>B</math>.  These two facts imply that

<center>
<math>
\frac{\left(\frac{B - \beta}{\sigma/\sqrt{\sum_{i=1}^n x_i^2 - n\bar{x}^2}}\right)}
{\sqrt{\frac{SS_R}{\sigma^2(n-2)}}}
=
\frac{\left(\frac{B - \beta}{1/\sqrt{\sum_{i=1}^n x_i^2 - n\bar{x}^2}}\right)}
{\sqrt{\frac{SS_R}{(n-2)}}}
=
\frac{B - \beta}{\sqrt{\frac{SS_R}{(n-2)S_{xx}}}}
\sim t_{n-2},
</math>
</center>

where <math>S_{xx} = \sum_{i=1}^n x_i^2 - n\bar{x}^2</math>.

Therefore, if we want to test the hypothesis that <math>\beta = 0</math>, we can check whether the statistic <math>\frac{B - \beta}{\sqrt{\frac{SS_R}{(n-2)S_{xx}}}}</math>, with <math>\beta</math> set to 0, deviates farther than the ''t''-distribution with <math>n-2</math> degrees of freedom would predict.

'''Notes:''' This is fairly similar to the use of the ''t''-distribution for the sample mean when the variance of the population is unknown, where the quantity <math>SS_R/(n-2)</math> acts as the sample variance <math>S^2</math> in that case.
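
To illustrate, here is a sketch of this test in Python (our own addition; the data are the same made-up points as before, and we assume SciPy is available for the ''t''-distribution):

<syntaxhighlight lang="python">
# Sketch of the t-test for H0: beta = 0 using the statistic above.
from scipy.stats import t

xs = [1.0, 2.0, 3.0, 4.0, 5.0]  # made-up data
ys = [3.1, 4.9, 7.2, 8.8, 11.1]
n = len(xs)
x_bar = sum(xs) / n
y_bar = sum(ys) / n

s_xx = sum(x * x for x in xs) - n * x_bar ** 2
b = (sum(x * y for x, y in zip(xs, ys)) - n * x_bar * y_bar) / s_xx
a = y_bar - b * x_bar
ss_r = sum((y - a - b * x) ** 2 for x, y in zip(xs, ys))  # residual sum of squares

# Under H0 (beta = 0), B / sqrt(SS_R / ((n-2) * S_xx)) ~ t_{n-2}.
t_stat = b / (ss_r / ((n - 2) * s_xx)) ** 0.5
p_value = 2 * (1 - t.cdf(abs(t_stat), df=n - 2))
print(t_stat, p_value)  # reject H0 at level 0.05 if p_value < 0.05
</syntaxhighlight>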
