Probstat/notes/parameter estimation

From Theory Wiki
Revision as of 09:23, 6 December 2014

This is part of probstat.

Previously we estimated the population mean and variance using the sample mean and variance. In this section, we shall see why what we did makes sense.

There are many ways to estimate parameters.

Method of moments estimators

See also the Wikipedia article on the method of moments.

These are probably the simplest estimators. However, they are often biased (as we shall show in the example).

Definition: For a random variable <math>X</math>, <math>E[X^k]</math> is called the ''k''-th moment of <math>X</math>. Note that the first moment is the mean <math>E[X]</math>. The variance of a random variable depends on the first and the second moments.

If we want to estimate a parameter <math>\theta</math>, using the method of moments, we start by writing the parameter as a function of the moments, i.e.,

<center><math>\theta = g(E[X], E[X^2], E[X^3], \ldots, E[X^r]).</math></center>

We then estimate the moments with the sample moments

<center><math>\hat{m}_k = \frac{1}{n}\sum_{i=1}^{n} X_i^k,</math></center>

for <math>k = 1, 2, \ldots, r</math>, and plug them into <math>g</math>.
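To make the plug-in recipe concrete, here is a minimal Python sketch. The distribution choice (Uniform on <math>(0, \theta)</math>) and all function names are illustrative assumptions, not from the original notes; the second example also hints at why such estimators can be biased.

```python
import random

def sample_moment(xs, k):
    """The k-th sample moment: (1/n) * sum of x_i^k."""
    return sum(x ** k for x in xs) / len(xs)

# Example 1 (hypothetical choice of distribution): if X ~ Uniform(0, theta),
# then E[X] = theta / 2, so theta = g(E[X]) = 2 * E[X].
# The method-of-moments estimator plugs in the first sample moment.
def mom_uniform_theta(xs):
    return 2 * sample_moment(xs, 1)

# Example 2: the variance is a function of the first two moments,
# Var[X] = E[X^2] - E[X]^2, so its method-of-moments estimator is
# m2_hat - m1_hat^2.  This is the "divide by n" variance estimator,
# which is known to be biased (it underestimates the variance slightly).
def mom_variance(xs):
    return sample_moment(xs, 2) - sample_moment(xs, 1) ** 2

random.seed(0)
theta = 10.0
xs = [random.uniform(0, theta) for _ in range(100_000)]

print(mom_uniform_theta(xs))  # should be close to theta = 10
print(mom_variance(xs))       # should be close to theta^2 / 12 ~ 8.33
```

With a large sample, both estimates land near the true values, illustrating that the plug-in step simply replaces each population moment <math>E[X^k]</math> in <math>g</math> with its sample counterpart.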

Maximum likelihood estimators

Bayes estimators