|
Methods for estimating the coefficients in the simple linear regression model are investigated. The least squares method is a purely mathematical approach, requiring no statistical assumptions other…
|
|
In this lesson, some approximate and asymptotic versions of confidence sets are explored. The purpose here is to illustrate some methods that will be of use in more complicated situations, methods…
|
|
There are several properties to consider when evaluating a point estimator from an asymptotic perspective. In this lesson, the concepts of consistency and efficiency are investigated. As…
|
|
A confidence interval estimator for the shift parameter is found by inverting the CDF of a sufficient statistic for the parameter.
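As a minimal sketch of this idea (the shifted exponential model, sample size, and seed below are illustrative assumptions, not from the lesson): for density f(x) = exp(-(x - theta)) on x >= theta, the minimum X_(1) is sufficient for the shift theta and X_(1) - theta is Exponential with rate n, so inverting that CDF gives an interval.

```python
import numpy as np

# Hypothetical setting: shifted exponential with shift theta.
# X_(1) - theta ~ Exponential(rate = n); invert its CDF for a CI.
rng = np.random.default_rng(3)
theta_true, n = 4.0, 50
x = theta_true + rng.exponential(scale=1.0, size=n)

alpha = 0.05
x_min = x.min()
# Exponential(rate n) quantiles: q_p = -log(1 - p) / n, so
# theta in [x_min - q_{1-alpha/2}, x_min - q_{alpha/2}].
lower = x_min + np.log(alpha / 2) / n
upper = x_min + np.log(1 - alpha / 2) / n
print(lower, upper)
```

Note that both endpoints sit at or below x_min, as they must, since X_(1) can never be smaller than the true shift.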
|
|
A confidence interval estimator for the variance of a normal distribution is found using a pivotal quantity.
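A minimal sketch of the pivotal-quantity construction (the simulated data and seed are illustrative assumptions): the pivot is (n-1)S^2/sigma^2, which has a chi-square distribution with n-1 degrees of freedom regardless of sigma.

```python
import numpy as np
from scipy import stats

# Hypothetical sample from a normal population.
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=30)

n = len(x)
s2 = x.var(ddof=1)  # sample variance S^2
alpha = 0.05

# Pivot: (n-1) * S^2 / sigma^2 ~ chi-square(n-1).
# Inverting P(chi2_{alpha/2} <= pivot <= chi2_{1-alpha/2}) = 1 - alpha
# gives the interval for sigma^2 below.
lower = (n - 1) * s2 / stats.chi2.ppf(1 - alpha / 2, df=n - 1)
upper = (n - 1) * s2 / stats.chi2.ppf(alpha / 2, df=n - 1)
print(lower, upper)
```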
|
|
In this example, finding an upper confidence bound for the mean of a normal population is illustrated. The method used is the inversion of the acceptance region of the UMP test from the Karlin-Rubin…
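A short sketch of the known-variance case (the data, sigma, and seed are illustrative assumptions): inverting the acceptance region of the one-sided UMP level-alpha test of H0: mu >= mu0 versus H1: mu < mu0 yields the upper bound xbar + z_{1-alpha} * sigma / sqrt(n).

```python
import numpy as np
from scipy import stats

# Hypothetical sample with known standard deviation sigma.
sigma = 2.0
rng = np.random.default_rng(2)
x = rng.normal(loc=5.0, scale=sigma, size=25)
alpha = 0.05

# Upper confidence bound from inverting the one-sided UMP test.
upper = x.mean() + stats.norm.ppf(1 - alpha) * sigma / np.sqrt(len(x))
print(upper)
```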
|
|
The likelihood ratio method of hypothesis testing is related to maximum likelihood estimators. Likelihood ratio tests are as widely used as maximum likelihood estimation. In this lesson, we provide…
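As a hedged illustration of a likelihood ratio test (the normal model with known sigma = 1, the data, and the seed are assumptions made here for concreteness): for H0: mu = mu0, the statistic -2 log(lambda) reduces to n*(xbar - mu0)^2 and is approximately chi-square with 1 degree of freedom under H0.

```python
import numpy as np
from scipy import stats

# Hypothetical normal sample with known sigma = 1.
rng = np.random.default_rng(4)
x = rng.normal(loc=0.3, scale=1.0, size=40)
mu0 = 0.0
n, xbar = len(x), x.mean()

# lambda = L(mu0) / L(mu_hat); here -2*log(lambda) = n*(xbar - mu0)^2.
stat = n * (xbar - mu0) ** 2
p_value = stats.chi2.sf(stat, df=1)
print(stat, p_value)
```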
|
|
This example illustrates how to find the Bayes estimator, under squared error loss function, of the rate of a Poisson distribution if the prior distribution on the rate is a gamma distribution.
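A minimal sketch of the conjugate calculation (the prior parameters and data below are illustrative assumptions): a Gamma(a, b) prior on the Poisson rate yields a Gamma(a + sum(x), b + n) posterior, and under squared error loss the Bayes estimator is the posterior mean.

```python
import numpy as np

# Hypothetical gamma prior (shape a, rate b) and Poisson counts.
a, b = 2.0, 1.0
x = np.array([3, 5, 2, 4, 6])
n = len(x)

# Conjugacy: posterior is Gamma(a + sum(x), b + n); the Bayes
# estimator under squared error loss is the posterior mean.
posterior_shape = a + x.sum()
posterior_rate = b + n
bayes_estimate = posterior_shape / posterior_rate
print(bayes_estimate)  # (2 + 20) / (1 + 5) = 22/6
```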
|
|
This example illustrates finding the maximum likelihood estimator of the rate of a Poisson distribution.
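A quick sketch of the result (the counts below are illustrative): the MLE of the Poisson rate is the sample mean, which a direct numerical maximization of the log-likelihood confirms.

```python
import numpy as np
from scipy import stats, optimize

# Hypothetical Poisson counts.
x = np.array([3, 5, 2, 4, 6])

# Closed form: the MLE of the Poisson rate is the sample mean.
mle_closed = x.mean()

# Numerical check: maximize the log-likelihood over the rate.
neg_loglik = lambda lam: -stats.poisson.logpmf(x, lam).sum()
res = optimize.minimize_scalar(neg_loglik, bounds=(1e-6, 50), method="bounded")
print(mle_closed, res.x)
```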
|
|
This example illustrates how to find the maximum likelihood estimator (MLE) of the upper bound of a uniform(0, B) distribution. In this example, calculus cannot be used to find the MLE since the…
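A minimal sketch of the boundary argument (the true bound and seed are illustrative assumptions): the likelihood B^{-n} is decreasing in B on B >= max(x) and zero below it, so the maximum occurs at the boundary B_hat = max(x), where the likelihood is not differentiable.

```python
import numpy as np

# Hypothetical uniform(0, B) sample; the likelihood B**(-n) is maximized
# at the smallest admissible B, namely the sample maximum.
rng = np.random.default_rng(1)
B_true = 10.0
x = rng.uniform(0, B_true, size=100)
B_hat = x.max()
print(B_hat)
```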
|
|
This lesson reviews point estimation from a frequentist perspective. Method of moments and maximum likelihood methods are reviewed. Following that is a discussion of decision theory, optimal…
|
|
In the previous lesson on best unbiased estimators and the Cramér-Rao Lower Bound, the concept of sufficiency was not used. We will now consider how sufficiency is a powerful tool in our search for…
|
|
As we saw in the previous lesson, a comparison of estimators based on the mean squared error (MSE) may not yield a clear favorite. It turns out there is no "one best MSE" estimator. The…
|
|
The Bayesian approach to statistics is fundamentally different from the classical approach that we have been discussing. However, some aspects of the Bayesian approach can be quite helpful to other…
|
|
The method of maximum likelihood is, by far, the most popular technique for deriving estimators. It is based on finding a function of the data at which the likelihood achieves a global maximum. In…
|
|
An overview of the next two weeks' lessons is provided. Following that, a brief overview of point estimation is discussed.
|
|
In this lesson, the effect of allowing the sample size n to increase to infinity is considered. Although this idea is not practical, it does provide useful approximations for the finite-sample case.…
|
|
When a random sample is drawn, some summary of the values is usually computed. Any well-defined summary may be expressed mathematically as a function of an n-dimensional vector. The domain of that…
|