STA 5352 "Theory of Statistics I" is the first course in a two-semester sequence on the theory of statistics for the Ph.D. in Statistics at Baylor University. The course is offered by the Department of Statistical Science. Course topics are the theory of probability and random variables, distribution and density functions, expectation, point and interval estimation, and sufficiency.
-
In previous lessons, evaluations of point estimators have been based on their mean squared error (MSE) performance. MSE is a special case of a function called a loss…
Loss Function Optimality
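For concreteness, the standard setup (in the usual notation, not necessarily the lesson's own) measures the penalty of using value a when the true parameter is theta by a loss function L(theta, a), and evaluates an estimator W by its risk, the expected loss:
\[ R(\theta, W) = E_\theta\, L\bigl(\theta, W(X)\bigr); \quad \text{with squared-error loss } L(\theta, a) = (a - \theta)^2, \ R(\theta, W) = \mathrm{MSE}(W). \]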
-
In the previous lesson on best unbiased estimators and the Cramér-Rao Lower Bound, the concept of sufficiency was not used. We will now consider how sufficiency is a…
Sufficiency and Unbiasedness
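A central result in this area (and presumably in this lesson) is the Rao-Blackwell theorem, stated here for reference in standard notation: if W is unbiased for tau(theta) and T is a sufficient statistic, then
\[ \phi(T) = E(W \mid T) \]
is also unbiased for tau(theta) and satisfies \( \mathrm{Var}_\theta\, \phi(T) \le \mathrm{Var}_\theta\, W \) for all theta, so conditioning on a sufficient statistic never hurts.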
-
As we saw in the previous lesson, a comparison of estimators based on the mean squared error (MSE) may not yield a clear favorite. It turns out there is no "one…
Best Unbiased Estimator
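For reference, the Cramér-Rao inequality (stated under the usual regularity conditions, in standard notation) bounds the variance of any estimator W(X) with \( E_\theta\, W(X) = \tau(\theta) \):
\[ \mathrm{Var}_\theta\, W(X) \;\ge\; \frac{[\tau'(\theta)]^2}{E_\theta\!\left[\left(\tfrac{\partial}{\partial\theta} \log f(X \mid \theta)\right)^{\!2}\right]}, \]
the denominator being the Fisher information.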
-
The first two criteria for evaluating point estimators are the bias and the mean squared error of the estimator. The goal is to choose the estimator…
Evaluating Point Estimators: Bias and Mean Squared Error
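In standard notation (a reference statement, not the lesson's own wording): for an estimator W of theta, the bias is \( \mathrm{Bias}_\theta\, W = E_\theta\, W - \theta \), and the mean squared error decomposes as
\[ \mathrm{MSE}(W) = E_\theta (W - \theta)^2 = \mathrm{Var}_\theta\, W + (\mathrm{Bias}_\theta\, W)^2, \]
so MSE penalizes both variability and systematic error.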
-
The Bayesian approach to statistics is fundamentally different from the classical approach that we have been discussing. However, some aspects of the Bayesian approach…
Bayes Estimators
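In the standard notation, given a prior pi(theta) and a sampling model f(x | theta), Bayesian inference is based on the posterior distribution
\[ \pi(\theta \mid x) = \frac{f(x \mid \theta)\, \pi(\theta)}{\int f(x \mid t)\, \pi(t)\, dt}; \]
under squared-error loss, the Bayes estimator is the posterior mean E(theta | x).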
-
The method of maximum likelihood is, by far, the most popular technique for deriving estimators. It is based on finding a function of the data at which the likelihood…
Maximum Likelihood Estimators
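For an iid sample x_1, …, x_n from f(x | theta), the standard definition reads: the maximum likelihood estimate is the parameter value at which the likelihood attains its maximum,
\[ \hat{\theta} = \arg\max_{\theta}\, L(\theta \mid \mathbf{x}), \qquad L(\theta \mid \mathbf{x}) = \prod_{i=1}^{n} f(x_i \mid \theta). \]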
-
The method of moments is thought to be one of the oldest, if not the oldest, methods for finding point estimators. First introduced in 1887 by Chebyshev in his proof on the…
Method of Moments Estimators
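In its standard form (the notation here is the usual one): for an iid sample with k unknown parameters, the method equates the first k sample moments to their population counterparts and solves for theta:
\[ \frac{1}{n} \sum_{i=1}^{n} X_i^{\,j} \;=\; E_\theta\, X^{\,j}, \qquad j = 1, \dots, k. \]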
-
An overview of the next two weeks' lessons is provided. Following that, point estimation is briefly introduced.
Introduction to Point Estimation
-
The previous two lessons both describe data reduction principles in the following way. A function T(x) of the sample is specified, and the principle states that if x…
The Equivariance Principle
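Spelled out in the usual way, that common schema reads: a statistic T is specified, and the principle asserts that
\[ T(\mathbf{x}) = T(\mathbf{y}) \;\Longrightarrow\; \text{the same inference about } \theta \text{ should be drawn from } \mathbf{x} \text{ as from } \mathbf{y}. \]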
-
In this lesson, an important function in statistics, called the likelihood function, is introduced. This function can also be used to summarize data. There are many ways…
The Likelihood Principle
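In standard notation, for observed data x the likelihood function is the joint density or mass function viewed as a function of the parameter,
\[ L(\theta \mid \mathbf{x}) = f(\mathbf{x} \mid \theta), \]
with x held fixed at its observed value.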
-
In this lesson, we briefly review the concept of sufficiency and quickly move to minimal sufficiency. Following this is a brief discussion of ancillary statistics. We…
Minimal Sufficiency, Ancillarity, and Completeness
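A standard characterization of minimal sufficiency (stated here for reference, in the usual notation): T(X) is minimal sufficient if, for every pair of sample points x and y, the ratio
\[ \frac{f(\mathbf{x} \mid \theta)}{f(\mathbf{y} \mid \theta)} \]
is constant as a function of theta exactly when T(x) = T(y).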
-
The information in a random sample is used to make inferences about an unknown parameter theta. If the sample size n is large, then the observed sample is a long list of…
Data Reduction and Sufficiency
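The workhorse tool here is the factorization theorem, in its standard form: T(X) is sufficient for theta if and only if the joint density factors as
\[ f(\mathbf{x} \mid \theta) = g\bigl(T(\mathbf{x}) \mid \theta\bigr)\, h(\mathbf{x}) \]
for some functions g and h, with h not depending on theta.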
-
The previous lesson on convergence concepts primarily focused on results as they apply to the sample mean or to a standardized random variable having a limiting normal…
The Delta Method
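The basic result, in its standard univariate form: if \( \sqrt{n}\,(Y_n - \theta) \xrightarrow{d} N(0, \sigma^2) \) and g is differentiable at theta with \( g'(\theta) \ne 0 \), then
\[ \sqrt{n}\,\bigl(g(Y_n) - g(\theta)\bigr) \xrightarrow{d} N\bigl(0, \sigma^2\, [g'(\theta)]^2\bigr). \]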
-
In this lesson, the effect of allowing the sample size n to increase to infinity is considered. Although this idea is not practical, it does provide useful…
Convergence Concepts
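One representative definition (in standard notation): a sequence X_1, X_2, … converges in probability to a random variable X if, for every epsilon > 0,
\[ \lim_{n \to \infty} P\bigl(|X_n - X| \ge \varepsilon\bigr) = 0; \]
the weak law of large numbers, with the sample mean converging to the population mean, is the canonical example.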
-
Sample values such as the smallest, largest, or middle observations from a random sample can provide additional summary information. The minimum, maximum, or median are…
Order Statistics
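For an iid sample of size n with pdf f and cdf F, the density of the j-th order statistic X_(j) has the standard form
\[ f_{X_{(j)}}(x) = \frac{n!}{(j-1)!\,(n-j)!}\; f(x)\, [F(x)]^{j-1}\, [1 - F(x)]^{n-j}, \]
which specializes to the minimum (j = 1), the maximum (j = n), and the median.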