In this final topic, we consider using the estimated regression line to estimate the mean response at a given value of the explanatory (x) variable. The standard error of this estimated mean…
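The usual standard error of the estimated mean response at x = x0 in simple linear regression is s·sqrt(1/n + (x0 − x̄)²/Sxx), where s is the residual standard deviation. A minimal sketch of that computation (my own illustration, not code from the lesson; the function name is mine):

```python
from math import sqrt

def se_mean_response(x, y, x0):
    """Standard error of the estimated mean response at x0 in
    simple linear regression (standard formula; illustrative)."""
    n = len(x)
    xbar = sum(x) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    # least-squares slope and intercept
    b1 = sum((xi - xbar) * yi for xi, yi in zip(x, y)) / sxx
    b0 = sum(y) / n - b1 * xbar
    # residual standard deviation, n - 2 degrees of freedom
    sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
    s = sqrt(sse / (n - 2))
    return s * sqrt(1 / n + (x0 - xbar) ** 2 / sxx)

print(se_mean_response([1, 2, 3, 4], [2, 3, 5, 6], 2.5))
```

Note that the standard error is smallest at x0 = x̄ and grows as x0 moves away from the center of the data.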
Up until now, we have modeled random variables with a probability mass function or a probability density function. We then discussed in detail the theory behind inference on the unknown parameters. …
This example illustrates how to find the Bayes estimator, under squared error loss function, of the rate of a Poisson distribution if the prior distribution on the rate is a gamma distribution.
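The standard conjugacy result behind this example: with a Gamma(α, β) prior on the rate (rate parameterization of the gamma) and Poisson counts x₁, …, xₙ, the posterior is Gamma(α + Σxᵢ, β + n), so the Bayes estimator under squared error loss is the posterior mean. A quick sketch (the function name is mine):

```python
def bayes_estimate_poisson_rate(data, alpha, beta):
    """Bayes estimator (posterior mean) of a Poisson rate under
    squared error loss, with a Gamma(alpha, beta) prior where
    beta is the rate parameter of the gamma."""
    n = len(data)
    return (alpha + sum(data)) / (beta + n)

# With a Gamma(2, 1) prior and observed counts [3, 1, 4]:
print(bayes_estimate_poisson_rate([3, 1, 4], alpha=2.0, beta=1.0))  # 2.5
```

As β + n grows, the estimator shrinks the sample mean toward the prior mean α/β less and less, which is the usual interpretation of the conjugate update.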
The first two criteria for evaluating point estimators are the bias of the estimator and the mean squared error of the estimator. The goal is to choose the estimator with the smallest mean squared…
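Both criteria can be approximated by simulation. A hedged sketch (not from the lesson; names are mine) that estimates bias and MSE by Monte Carlo, illustrated on the sample mean of Uniform(0, 1) data:

```python
import random

def bias_and_mse(estimator, theta, sampler, n, reps=50_000, seed=1):
    """Monte Carlo approximation of the bias and MSE of `estimator`
    when the true parameter is `theta` and samples of size n are
    drawn by calling `sampler(rng)`."""
    rng = random.Random(seed)
    ests = [estimator([sampler(rng) for _ in range(n)])
            for _ in range(reps)]
    bias = sum(ests) / reps - theta
    mse = sum((e - theta) ** 2 for e in ests) / reps
    return bias, mse

# Sample mean of Uniform(0, 1) observations, true theta = 0.5:
bias, mse = bias_and_mse(lambda xs: sum(xs) / len(xs),
                         0.5, lambda rng: rng.random(), n=10)
print(bias, mse)   # bias near 0, MSE near (1/12)/10
```

The sample mean is unbiased here, so its MSE reduces to its variance, (1/12)/n; in general MSE = variance + bias².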
In previous lessons, properties of samples and of statistics computed on a random sample were discussed under a very general framework; in other words, under the idea that a random sample is selected…
When a random sample is drawn, some summary of the values is usually computed. Any well-defined summary may be expressed mathematically as a function of an n-dimensional vector. The domain of that…
We discuss three techniques for constructing families of distributions. The resulting families have ready physical interpretations that make them useful for modeling as well as convenient…
The beta distribution is the last continuous distribution we will discuss. Like the gamma distribution, the beta distribution earns its name from being associated with the beta function. The beta…
In this lesson, we learn about the most important distribution in statistics: the normal distribution. We investigate the probability density function and cumulative distribution function and the…
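For reference, the PDF and CDF of a Normal(μ, σ²) distribution can be written with only the standard library, using `math.erf` for the CDF (my own illustration, not the lesson's code):

```python
from math import erf, exp, pi, sqrt

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of the Normal(mu, sigma^2) distribution."""
    z = (x - mu) / sigma
    return exp(-z * z / 2) / (sigma * sqrt(2 * pi))

def normal_cdf(x, mu=0.0, sigma=1.0):
    """CDF of Normal(mu, sigma^2), via the error function:
    Phi(z) = (1 + erf(z / sqrt(2))) / 2."""
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

print(normal_cdf(1.96))   # about 0.975, the familiar critical value
```

The CDF has no closed form in elementary functions, which is why tables or `erf` are used in practice.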
The gamma family of distributions is a very special family that includes many distributions as special cases. In this lesson, we begin with the gamma function. We then introduce the gamma…
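A quick numerical illustration (my own, not the lesson's) of the gamma function's two most-used properties: Γ(n) = (n − 1)! for positive integers n, and Γ(1/2) = √π:

```python
from math import gamma, factorial, sqrt, pi

# Gamma(n) = (n - 1)! for positive integers n:
for n in range(1, 6):
    print(n, gamma(n), factorial(n - 1))

# Gamma(1/2) = sqrt(pi):
print(gamma(0.5), sqrt(pi))
```

These two facts drive most gamma-distribution computations: the first gives the normalizing constants for integer shape parameters, and the second connects the gamma family to the normal and chi-squared distributions.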
An overview of the uniform distribution is given, including its probability density function (PDF) and cumulative distribution function (CDF). The mean, variance, and moment generating function are…
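For Uniform(a, b), the mean is (a + b)/2 and the variance is (b − a)²/12. A quick numerical check of these standard formulas by integrating against the density f(x) = 1/(b − a) (my own sketch, not the lesson's):

```python
def uniform_moments(a, b, steps=200_000):
    """Midpoint-rule integrals of x*f(x) and (x - mean)^2 * f(x)
    for the Uniform(a, b) density f(x) = 1/(b - a)."""
    h = (b - a) / steps
    mids = [a + (i + 0.5) * h for i in range(steps)]
    mean = sum(x * h / (b - a) for x in mids)
    var = sum((x - mean) ** 2 * h / (b - a) for x in mids)
    return mean, var

m, v = uniform_moments(2.0, 5.0)
print(m, v)   # close to (2 + 5)/2 = 3.5 and (5 - 2)^2 / 12 = 0.75
```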
In this lesson, a brief review of the general properties of continuous distributions is provided. The lesson ends with a comparison of the properties of continuous and discrete distributions.
The Poisson distribution is different from all of the discrete distributions we have considered up until this point. Instead of counting the number of "successes" or counting the number of…
The experiment that is used to illustrate when the hypergeometric distribution arises is that of an urn with N balls of two colors: M "white" balls and N - M "black" balls. The…
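The PMF this urn experiment leads to is P(X = k) = C(M, k)·C(N − M, n − k) / C(N, n), the probability of drawing exactly k white balls when n balls are drawn without replacement. A small sketch (the function name is mine):

```python
from math import comb

def hypergeom_pmf(k, N, M, n):
    """P(X = k): exactly k white balls in a draw of n balls, without
    replacement, from an urn with M white and N - M black balls."""
    return comb(M, k) * comb(N - M, n - k) / comb(N, n)

# Urn with N = 10 balls, M = 4 white; draw n = 3:
print(hypergeom_pmf(2, N=10, M=4, n=3))  # 36/120 = 0.3
```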
Like the binomial and hypergeometric distributions, the geometric distribution is related to the Bernoulli(p) distribution. Unlike the binomial and hypergeometric distributions, the geometric…
The binomial distribution is introduced through its unique connection to the Bernoulli distribution via what is called a Bernoulli process. The probability mass function is derived and proven to…
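The resulting PMF is P(X = k) = C(n, k)·p^k·(1 − p)^(n−k), which sums to 1 by the binomial theorem. A sketch of the PMF with a numerical check of that fact (an illustration, not the lesson's derivation):

```python
from math import comb

def binom_pmf(k, n, p):
    """P(X = k): exactly k successes in n independent
    Bernoulli(p) trials."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

# The PMF sums to 1 over k = 0, ..., n (binomial theorem):
print(sum(binom_pmf(k, 10, 0.3) for k in range(11)))
```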
The Bernoulli and binomial distributions are used for modeling a dichotomous experiment or a sequence of independent dichotomous experiments, respectively. Alone, both distributions apply to a wide…
The discrete uniform distribution is one of the simplest discrete distributions. In this lesson, the distribution is defined via its probability mass function. The cumulative distribution function…
In this lesson, a brief review of the general properties of discrete distributions is provided.
In this lesson, moments, central moments, the mean and variance of a distribution are defined and illustrated.
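These definitions are straightforward to compute directly for a discrete distribution: the k-th moment is E[X^k] and the k-th central moment is E[(X − μ)^k]. A small sketch (function names are mine), illustrated on a fair die:

```python
def moment(pmf, k):
    """k-th (raw) moment E[X^k] of a discrete distribution
    given as a dict {value: probability}."""
    return sum((x ** k) * p for x, p in pmf.items())

def central_moment(pmf, k):
    """k-th central moment E[(X - mu)^k], where mu is the mean."""
    mu = moment(pmf, 1)
    return sum(((x - mu) ** k) * p for x, p in pmf.items())

# A fair die: mean 3.5; the variance is the second central moment, 35/12.
die = {x: 1 / 6 for x in range(1, 7)}
print(moment(die, 1), central_moment(die, 2))
```

The first central moment is always 0, and the second central moment is the variance, which is how the general definitions specialize to the familiar quantities.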