Answer:
the difference between the estimator and the population parameter grows smaller as the sample size grows larger.
Step-by-step explanation:
In Statistics, an estimator is a statistic, that is, a value computed from sample data, which is used to estimate a population parameter.
Generally, parameters determine the probability distribution of a population. For example, a normal distribution is completely specified by two parameters: the population mean and variance.
An estimator is said to be consistent if the difference between the estimator and the population parameter grows smaller as the sample size grows larger. In practice, it is enough that both the bias and the variance of the estimator shrink toward zero as the sample size increases.
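As a quick illustration (using the sample mean as an assumed example, since the question does not name a particular estimator): for n independent observations drawn from a population with mean [tex]\mu[/tex] and variance [tex]\sigma^{2}[/tex], the sample mean [tex]\bar{X}[/tex] satisfies
[tex]E(\bar{X}) = \mu \quad \text{and} \quad Var(\bar{X}) = \frac{\sigma^{2}}{n}[/tex]
so its bias is zero and its variance goes to zero as n grows larger, which is exactly the behaviour a consistent estimator must show.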
Also, note that the bias of an estimator (b) that estimates a parameter (p) is given by: [tex]E(b) - p[/tex]
Hence, an unbiased estimator is an estimator whose expected value is equal to the parameter, i.e., the value of its bias is equal to zero (0).
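Restating this with the notation above (a direct consequence of the definitions, not an extra assumption), the estimator b is unbiased for p exactly when
[tex]E(b) = p \iff E(b) - p = 0[/tex]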
The sample variance is an unbiased estimator of the population variance, and the sample mean is an unbiased estimator of the population mean (both formulas are written out below).
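For reference, the usual textbook forms of these two estimators for a sample [tex]X_{1}, X_{2}, \ldots , X_{n}[/tex] (assuming the standard n - 1 divisor for the sample variance) are
[tex]\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_{i} \qquad \text{and} \qquad s^{2} = \frac{1}{n-1}\sum_{i=1}^{n}\left(X_{i} - \bar{X}\right)^{2}[/tex]
and it is precisely the n - 1 divisor that makes [tex]s^{2}[/tex] unbiased; dividing by n instead would give an estimator with a small negative bias.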
In short, a consistent estimator in statistics is one whose values get arbitrarily close to the true population value as the sample size increases.