Let X₁, X₂, ..., Xₙ be a random sample from N(μ, σ²), where the mean θ = μ satisfies −∞ < θ < ∞ and σ² is a known positive number. Show that the maximum likelihood estimator for θ is θ̂ = X̄.


Answer:

[tex] l'(\theta) = \frac{1}{\sigma^2} \sum_{i=1}^n (X_i -\theta)[/tex]

The maximum occurs where [tex] l'(\theta) = 0[/tex], which holds if and only if:

[tex] \hat \theta = \bar X[/tex]

Step-by-step explanation:

For this case we have a random sample [tex] X_1, X_2, \dots, X_n[/tex] where [tex]X_i \sim N(\mu=\theta, \sigma^2)[/tex] with [tex]\sigma^2[/tex] known and fixed. We want to show that the maximum likelihood estimator for [tex]\theta[/tex] is [tex]\hat\theta = \bar X[/tex].

The first step is to obtain the probability density function of the random variable. Each [tex]X_i , i=1,\dots,n[/tex] has the following density function:

[tex] f(x_i \mid \theta,\sigma^2) = \frac{1}{\sqrt{2\pi}\sigma} e^{-\frac{(x_i-\theta)^2}{2\sigma^2}}, \quad -\infty < x_i < \infty[/tex]

The likelihood function is given by:

[tex] L(\theta) = \prod_{i=1}^n f(x_i)[/tex]

Using the independence of the sample and substituting the density function, we have:

[tex] L(\theta) = (\frac{1}{\sqrt{2\pi \sigma^2}})^n exp (-\frac{1}{2\sigma^2} \sum_{i=1}^n (X_i-\theta)^2)[/tex]
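As a quick numerical sanity check (an illustration, not part of the proof), a short Python sketch can confirm that the product of the individual normal densities equals this single combined exponential form. The sample values and parameters below are arbitrary choices for the illustration:

```python
import numpy as np

# Arbitrary illustrative values: sigma is the "known" standard deviation,
# theta is one candidate value of the unknown mean.
rng = np.random.default_rng(0)
sigma, theta = 2.0, 1.0
x = rng.normal(theta, sigma, size=10)
n = x.size

def norm_pdf(xi, theta, sigma):
    """Density of N(theta, sigma^2) evaluated at xi."""
    return np.exp(-(xi - theta)**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

# Product of the n individual densities ...
product_form = np.prod([norm_pdf(xi, theta, sigma) for xi in x])

# ... equals the combined exponential form of L(theta) given above.
combined_form = (2 * np.pi * sigma**2)**(-n / 2) \
    * np.exp(-np.sum((x - theta)**2) / (2 * sigma**2))

print(np.isclose(product_form, combined_form))  # True
```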

Taking the natural log on both sides we get:

[tex] l(\theta) = -\frac{n}{2} \ln(2\pi\sigma^2) - \frac{1}{2\sigma^2} \sum_{i=1}^n (X_i -\theta)^2[/tex]

Now, taking the derivative with respect to [tex]\theta[/tex], we obtain:

[tex] l'(\theta) = \frac{1}{\sigma^2} \sum_{i=1}^n (X_i -\theta)[/tex]

The maximum occurs where [tex] l'(\theta) = 0[/tex] (and it is indeed a maximum, since [tex] l''(\theta) = -n/\sigma^2 < 0[/tex]), which holds if and only if:

[tex] \hat \theta = \bar X[/tex]
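To close, here is a small numerical illustration (not part of the derivation): maximizing the log-likelihood over a fine grid of candidate [tex]\theta[/tex] values recovers the sample mean, exactly as the calculus above predicts. The sample size, true mean, and grid below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 2.0                            # known standard deviation
x = rng.normal(5.0, sigma, size=1000)  # sample drawn with true mean 5.0

def log_likelihood(theta, x, sigma):
    """l(theta) = -(n/2) ln(2*pi*sigma^2) - (1/(2 sigma^2)) sum (x_i - theta)^2."""
    n = x.size
    return -0.5 * n * np.log(2 * np.pi * sigma**2) \
           - np.sum((x - theta)**2) / (2 * sigma**2)

# Maximize l(theta) over a fine grid of candidate values.
grid = np.linspace(x.mean() - 1.0, x.mean() + 1.0, 20001)
values = np.array([log_likelihood(t, x, sigma) for t in grid])
theta_hat = grid[np.argmax(values)]

print(abs(theta_hat - x.mean()) < 1e-3)  # True: the maximizer is the sample mean
```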