The length of needles produced by a machine has a standard deviation of 0.04 inches. Assuming that the distribution is normal, how large a sample is needed to determine the mean length of the produced needles to within ±0.005 inches with 98% confidence?


Answer:

The sample size is  [tex]n = 347[/tex]

Step-by-step explanation:

From the question we are told that

      The standard deviation is  [tex]\sigma = 0.04 \ inches[/tex]

       The  precision is [tex]d = \pm 0.005 \ inches[/tex]

        The confidence level is [tex]C = 98\%[/tex]

Generally, the sample size is mathematically represented as

        [tex]n = \frac{ Z_{\frac{\alpha }{2} }^2 \cdot \sigma^2 }{d^2}[/tex]

Where  [tex]\alpha[/tex]  is the level of significance, obtained from the confidence level as

        [tex]\alpha = (100 - 98)\% = 2\% = 0.02[/tex]

and  [tex]Z_{\frac{\alpha }{2} }[/tex]  is the critical value for  [tex]\frac{\alpha}{2} = 0.01[/tex], which is obtained from the standard normal distribution table as  2.326
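
As a quick numerical check (an added sketch, not part of the table lookup), the critical value can be computed in Python using scipy:

    from scipy.stats import norm

    alpha = 0.02                 # significance level for 98% confidence
    z = norm.ppf(1 - alpha / 2)  # upper alpha/2 quantile of the standard normal
    print(round(z, 3))           # 2.326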

  Substituting these values gives

         [tex]n = \frac{2.326^2 \times 0.04^2}{0.005^2} \approx 346.3[/tex]

  Rounding up to the next whole unit,

        [tex]n = 347[/tex]
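
For completeness, here is a minimal Python sketch of the whole sample-size calculation (scipy and the variable names are assumptions made for illustration):

    import math
    from scipy.stats import norm

    sigma = 0.04   # standard deviation of needle length (inches)
    d = 0.005      # required precision / margin of error (inches)
    conf = 0.98    # confidence level

    z = norm.ppf(1 - (1 - conf) / 2)  # critical value, about 2.326
    n = (z * sigma / d) ** 2          # raw sample size, about 346.3
    print(math.ceil(n))               # round up -> 347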