PLEASE HELP! I WILL GIVE 50 POINTS!!! A radar gun measured the speed of a baseball at 92 miles per hour. If the baseball was actually going 90.3 miles per hour, what was the percent error in this measurement?

Answer:

Step-by-step explanation:

A radar gun measured the speed of a baseball at 92 miles per hour, but the baseball was actually going 90.3 miles per hour.

The percent error is approximately 1.88%.

[tex]\text{percent error} = \left| \frac{\text{measured} - \text{actual}}{\text{actual}} \right| \times 100\% = \left| \frac{92 - 90.3}{90.3} \right| \times 100\% = \frac{1.7}{90.3} \times 100\% \approx 1.88\%[/tex]
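If you want to check the arithmetic, here is a small Python sketch of the same percent-error formula (the variable names are just illustrative):

```python
# Percent error = |measured - actual| / actual * 100
measured = 92.0   # radar gun reading, mph
actual = 90.3     # true speed, mph

percent_error = abs(measured - actual) / actual * 100
print(round(percent_error, 2))  # → 1.88
```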