Explain why, if a runner completes a 6.2-mi race in 34 min, then he must have been running at exactly 10 mi/hr at least twice in the race. Assume the runner's speed at the finish line is zero.


Answer:

Every speed between 0 mi/h and about 11 mi/h was reached. Because the runner's speed was 0 mi/h at both the start and the finish, while his average speed was about 11 mi/h, the speed of 10 mi/hr must have been reached at least twice in the race.

Step-by-step explanation:

Given,

Total distance covered = 6.2 miles,

Time taken = 34 minutes = [tex]\frac{34}{60}[/tex] hours

( 1 hour = 60 minutes )

Since,

[tex]Speed = \frac{Distance}{Time}[/tex]

Thus, the speed of the runner = [tex]\frac{6.2}{\frac{34}{60}}[/tex]

[tex]=\frac{6.2\times 60}{34}[/tex]

[tex]=\frac{372}{34}[/tex]

= 10.9411764706

≈ 11 miles per hour

Thus, the average speed of the runner is approximately 11 miles per hour.
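
As a quick numerical check of this arithmetic, here is a minimal Python sketch (the variable names are illustrative, not part of the original problem):

```python
# Average speed = total distance / total time, with time converted to hours.
distance_miles = 6.2        # race distance
time_hours = 34 / 60        # 34 minutes expressed in hours

average_speed = distance_miles / time_hours
print(round(average_speed, 4))   # 10.9412 mi/hr, about 11 mi/hr
```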

By the Mean Value Theorem (MVT),

there is at least one instant during the race at which the runner's instantaneous speed equals his average speed of about 10.94 mi/hr.
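
Written out (the position function s(t) is introduced here only for clarity and is not named in the original answer): if s(t) is the runner's position in miles after t hours, the MVT gives a time c in (0, 34/60) with

[tex]s'(c)=\frac{s\left(\frac{34}{60}\right)-s(0)}{\frac{34}{60}-0}=\frac{6.2}{\frac{34}{60}}\approx 10.94\ \text{mi/hr}[/tex]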

By the Intermediate Value Theorem (IVT),

since speed varies continuously with time, every speed between 0 mi/h and about 10.94 mi/h was reached. The speed was 0 mi/h at the start and at the finish, and about 10.94 mi/h at some instant in between, so the speed of 10 mi/hr was reached at least twice in the race: once while speeding up toward the average speed and once while slowing back down to zero.
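
To make the IVT picture concrete, the sketch below uses a purely hypothetical speed profile (the runner's actual speed curve is unknown): it starts and ends at 0 mi/h, is scaled so the total distance works out to 6.2 miles, and it crosses 10 mi/h exactly twice.

```python
import numpy as np

T = 34 / 60        # race duration in hours
distance = 6.2     # race distance in miles

# Hypothetical continuous speed profile (mi/h): zero at the start and finish,
# scaled so the area under the curve (distance run) equals 6.2 miles.
# This is NOT the runner's real speed; it only illustrates the IVT argument.
t = np.linspace(0, T, 2001)
peak = distance * np.pi / (2 * T)          # about 17.2 mi/h
speed = peak * np.sin(np.pi * t / T)

# Trapezoidal estimate of the distance covered by this profile (about 6.2 mi).
dist_check = float(np.sum(0.5 * (speed[:-1] + speed[1:]) * np.diff(t)))
print(round(dist_check, 3))

# Each sign change of (speed - 10) is a crossing of 10 mi/h:
# once on the way up, once on the way back down.
crossings = int(np.count_nonzero(np.diff(np.sign(speed - 10.0))))
print(crossings)   # 2
```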

The average speed of the runner is his total distance covered divided by the time taken.

See below for proof

The given parameters are:

[tex]\mathbf{Distance = 6.2\ miles}[/tex]

[tex]\mathbf{Time = 34\ minutes}[/tex]

The average speed is:

[tex]\mathbf{Speed = \frac{Distance}{Time}}[/tex]

So, we have:

[tex]\mathbf{Speed = \frac{6.2\ miles}{34\ minutes}}[/tex]

Express time as hours

[tex]\mathbf{Speed = \frac{6.2\ miles}{34/60\ hour}}[/tex]

This gives

[tex]\mathbf{Speed = \frac{6.2 \times 60}{34}\ mile/hour}[/tex]

[tex]\mathbf{Speed = \frac{372}{34}\ mile/hour}[/tex]

[tex]\mathbf{Speed \approx 10.94\ mile/hour}[/tex]

The average speed is about 10.94 miles per hour.

10.94 is greater than 10.

This means the runner's speed, which starts at 0 mile/hour, must rise above 10 mile/hour at some point in order to average about 10.94 mile/hour, and it must drop back to 0 mile/hour at the finish line. Since speed changes continuously, it passes through exactly 10 mile/hour at least twice: once on the way up and once on the way down.
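
A supporting check (not in the original answer, but it follows directly from the numbers above): if the runner never exceeded 10 mile/hour, then in 34 minutes he could cover at most

[tex]\mathbf{10\ \frac{mile}{hour}\times \frac{34}{60}\ hour \approx 5.67\ mile}[/tex]

which is less than the 6.2 miles actually run. So the speed must exceed 10 mile/hour somewhere, and continuity then forces it through 10 mile/hour on the way up and again on the way down.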
