Someone takes a flight from North Carolina to LAX, about 2,780 miles away. If we assume the plane flies at 575 mph, how long will the flight be?

Answer:

About 4.83 hours

Explanation:

We can solve the problem by using the definition of average speed for uniform motion:

[tex]v=\frac{d}{t}[/tex]

where

d is the distance covered

t is the time taken

v is the average speed

For the plane in this problem, we know:

d = 2780 mi is the distance

v = 575 mph is the speed

Solving for t, we find how much time the flight takes:

[tex]t=\frac{d}{v}=\frac{2780\ \text{mi}}{575\ \text{mph}}\approx 4.83\ \text{h}[/tex]
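As a quick check of the arithmetic, here is a minimal sketch of the same calculation in Python (the variable names are illustrative, not from the problem):

```python
# Flight time from distance and average speed: t = d / v
distance_mi = 2780   # distance from North Carolina to LAX, in miles
speed_mph = 575      # assumed average speed, in miles per hour

time_h = distance_mi / speed_mph
print(f"Flight time: {time_h:.2f} hours")  # about 4.83 hours

# Optional: express the fractional hour in minutes (~4 h 50 min)
hours = int(time_h)
minutes = round((time_h - hours) * 60)
print(f"Roughly {hours} h {minutes} min")
```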