Radio signals travel at a rate of 3 × 10⁸ meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of the Earth if the satellite is orbiting at a height of 3.6 × 10⁷ meters?

Answer:

[tex]\dfrac{3.6\cdot 10^{7}\,m}{3\cdot 10^{8}\,m/s}=0.12\,s[/tex]

It takes 0.12 seconds for the radio signal to reach the surface from the satellite.
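
As a quick check, the same division can be done in Python (a minimal sketch; the variable names are just illustrative):

speed = 3e8      # signal speed in m/s
height = 3.6e7   # satellite altitude in m
print(height / speed)  # t = d / v, prints 0.12 (seconds)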

Answer:

The correct answer is 0.12 seconds.

Step-by-step explanation:

To solve this problem, you can use the rule of three.

3 × 10⁸ meters → 1 sec.

3.6 × 10⁷ meters → x sec.

Now, you have to multiply 3.6 × 10⁷ meters by 1 sec, then divide by 3 × 10⁸ meters:

[tex]x = \frac{3.6\cdot 10^{7}\,m \cdot 1\,s}{3\cdot 10^{8}\,m}[/tex]

// m divided by m is 1; rewrite the numbers in full.

[tex]x = \frac{36000000\,s}{300000000}[/tex]

// cancel the zeros.

[tex]x = \frac{36\,s}{300} = 0.12\,s[/tex]
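
The rule of three is just a cross-multiplication, so it can also be verified in a couple of lines of Python (a sketch; the name x is only illustrative):

# 3e8 meters -> 1 sec, 3.6e7 meters -> x sec
x = (3.6e7 * 1) / 3e8
print(x)  # prints 0.12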