A man is standing on the edge of a 20.0 m high cliff. He throws a rock horizontally with an initial velocity of 10.0 m/s.
a. How long does it take to reach the ground?
b. How far does the rock land from the base of the cliff?

Answer:

a. t = 2.02 s

b. d = 20.2 m

Explanation:

Horizontal Motion

If an object is thrown horizontally from a height h with a speed v, it follows a curved (parabolic) path governed solely by gravity until it hits the ground.

The time the object takes to hit the ground can be calculated as follows:

[tex]\displaystyle t=\sqrt{\frac{2h}{g}}[/tex]

The time does not depend on the initial speed.
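
This formula comes from the vertical part of the motion: the rock starts with zero vertical velocity and falls a height h under constant acceleration g, so:

[tex]\displaystyle h=\frac{1}{2}gt^2\quad\Rightarrow\quad t=\sqrt{\frac{2h}{g}}[/tex]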

The range or maximum horizontal distance traveled by the object can be calculated by the equation:

[tex]\displaystyle d=v\cdot t[/tex]

The man standing on the edge of the h = 20.0 m cliff throws a rock with an initial horizontal speed of v = 10.0 m/s.

a.

The time taken by the rock to reach the ground is:

[tex]\displaystyle t=\sqrt{\frac{2\cdot 20}{9.8}}[/tex]

[tex]\displaystyle t=\sqrt{4.0816}[/tex]

t = 2.02 s

b.

The range is:

[tex]\displaystyle d=10\cdot 2.02[/tex]

d = 20.2 m
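
As a quick numerical check, here is a minimal Python sketch of the same two calculations (assuming g = 9.8 m/s², as used above; the variable names are just illustrative):

```python
import math

g = 9.8   # gravitational acceleration, m/s^2
h = 20.0  # cliff height, m
v = 10.0  # initial horizontal speed, m/s

# a. Fall time from h = (1/2) g t^2 with zero initial vertical velocity
t = math.sqrt(2 * h / g)

# b. Range: constant horizontal speed times the fall time
d = v * t

print(f"t = {t:.2f} s")  # prints t = 2.02 s
print(f"d = {d:.1f} m")  # prints d = 20.2 m
```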