Answer:

The United States became a colonial power at the end of the 19th century, after having spent the century expanding across the North American continent to the Pacific Ocean. The defeat of Spain in the Spanish-American War (1898) led to the establishment of American colonies in the Caribbean and in the Philippines. The Hawaiian Islands were annexed around the same time, in the 1890s.