Yes, the USA moved west. They believed it was their God-given right. The United States wanted western land, and they got what they wished for. They did so mainly through the Louisiana Purchase, the annexation of Texas, and the acquisition of California. They moved all the way from the Atlantic to the Pacific.