While the various European powers sought a “place in the sun” and established colonies around the world, the United States in the latter half of the 19th century expanded westward and tamed the interior of the North American continent.