The Western United States, commonly referred to as the American West or simply "the West," traditionally refers to the region comprising the westernmost states of the United States. Because the U.S. expanded westward after its founding, the meaning of the West has evolved over time. Prior to about 1800, the crest of the Appalachian Mountains was seen as the western frontier. Since then, the frontier has moved farther west, and the Mississippi River came to be referenced as the easternmost possible boundary of the West. The West mostly comprises arid to semi-arid plateaus and plains and forested mountains. In the 21st century, the states stretching from the Rocky Mountains and the Great Basin to the West Coast are generally considered to comprise the American West. (via Freebase)