American frontier
The American frontier, in United States history, was the advancing boundary that marked the edge of lands settled by Europeans.
History
In the Western United States, the period from the 1850s to the 1910s, spanning the second half of the 19th century and the early 20th century, is sometimes called the "Old West" or the "Wild West".
Some depictions exaggerated the anarchy and chaotic violence of the period for dramatic effect. These portrayals inspired the Western genre of film and fiction, which in turn has shaped depictions of the era in other media.
Modern reassessment
Older depictions of this kind are now often considered politically incorrect, particularly in their portrayal of Native Americans and Hispanics and in their treatment of the conflicts with Mexico.