The West

This documentary covers the history of the American West, from the Native American tribes through their encounters with Europeans to the Europeans' conquest and settlement of the land. In telling this story, the film takes into account the viewpoints of Indians and other minorities to balance the white population's history.



Status: Ended | Airs on PBS


  • Season 1