The West

  • Documentary

This documentary covers the history of the American West, from the Native American tribes through their encounters with Europeans to the conquest and settlement of their lands. In telling this story, the film takes into account the viewpoints of Indians and other minorities to balance the white population's history.
