Only if one depends on films for history instead of books.
No generalizations. Hollywood made us believe the Indians were the bad guys the entire time we were growing up.
Turns out, we were the bad guys. Nowadays, you'd think Hollyweird would be falling all over itself to set the record straight, just to promote its "blame America first" herd mindset.
Only this time, they'd be telling the truth.
We broke promise after promise to the Indians and slaughtered many of their tribes.
And here I thought you were the self-proclaimed history expert all this time.
This is basic information: Native Americans vs. the U.S. Government 101, if you will.
Your History posts remain amusing.