Wednesday, September 15, 2010

koh.01


The treatment of natives in the early 19th century really got me going. I'm still astonished at how they were looked down on as "savages" for leading a different way of life. Although, in retrospect, it makes sense; Americans still view the world as "us" against "them". Take a look at the "war on Islam". Book burnings? Really? What does burning the Koran do? It creates more animosity from Muslims towards America. Similarly, what did pushing the natives out of their land and attempting to 'assimilate' them into 'American culture' do? It created animosity towards Americans. I see a trend growing.

America was founded on 'white' principles. 'Land ownership' is extraordinarily 'white'. Because the white settlers 'worked' the land, they thought they had ownership over it, leading to the 'altercations' between the natives and the settlers. Just like present-day Europe, America is beginning to fear the de-whitening of the country (or of Western society as a whole). Marginalizing a minority is never a good maneuver (see the French riots of the 2000s), and it will only lead to more racial tension and violence.
