This is a tricky subject, since the most powerful nations on Earth gained their power by colonizing other parts of the world. So from the perspective of the colonizing nation, it was good.
From the perspective of the colonized area, people, or nation, it's a mixed bag, though generally not good, because the people were exploited to generate wealth for the colonizing nation.
The cultures of the colonized peoples were altered or lost under the strong influence of the colonizing nations.
In some cases, like the West Coast of the US, colonization by the Spanish benefited the indigenous peoples at first, and they welcomed the influx led by the Mission Movement.
Over time, however, the European immigrants overwhelmed the indigenous population, and the indigenous culture was subsumed into what is now the US culture of California.
answer #1 · answered by odu83 7 · 2006-08-30 05:22:52
Colonization is generally associated with imperialism. For example, India in the 1800s was under British rule for some time, and all of the major governmental decisions were made by the British. When power is held centrally by a homogeneous group, all other groups are subjugated as a general rule.
answer #2 · answered by Anonymous · 2006-08-30 12:23:03
It could be seen as a bad thing (usually by hippie liberal teachers) if you consider the suppression of native traditions and customs.
For example, Spanish friars and colonists burned whole libraries of Maya books because they thought they were pagan. Sadly, we'll never know what was in those books because of the actions of those particular colonists.
And American colonists did grow to treat the Indians pretty poorly.
answer #3 · answered by Lawn Jockey 4 · 2006-08-30 15:37:06