FIRST OF ALL, don't call me illegal or Mexican, because I'm neither. I'm an exchange student, and I'm learning about the history of the US. OK, white people are not native to North America, South America, or any part of the American continent. It was Britain, France, Spain, and Portugal who brought them here. Then they learned from American Indians how to live here, how to survive.
After killing most of the natives, they had the land all to themselves. Now, a couple of hundred years later, some say that "Mexicans" are invading their land, taking their jobs, and stealing their money. EVEN FROM ANCIENT TIMES PEOPLE HAVE SAID "DON'T DO TO OTHERS WHAT YOU WOULDN'T WANT DONE TO YOU" AND "WHAT GOES AROUND COMES AROUND". Look at history.
2007-03-20 18:26:59 · 23 answers · asked by aj010101010101010101