I'm just wondering, because lots of Canadians always say that Americans don't learn anything about Canada in school, and people here in Canada say Americans think we live in igloos. They say that in schools here we're better educated and that we learn about the USA.
I personally don't see why Americans need to learn about Canada and vice versa, but the people here ***** about it all the time. I'm in grade 10, in a town outside Toronto, and I still haven't learned anything about the US in any of my classes, so I don't get why everyone here keeps saying that. It's really annoying, because I've been to the US and I have relatives in Atlanta, and they're really smart and they own a business and stuff.
2006-11-20 15:46:27 · 23 answers · asked by Mark (level 1) in Other - United States