It's a terminology issue. Or think of it as a nickname.
Most people also don't realize that Brazil's official name used to be the United States of Brazil. They just call it Brazil.
As long as the abbreviation is not ambiguous within the conversation, there is no need to always use the full formal name.
2006-08-14 14:37:38
·
answer #1
·
answered by coragryph 7
·
2⤊
0⤋
Americans know that North America is a continent and that we live in the United States of America. We also know that there is a Central America and a South America. For some reason, though, we are the only ones on any of these continents called Americans. And when anyone else in any country hears "American," they think of us, and when they talk about us they call us Americans.
So is it irritating for other countries to find out that North America is a continent and not a country?
2006-08-14 21:42:16
·
answer #2
·
answered by mocha5isfree 4
·
0⤊
0⤋
The United States of America (the country) is on the North American continent.
2006-08-14 21:40:36
·
answer #3
·
answered by newt_peabody 5
·
1⤊
0⤋
Odd question: America IS a country (the United States OF America); the continent is North America, which is where the United States is located. So, I guess Americans truly aren't irritated at all.
2006-08-14 21:39:19
·
answer #4
·
answered by LV4RedSox 1
·
2⤊
0⤋
Not at all. Because when you say "Americans," are you including Mexicans and Canadians? Of course not. Just as when someone says "America," they are speaking of the United States. You can get all caught up in semantics, but when one says America, everyone knows what that means. Is it irritating to you to find out that the world revolves around "America"? And by America I mean the United States of America.
2006-08-14 21:42:07
·
answer #5
·
answered by Anonymous
·
0⤊
0⤋
Those of us who are smart (or paid attention in geography) already know this. People use "America" as a shortened form of the whole name of the country, which is the UNITED STATES OF AMERICA. By the way, the name of the continent is North America...there's a South America too! Wow...geography.
2006-08-14 21:41:52
·
answer #6
·
answered by Jenn 3
·
0⤊
0⤋
No, most of us know that NORTH America is the continent and the United States of America (USA) is the country. South Africa uses "Africa" in its name and is also on the continent of Africa. It's just semantics. You can't believe everything you see on the BBC or CNN...most Americans I know are pretty nice people.
2006-08-14 21:43:01
·
answer #7
·
answered by cutiemamaof3 2
·
0⤊
0⤋
The "continents" are North America and South America. My country is America.
We are the world's only superpower. We can call it whatever we want!
You don't have to like my country, but when the world is a safer place because of us, we do expect a "thank you."
2006-08-14 21:40:24
·
answer #8
·
answered by Nuke Lefties 4
·
1⤊
0⤋
No!
I cannot speak as a "we," but "I" know North America, South America, etc., etc.
I am an American from "The United States of America" and very proud of it!
2006-08-14 21:43:18
·
answer #9
·
answered by LN has3 zjc 4
·
1⤊
0⤋
No, you learn that in 2nd grade. It's a known fact that the United States of America is in North America, which also includes Mexico and Canada.
2006-08-14 21:38:34
·
answer #10
·
answered by Me 6
·
1⤊
0⤋