
42 answers

...interesting...very interesting. YES!!

2006-07-20 05:52:29 · answer #1 · answered by MORENITA 4 · 0 2

No. Without any intent to vilify the United States, money is its prime interest. Before WWII, the US was an isolationist country that did not care much about the rest of the world, let alone peace. During the World Wars it realised that armament was a lucrative business. Then it realised that rebuilding war-torn countries was a lucrative business as well. So the US decided afterwards to develop a democratic mission; rest assured, "democratic" meant a liberal economy. It turns out that, contrary to what many political scientists thought, democracy is not a prerequisite for a capitalist economy, which actually fares better under authoritarian governments. Over the years, the US has helped terrorist organisations and dictators take over countries that were on their way to democracy. These rulers were typically not interested in peace, killing thousands of civilians and therefore not contributing to world peace. Far more money is involved in making war than in making peace (especially when it's not on your territory), and until that changes, war will be more interesting.

2006-07-21 10:12:11 · answer #2 · answered by Prima Donna 2 · 0 0

I believe everyone would like to see the world at peace. Wars are costly, both in terms of human suffering and money. I think the problem in achieving it lies in the fact that no good deed goes unpunished. This may seem over-simplified, but I think it is very true. When we as people go out into the world and try to do good for someone, we end up hurting someone else. I think there is also too much history between all of us. So much evil has already taken place, and too many cultures thirst for revenge for these past acts. I don't think there is a clear definition of what exactly peace is. Some would say that so-called "evil" leaders like Hitler and Stalin actually created peace within their own nations. Missionaries thought that by going into the world and forcing Christianity onto others they were creating peace. George Bush probably believes in his heart that he is doing his part to spread peace. Osama bin Laden believes the same. Each of them is probably right, and the world is suffering for it. The problem we have in America is that we have it so good we think that everyone else in the world would be foolish not to want what we have, and we go to great lengths to try to spread the "American dream."

2006-07-08 08:03:42 · answer #3 · answered by Anonymous · 0 0

A mere technicality, but ... America (North America, anyway) is made up of Canada, the U.S., and Mexico. If you mean "is the U.S. interested in world peace," that's a whole other question.

However, the answer is clear: most citizens are interested in world peace. Politicians are not. As long as there is religion in the world, there will be wars.

It's that simple.

It has been that way since the beginning of recorded history.

It will be that way until our planet explodes, or implodes, or is overrun by aliens from other planets (not brown-skinned people from Mexico looking for work to keep their families from starving!).

Nope. My answer isn't brief. Can't just say yes or no.

But if I DID just say yes or no??? The answer would be NO. As long as there is ORGANIZED religion in the world, there can never be world peace.

'nuff said.

2006-07-22 05:50:46 · answer #4 · answered by dragonheart 2 · 0 0

To answer your question, one first has to define "world peace". This is a term that is very popular to throw about, but in truth it is not something that has ever actually existed. In all of recorded history, there are no examples of any significant period of worldwide peace. The sad truth is that conflict is inherent in human nature. Wherever you have people, you will find those who wish to dominate the weaker people around them.

That being said, I believe we are now in a period of time where it is fashionable to disguise your true motives under some noble-sounding rhetoric. America is not alone in this, nor are any parties currently involved in conflicts immune to it. All sides seek to assign some higher moral purpose to what basically boils down to advancing their own agendas and self-interests.

America chooses to "promote freedom and democracy," and our opponents commit atrocities, which they then seek to blame on our actions, while in fact they are only attempting to assert their own control and simply resent the fact that we are more successful at it. Both sides, of course, claim the moral high ground.

I personally would welcome a purge of all the rhetoric and claims to lofty goals, and simply have all sides say, "we are exerting our will because we have the means and it is in our best self-interest." While conflict won't cease, at least we could dispose of the propaganda surrounding it. However, propaganda is as old as conflict, so I doubt we will get either of our wishes, be it world peace or truthful motives.

2006-07-08 08:05:16 · answer #5 · answered by Paul F 1 · 0 0

To answer this question, one must simply point to the Iraq war. It has been thoroughly established that the reason for this war had nothing whatsoever to do with terrorism, WMDs, promoting world peace, or any of the other reasons given that the many sheep in this country continue to believe. The fact is, the US has done more to promote terrorism, destroy peace, topple peaceful democracies (e.g., Haiti, and attempts on Venezuela), oppress poor countries (which promotes terrorism), kill civilians of foreign countries, slander or imprison those who promote peace, destroy civil liberties here and abroad, and attack peaceful nations without provocation than any country since Nazi Germany. I'm talking about the leadership of this country, not the citizenry. The leadership desires PROFITS above all else, and destabilization and chaos offer the quickest and highest profits available (see Halliburton).

2006-07-22 05:28:04 · answer #6 · answered by corwynwulfhund 3 · 0 0

If you look at our history, the answer is no. Our country has been involved in an armed conflict somewhere in the world in nine of the past ten decades. Now, do the American people want peace? Yes. Our government and corporate leaders? No way. War = big money and power. Even though we say that we are "free," we really are not. I guarantee that we have more laws governing what we can do and how we can live than you do.

So what do you think of us now?

2006-07-21 13:04:30 · answer #7 · answered by jim w 3 · 0 0

The sad reality is that there will always be nations that do not respond to anything but restrictions, embargoes, intimidation, and isolation. Should these fail, war may become inevitable. War, however, is usually avoidable, especially when it is not strategically targeted and is waged against countries under false pretenses. Disputes can be resolved without carpet bombing and infantry mobilization. There is valid war, and then there is invalid war. And even throughout war, the intent to resolve conflict through civilized means should never evaporate. I believe that most if not all people in America prefer peace. I believe there is currently an insurmountable presence of rhetoric and misinformation that clouds the understanding of war, and that in itself can create support for war. But at the base of it all, it's hard to believe there would be an American voice acting as a constant proponent for war, as war is the end of being civil.

2006-07-08 08:05:48 · answer #8 · answered by ? 1 · 0 0

Not really. America is interested in furthering American interests, just like every other nation. Support? Listen to any politician from the U.S. other than Bush. He might actually want peace, but since he's incompetent (and kinda dumb), he has no idea how to achieve it.

On the Middle East, our government actually does want peace there. A stable Middle East means readily available cheap oil for everyone. And, trust me, no one loves cheap oil more than America! Well, I've gotta go fill my Hummer up with 300 dollars' worth of gas. Have a nice day.

2006-07-21 19:41:01 · answer #9 · answered by Anonymous · 0 0

It is not Americans but American politicians who are in favor of war. Likewise, other countries do not hate America; they hate the political system that tries to force policies down their throats. Does the average American care about world peace? The truth is probably that they don't care, leaving it to power-brokers in finance and the White House, and only moan when any hardship happens to their own family. Bush et al. do not go out of their way to stop problems, thinking mainly of profit motives, and the poor young men who get sucked into the 'patriotic spirit' are the victims.

2006-07-20 03:54:53 · answer #10 · answered by Frank 6 · 0 0

I don't think so... America, or for that matter most nations, is just interested in furthering its own interests, and that is how it should be, as nothing matters more, especially in diplomacy, than getting what you want without giving away much.

Having said that, I should also say that the US doesn't need to mouth platitudes, preach to the world, or take a moral high ground on issues like democracy and all that kind of crap. People around the world are not fools, and especially with the advent of the greatest tool in the history of mankind -- the Internet -- they have become more aware of what's going on, as well as of the historical actions of nations. And to tell you the truth, America is only interested in world peace as long as it doesn't harm its strategic or economic interests.

2006-07-08 07:53:43 · answer #11 · answered by Sh00nya 4 · 0 0
