
2007-08-01 11:26:51 · 25 answers · asked by Anonymous in Politics & Government Politics

25 answers

Lord I hope not, because the Democrats' idea of "regaining legitimacy" is appeasing our enemies.

2007-08-01 11:30:04 · answer #1 · answered by ks 5 · 5 7

I do not think so, because the US has been losing legitimacy for years, even during the Clinton years. I think the best way to regain legitimacy is to engage the world more actively in foreign affairs instead of appearing to make decisions unilaterally. I am not saying we should let the world make every decision for us, but we have failed to engage in a sensible debate with the rest of the world on a lot of issues. Our ideals are not as self-evident as Jefferson said, so we need to articulate them in an effective manner. Anyone who can do this will regain some global legitimacy.

2007-08-01 18:31:02 · answer #2 · answered by The Stylish One 7 · 1 4

Absufreakinglutely NOT. Democrats tend to look inward instead of balancing their focus between domestic and foreign policy. Also, if a Democrat gets elected, it will be through campaigning on "bring the troops home" or some such slogan. They will have to follow through after they are elected, and that will lead to troop withdrawal from Iraq. If we leave Iraq the way it is, whether you agree with us being there or not, we will look to the world like weak quitters.

2007-08-01 18:52:30 · answer #3 · answered by Joru 2 · 0 1

I'm not saying the latest Republican has been perfect by any stretch, but if a Democrat gets elected, the country will be too busy trying to save face to worry about legitimacy. Higher taxes, a health care system more screwed up than the one we have now, and a world view that we are willing to roll over and be defeated are what we have to look forward to with a Democrat in the White House.

2007-08-01 18:31:55 · answer #4 · answered by Anonymous · 4 5

I don't think the US will regain legitimacy until the American people start playing a part in government and voting... or better yet, getting involved in any way possible.

2007-08-01 18:34:00 · answer #5 · answered by afrothunder_229 2 · 4 1

That is what many are hoping, I think. However, I'm not too happy with the Dems in Congress not going along with the Republicans' idea of changing the mission in Iraq. I think they have a great idea there, and one that might just work to get the troops out of Iraq sooner. I'm very disappointed with that and with their views on immigration. Something is still not ringing true.

2007-08-01 18:36:06 · answer #6 · answered by BekindtoAnimals22 7 · 1 2

I don't like your choice of words, but when President Clinton was in office, the US dollar was at the top of the currency market and the national surplus was in excess of 2 trillion dollars. Since the reign of Bush, the yen has overtaken the dollar and we are operating on a deficit the likes of which this country has never seen. So will it be better if we elect a Democrat? Yes, but this country will always be a legitimate one and the greatest one on earth.

2007-08-01 18:33:43 · answer #7 · answered by jim_beam3001 3 · 4 1

Political officials have yet to change the legitimacy in the US.

The key word here is "political"

That in itself is an oxymoron.

2007-08-01 18:44:47 · answer #8 · answered by Anonymous · 0 1

" Legitimacy” hasn’t been a question in over 230 years… however, “Integrity” under the Bush Administration is.

2007-08-01 18:43:15 · answer #9 · answered by Anonymous · 2 1

In your opinion? No. It is the opinion of a far greater majority that our "legitimacy" is not in question.

2007-08-01 18:31:07 · answer #10 · answered by Anonymous · 3 4
