I know that a lot of Americans are against the Iraq war, and a lot of Americans think it was a mistake for the U.S. to invade Iraq in the first place. It seems like some people on the far left in America want to see the U.S. lose in Iraq so it will be easier for Democrats to win elections. I know that I don't want to see the United States lose another war. If the U.S. left Iraq with a stable government that was pro-Western and willing to help us fight Islamic terrorists, then I would consider that a victory. I just think it's wrong that there are some Americans who want to see their own country lose a war so they can gain politically.
2007-10-19 08:28:26 · 33 answers · asked by Anonymous in Politics