
12 answers

No. It has always been barbaric. This isn't something new.

2007-12-20 10:34:15 · answer #1 · answered by Anonymous · 0 0

America has no specific religion.
People in America practice whatever religion they are drawn to.

The only barbaric religion I see practiced is Islam, as it is practiced all over the world, except in America.

2007-12-20 11:02:14 · answer #2 · answered by tnfarmgirl 6 · 4 1

If you're really so desperate for attention, why not just walk out on the street and start hurling racial epithets...? Or is your keyboard your only source of courage...?

2007-12-20 11:05:18 · answer #3 · answered by u_bin_called 7 · 1 0

They believe in an invisible man in the sky, but they don't believe in science and global warming. Sounds insane to me.

2007-12-20 11:05:08 · answer #4 · answered by Anonymous · 0 1

Yes...and no. It's only the fundies that make the most noise. As a moderate Christian, I don't believe in shoving my faith down people's throats...and I certainly don't buy into the war on Christians.

2007-12-20 11:01:16 · answer #5 · answered by Anonymous · 2 1

Barbaric in what way? Your question is too general; be more specific.

2007-12-20 10:59:16 · answer #6 · answered by Ernie M 2 · 1 2

Really, we have a national religion? Wow, I must have slept through that one!

2007-12-20 11:00:06 · answer #7 · answered by Anonymous · 4 1

The U.S. does NOT have a "National Religion"!

2007-12-20 11:04:37 · answer #8 · answered by Guessses, A.R.T. 6 · 1 1

If that's true there are a hell of a lot of insane Mexicans...

2007-12-20 10:58:20 · answer #9 · answered by Jennifer H 4 · 0 1

Define "American religion" - that could be anything.

2007-12-20 11:02:47 · answer #10 · answered by Big Bear 7 · 2 1
