
2 answers

Speaking as a white woman in the Midwest, I would say that America is just a really straightforward country.

Maybe it's not so much "not getting" irony... as "not liking" irony.

America is a business culture. And in business, you want to be that straight-up kind of person that people can trust.

Ironic people are, ironically, considered "losers" because they aren't "on the team"... when really, the ironic people are way past the team.

America is a "can-do" culture that doesn't like losers, whining or excuses. Cut to the chase. Do what you have to do. Focus, work hard, succeed. Get to the American dream.

Yeah, the American dream is very un-ironic.

Although, of course, America does have some very ironic comedy like The Onion... etc. So it's not completely one way or the other in American culture. I just think the majority culture is quite earnest.

*just my opinion*

2007-03-01 11:53:04 · answer #1 · answered by lexi m 6 · 1 1

Irony cannot live in a fantasy world. Reality is not a place well known to a lot of Americans.

2007-03-01 19:39:49 · answer #2 · answered by Ashleigh 7 · 0 0
