For as long as I can remember, television and movies have shown people from the southern part of the United States in a very negative light. They are portrayed as beer-guzzling, wife-beating, slow-talking, inbred, and ill-educated. Being from the South (Texas), I find it very degrading and insulting to see these exaggerated images and characters. I mean, my parents are not kin, I have never dated one of my relatives, I have all my teeth, wear shoes, do not have 8 dirty kids running around my trailer, and don't know anyone by the name of Billy Bob or Emma Jean. Where did all of these myths about southern people originate?

2006-06-05 16:19:54 · 6 answers · asked by holyterrar85 4 in Society & Culture Cultures & Groups Other - Cultures & Groups

6 answers

I don't know where they originated, but I totally agree with you. However, being from Texas, you have it easy compared to me. Just saying you're from KY is like asking for a Deliverance joke. I think people take our accents and just assume that someone who talks like we do could not have a brain in their head, at least not one that functions. What's worse is that the type of people they are depicting us as can be found in every state across the US. We just happen to have a nice Southern drawl and an ability to insult someone with a smile so that they never know it happened.

2006-06-05 16:29:49 · answer #1 · answered by lucygoon 4 · 2 0

I think it has something to do with the fact that Southerners kept slaves for two hundred years. The fact that they fought against the civil rights movement in the 1960s didn't help much.

It really is a shame, because a lot of progress has been made. I've lived all of my life in the North except for two years in North Carolina and four years in California. I found that there were a lot more idiots in California than in North Carolina.

2006-06-05 17:38:14 · answer #2 · answered by Ranto 7 · 1 0

It's true that most places in the South are nothing like the myths seen on TV, but some places are. In some areas you still cannot date someone black if you are white; people keep guns to kill animals for sport, are very close-minded about "new" things and ideas, and do not see the point of a public education with different kinds of people. Some of the stereotypes come from real places, some from how the South was written about after the Civil War in Northern history books (whose authors were very upset), and some from simple lack of information.

2006-06-05 17:24:35 · answer #3 · answered by MindStorm 6 · 0 2

The stigma of the ignorant South stems from the time after the Civil War, when most Southerners lost everything and had to live in burned-out houses and former slave quarters. It seems that those who make movies are ignorant themselves; they portray us as ignorant, poor, and uneducated, but it sure doesn't stop them from buying homes here and moving here from the North in droves.

2006-06-05 17:16:52 · answer #4 · answered by Anonymous · 1 0

Because a bunch of "Damn Yankees" have written most of the scripts, and they still try to lord it over the South because they won the War of Northern Aggression! They still think that Southerners are a bunch of idiots. I wonder why so many of them go south in the winter?

Thank goodness for Mark Twain and William Faulkner, who wrote well of the South.

2006-06-05 16:44:45 · answer #5 · answered by pinelake302 6 · 2 0

Because the people who make the television shows and movies are all up north or out west, which means they think they know everything. Irks me too!

2006-06-05 16:30:02 · answer #6 · answered by tooyoung2bagrannybabe 7 · 2 0