
Did you ever feel that some movies are promoting bad ideas, or aiming to spread a certain thing? Do you think these movies are doing good or bad to the world? Why?

Different opinions are welcome.

2006-12-08 09:38:59 · 7 answers · asked by ahmed_mo2nis 4 in News & Events Media & Journalism

7 answers

Sure, Hollywood puts unrealistic expectations on people. Beauty is judged by their standard. Romance is judged by their standard. It also gives a false idea of what America looks like (The OC) and the false impression that life here isn't hard.

As far as promoting certain ideas goes, most of us are still able to think for ourselves and discern whether an 'idea' will affect us. If a movie maker wants to spread a "certain thing", so be it. We don't have to watch it.

Movies don't do bad to the world, people do bad to the world. If you see Thelma & Louise and you decide you want to kill yourself, that's not the movie's fault. If your kid watches MI 3 and tries to blow up the house, that's not Tom Cruise's fault. Why? Because we have personal responsibility.

But stupid people are stupid people, and movies won't change that.

2006-12-08 09:53:43 · answer #1 · answered by Anonymous · 0 0

I think both. Making movies is sort of an art form; most art is made by creative people, artists if you will. Artists are the people who have moved humanity forward throughout history, providing us with ideas that normal people don't have. Hollywood has often been ahead of the curve in terms of society's norms, making movies against racism before that was popular, for example. So that's positive, I think.
Hollywood's negative impact has to do with shallowness and lameness. Look at reality TV and how it makes people dumber. So, it's both!

2006-12-08 09:49:45 · answer #2 · answered by rustyreacharound 2 · 0 0

Unfortunately, Hollywood shapes public opinion. Movies have a huge impact on what we think, do and aspire to be. What most people fail to realize is that it's all fake. The stories are usually fiction, taking place in non-existent places with characters played by actors who usually can't walk and talk unless they're following a script!

2006-12-08 09:49:38 · answer #3 · answered by Anonymous · 1 0

Destructive. Hollywood basically portrays "pretty people". If directors and writers tried to put positive things into movies and TV shows, the result would most definitely be positive, but those times have changed. The world has become desensitized to the simple pleasures of life, and that pisses me off! There is nothing positive about Hollywood. Everything that raises awareness or anything like that usually comes from non-profit organizations.

2016-10-05 01:41:49 · answer #4 · answered by ? 4 · 0 0

Hollywood shows America in "heroes with guns" mode, where the good guys win because they're always far nastier and better armed than the bad guys. Your President does the same. Do you wonder why most of the rest of the world hates America?

2006-12-08 10:12:02 · answer #5 · answered by Old Cynic 3 · 0 0

Oh, do you think? Instead of spending time paying attention to what Britney and Paris are doing, think of all the things we could be doing instead.

2006-12-08 10:37:53 · answer #6 · answered by Jess 1 · 0 0

Their lifestyle and attitude leaves a lot to be desired.

2006-12-08 09:42:02 · answer #7 · answered by WC 7 · 1 0
