Why do aliens always want to come to Earth and control us?
Of course I'm not talking about "E.T." or "Close Encounters of the Third Kind", but about films such as "Independence Day" and "War of the Worlds".
Do we really have such a desperate need to feel superior to highly advanced extraterrestrial civilisations?
"Independence Day" even has a drunkard destroying an entire ship.
Isn't it time for a film where aliens see that we can't stop fighting with each other and teach us something? It's always films like "E.T." that stay in our minds. Only those kinds of films fill us with something other than a bloated ego.
"Independence Day" just has a lot of explosions and make humans feel like the masters of the universe. And the u.s. president fighting against them? *looks at picture of Bush* ...............
Why, for once, aren't we humans told that we still have a lot to learn and that we get many things wrong? Wouldn't that teach us something after we leave the movie theatre?
2007-03-11 06:04:48 · 5 answers · asked by Anonymous in Movies