I honestly don't know what to think. Some Americans don't care what's going on in politics; some people I know didn't even realize there was a war going on until we had a debate about it in one of my classes, and it seems like nobody CARES what laws get passed or anything! The people who try to CHANGE things get bad press and are often discriminated against! Is this just the way the world works, or is it only in America, "the land of the free and the home of the brave"?
asked by Drop of Golden Sun · 2007-01-02 09:03:50 · 5 answers · Politics & Government ➔ Politics