I'll open my brain for you... My answer is no. I don't think this country will truly move toward equal opportunity or equal treatment. Not anytime soon, at least.
The country is run by white males, those who own the businesses that influence the commerce that keeps the country rich. That will not topple easily. On paper, equal opportunity is the law. It is the way things should be. But not necessarily the way things are. Sadly so...
2006-10-01 07:18:11
·
answer #1
·
answered by elliott 4
·
4⤊
1⤋
It depends on one's view of power. The problem with American women (I would know; I live in America) is that they do not only want their rights, but they want men to give up what they have earned. It is true, America is known as the 'land of dreams', but those dreams must be worked for. A woman can't just stand in the street and yell, "I'm a liberated woman, give me riches!" Then the men will just yell back, "Then get a job!" lol
2006-10-01 14:40:03
·
answer #2
·
answered by woman_of_tomorrow 2
·
0⤊
3⤋
Overall, no, although we all know that is a debatable question. My reason is this:
The women who do have as much power as men are usually those who have financial and social backing.
2006-10-01 15:48:43
·
answer #3
·
answered by whydiduaskthis? 3
·
3⤊
0⤋
In terms of money and political power, no. Men still get paid more and elected more often. Women do have more power in personal relationships, though. Men have a more frequent need for sex than women do, so women use that to gain an advantage in personal relationships.
2006-10-01 14:08:51
·
answer #4
·
answered by martin h 6
·
4⤊
2⤋
No, land of the free, HUH! America pretends that it's the free world; it's not, as long as you have religion in your politics! Remember, women are always downcast in religion, so if it's that deeply rooted in the political system, how can you ever think that women will be on top?
Ireland, the UK, and most other European countries have had women presidents/prime ministers. Why? Think about it: no religion in their politics!
2006-10-01 14:21:37
·
answer #5
·
answered by Anonymous
·
3⤊
1⤋
The short answer, without giving any kind of long-winded explanation, is no.
2006-10-01 14:49:18
·
answer #6
·
answered by Claire 5
·
3⤊
0⤋
Power? In what way? The only power comes from Christ. He is the power. If you mean equal rights and equal opportunity, yes. In theory we have equal rights. As with all systems, it is not perfected.
2006-10-01 14:07:55
·
answer #7
·
answered by Shayna 6
·
4⤊
3⤋
No, women do not. Are they gaining ground? Yes. The problem is that we don't just see people; we see whatever we want to look at: sex, race, religion, whatever. This year in my state we have a female Governor, and I will be voting for her, not because she is a woman but because I like what she has done.
2006-10-01 14:16:48
·
answer #8
·
answered by redtitan2001 2
·
3⤊
3⤋
Women definitely have as much power as men in America. The only problem is they spend too much time whining about how much power they don't have instead of going out and actually using it. Sorry if that sounds chauvinistic.
2006-10-01 14:13:35
·
answer #9
·
answered by spacecowboytim 2
·
0⤊
6⤋
No, and we won't until we are at least equally represented in Congress and there has been a woman president.
2006-10-01 14:59:43
·
answer #10
·
answered by MUD 5
·
2⤊
1⤋