
4 answers

Yes, mostly men were sent over to colonize the New World. Women were sent many years later, but nowhere near as many as the men who had been sent over.

2006-08-13 12:30:40 · answer #1 · answered by Justin 3 · 0 0

Which colonial society? The one that became the United States?

Most societies of that era were dominated by men. Women were too busy having children, dying in childbirth, and worrying that their children would die from any number of fatal childhood diseases, plague, or malnutrition, and many also endured hardships like domestic abuse. Few of them (there are notable exceptions) ever had the time or energy to be involved in societal change.

2006-08-13 19:32:50 · answer #2 · answered by want it bad 5 · 0 0

In colonial society, only men could vote, and not even all men, because most places restricted the vote to white landowners. Women were not given that right for many, many years. So if by "dominated" you mean decision making, then yes.

2006-08-13 19:34:04 · answer #3 · answered by bamerson1 2 · 0 0

Yes, in a way you could say that. However, women did important things, too; it just isn't written about as much as what white men did (no offense meant). After all, history is written by the winners.

2006-08-13 19:52:21 · answer #4 · answered by Anonymous · 0 0
