
2 answers

The wars helped to bring about important changes in the British colonies. Beyond their ocean-wide distance from the mother country, the colonies felt less dependent militarily on the British by the end of the wars; they became more concerned with their own problems and placed greater value on their own institutions. In other words, they began to think of themselves as American rather than British. Quoted from Yahoo.

2006-11-20 09:39:43 · answer #1 · answered by Joe Schmo from Kokomo 6 · 0 0

LAND

2006-11-20 17:39:13 · answer #2 · answered by sam w 1 · 0 0
