
7 answers

If you mean the Reconstruction period following the Civil War, then the answer is yes, but it has been a very slow process. We are not finished even today. There are still some very backward-thinking people who see others as less than themselves due only to color. It did, however, put black Americans on the map as being free and "equal," although true equality has come in phases, and is not quite here yet. A lot of the reconstruction of the South was simply to punish it, and to recover the costs of the war. The South was reduced to a poor region, and much of it still remains that way today.

2006-09-07 15:49:41 · answer #1 · answered by Jamie 5 · 2 0

Actually, Reconstruction widened the gulf between the North and the South after the war. The policies implemented were in part responsible for the rise of the KKK and a few other "white" organizations. White Southern planters were concerned that blacks would wield too much power over them and they'd lose everything, and the carpetbaggers and Republicans did nothing to ease their concerns. I've heard the term "the second civil war" applied to Reconstruction.

2006-09-07 15:52:16 · answer #2 · answered by cinquefoil_solis 3 · 0 0

Eventually, but it took an awful long time. I live in Arlington, Virginia, where Robert E. Lee lived before the war. After the South lost the war, law and order almost completely broke down here and it stayed that way for about 40 years. There was a very small population at the time, but they managed to support an amazing number of bars and casinos. There was a special place for dueling, and the place where the Pentagon is now (It's not in Washington.) was called Hell's Bottom. It was not unusual to find a body hanging from a tree there. The place has changed plenty. Now it's a wealthy suburb of Washington, DC.

2006-09-07 16:02:51 · answer #3 · answered by Anonymous · 0 0

Economically, yes. But right after the war, the Southern states still hated the North, and in some ways they still do. I grew up in the North and I married a Southerner. His family was very upset that he was marrying a Yankee; I hadn't heard that term outside of a history book. In some parts of the South, people are still angry about losing the war... even though they profess to be solid Americans.

2006-09-07 16:00:38 · answer #4 · answered by Ann Toozie 6 · 0 0

Reconstruction was not a success in the U.S. after the war. It was ten years of failed programs and policies which held back the real recovery of this country a good hundred years. Prosperity happened because of the massive size of our country and because all of us are human beings with the same desire to succeed. Our resources are great, and we have the ability to make it so.

2016-12-12 04:32:58 · answer #5 · answered by Anonymous · 0 0

Absolutely not. The North and the South are still divided... call it stereotypical at this point. The North tries to prove how much better it is than the South and vice versa, although in reality the South seems to still be fighting the Civil War!

2006-09-07 15:52:42 · answer #6 · answered by Ally 2 · 0 0

Just look at Atlanta, GA today... a brand-new, clean city. Yes, I think so.

2006-09-07 15:49:40 · answer #7 · answered by navigator j 2 · 0 0
