Not really.
After the war, many Northern politicians wanted to punish the South for the war's damage and for seceding in the first place.
This was despite Lincoln's original plan to reunite the nation after the war.
The South never made a full economic recovery until the Second World War, or perhaps the civil rights era.
2007-10-20 09:12:16 · answer #1 · answered by Mark F
The North only bled everything it could out of the South. Remember, as part of ending the war, Lincoln ordered Sherman to destroy the culture of the South on his march through Georgia. Civilians, including women, children, and the elderly, were targeted.
2007-10-20 12:23:27 · answer #2 · answered by Randy
About as much as we are helping Iraq back to prosperity. The carpetbaggers made all the money back then; today they are called Halliburton, et al. Oh, your question! False.
2007-10-20 09:03:39 · answer #3 · answered by postal p
False.
The only thing the North ever did to the South was STEAL its land after the war...
2007-10-20 09:00:39 · answer #4 · answered by graciouswolfe
Well, they didn't pay Confederate debts... they did send the Freedmen's Bureau... maybe that can be said to have helped in some respects.
2007-10-20 09:06:17 · answer #5 · answered by ?