The War Between the States, better known as the Civil War, was a period when the southern states sought to secede from the United States. After the war came to an end, America as we know it began to change. I strongly believe that the actions taken after the Civil War were primarily beneficial for American society. These changes led the United States to become truly "united" once and for all. From approximately 1865 to 1910, changes in American society such as Reconstruction, immigration…