From the aftermath of the Civil War through the Spanish-American War and World War I, the United States held an increasingly prominent place in international politics between 1865 and 1920. This essay gives an overview of those years and some of the roles the United States played on the international political stage.
AFTER THE CIVIL WAR.
Tremendous political, economic, social, and legal changes occurred between 1865 and 1877. These changes, including Reconstruction, began around the time of the Civil War and only escalated after it had ended. Indeed, quite a few conflicts arose from the war, and others were settled by it, but some of the most consequential concerned how Black Americans were treated by whites.
As the war raged on, Black cotton farmers looked forward to a Northern victory, which would ultimately give them their freedom; if the South were to win, those in the Confederate states would remain imprisoned in slavery. White landowners did not want to part with the wealth that Black farmers produced for them, which is why they fought so hard to keep slavery in place. After the North's victory, Black cotton farmers finally received the freedom they had so desperately wanted.
When the Reconstruction period arrived, it appeared that Black Americans would gain even more rights alongside the emancipation that had already taken place; in practice, however, it proved to be a time "of much disappointment" (Kirkendale 2002, PG) for many. Laws were in place to protect Black citizens from the injustices they had faced, but those laws were often ignored by whites. These provisions were meant to give Black Americans the ability to participate on the same terms as others, free of the restrictions of slavery.