World War II was the last major war America fought, and it had many aftereffects on our country. We entered the war after Pearl Harbor, on December 8, 1941, and ended it when Japan surrendered on August 14, 1945. The time period may have been short, but our country was changed forever. Before the war we were struggling to climb out of a depression that had lasted about ten years; afterward we were an economically strong country facing many new opportunities and struggles. We were now a major world power, and though we may have been one before WWII, we had finally acknowledged it. We were economically and technologically advanced, and we had achieved this as a free country. Yes, the war was successful for our country, and that is why it is called "The Good War." But did it bring us together as a nation, or did it divide us?
I believe the war ultimately brought us together, but it also opened many old wounds wider. My main evidence that it was not entirely good is racism. In a war, one would hope that race would be set aside during such turbulent times for our country. We were fighting to "save the world" and "save democracy," but racism still ran rampant. Jim Crow laws were still in effect at this time, and segregation was everywhere. These laws mandated that blacks have separate facilities for travel, lodging, eating and drinking, schooling, worship, housing, and other aspects of social and economic life. This segregation carried into the armed forces. In 1941 the Army and Army Air Forces refused to train black officers and pilots, and the Navy would employ blacks only as kitchen staff. No African American women were employed at all. In March 1942, the first black cadets finally received their wings, and the Tuskegee Airmen saw their first combat that same year. In 1944 two Navy ships were launched with predominantly black crews. Still, the armed forces remained segregated throughout the war: blacks and whites could not fight alongside each other.