Nineteen forty-five marked the end of World War II. Hitler had been decisively defeated. Nonetheless, the war left its mark on numerous countries, including the United States. Not all of the changes were bad; some were good, but all of them reshaped America socially, economically, and politically. American society was deeply affected by the war, particularly in race relations and in the roles of women and men. WWII set off the migration of a vast number of Americans, as workers relocated to the industrial centers where they produced arms and other military equipment. This abrupt change created serious social dilemmas: divorce rates skyrocketed, and schools struggled to handle the massive influx of children.
During the war, the role of women in society changed. Working women became far more common in wartime. Jobs typically associated with men were soon taken up by women, and some women even served in the military. This shift marked a new attitude toward working women. The war affected not only whites but also Black Americans. At the time, Black servicemen were still segregated from whites within the armed forces. They were assigned mostly to service and construction tasks, and racism and segregation remained pervasive. Because they were a minority, Black Americans had to push much harder for their rights than whites did. It was bitterly ironic that they went overseas to defeat the Nazi dictatorship, only to return to a country where they were not treated as equal citizens. Toward the end of the war, conditions improved slightly for the Black community, but the gains were not significant. WWII thus served as a reminder to Black Americans of the inequality that remained ubiquitous throughout America.