
Racism

 

Throughout its history, America has been considered a fairly racist nation. Undoubtedly the greatest injustice in the United States to this day is white Americans' treatment of African-Americans, specifically slavery. The vast majority of non-black people of that era, white Americans of the slavery period in particular, believed that black people were not equal to other races. It was nearly impossible for a black person to live free in America, and it was even more difficult for one to find a job. As time passed, however, many people began to change their views on race relations in America. After slavery was abolished, fewer and fewer people believed themselves superior to African-Americans. Not only were black Americans free, but they were also becoming accepted as people in society, and even in the workplace. Many employers were no longer bothered by giving a job to an African-American. America seemed to finally be turning around for the better. After all, African-Americans asked only for equality, and they were getting closer and closer to that goal with each passing day.

Suddenly, however, some people began to lose sight of the mission they had set out to accomplish. Instead of simply trying to make America equal, they felt that they should attempt to make up for the times when it was not. Many areas of society, such as the workplace, the court system, and the entertainment industry, seem to have shifted toward being easier for black Americans to advance in than for whites. The intent, to improve race relations in America, was good. The problem that arises, however, is that it begins to enter people's minds that it is better to be black than to be white. African-Americans should certainly feel confident in themselves, but they should not be given a reason to feel superior, either. Nobody should be able to feel superior to another person simply because of race.

