Only an idiot would not know the valid explanation for this...
Why is it called racism when White Americans talk about Black Americans, and not racism when Black Americans talk about White Americans?
Our society dictates living with two sides to life: there's a thing called public and there's a thing called private, and why that is, only you know. But what is done in the dark will soon come to light, like fire under dry leaves.
Because Black America has never done anything wrong to White America, but the same cannot be said the other way around. White Americans have publicly and privately denounced Black Americans, when all Black Americans have ever done for White Americans is help them, and that is what so many of us were brought here to do. In return for this noble act, White Americans have only taken advantage of Black Americans, enslaving them and doing what they could to control Black Americans and to destroy Black men.
White Americans still expect Black Americans to do everything for free: Head, Heart, Hands, and Health. And then there are those White Americans who stand on the White sidelines while all these atrocities are being committed in the name of justice; and then they want to kill people like the Bin Ladens.
Even today, White Americans expect Black Americans to be hired at much cheaper rates than they would pay themselves. In America, there's a White man's price and a Black man's price on everything. It actually costs Black Americans considerably more to live in a land that, historically, was built on Black American slave labor. How is this so, and how should Black Americans, or anybody else who served as they served in America, be treated? Where is the dignity and respect?
Historically, White Americans have ignored the feelings of Black Americans just because of the Trans-Atlantic Slave Trade.
Historically, White Americans have treated themselves as superior, especially to Black Americans. By treating themselves as superior, they have treated Black Americans as inferior.
Historically, this has been the trademark of Whites in America: their treatment of any people who helped them, and in this case, my proof is Native Americans. Historically, all Black Americans ever heard from White Americans were words and deeds of inferiority. And even though White Americans committed such terrorizing acts against Black Americans, Black Americans were expected to be muzzled, never to fight back or win. It was against the law to help Black Americans in America, and in particular the Black man, no matter how inhumane their treatment may have been.
Today, Black Americans as a whole, and in particular more Black American men, are standing up, fighting back, telling you all about your "bullshit," and you have the audacity to question that. We're being recognized as the men we are, rather than as the men you think we ought to be.
All White America has ever done for Black America is "bullshit," and you think I ought to be "thankful" for that or choose to live elsewhere. That's just how lowdown and treacherous White Americans are, and they need to know we know them better than they know themselves. If I am to believe them, why aren't they to believe me?
This is a challenge not just to White America but to the world. If you help save one person, one human being, then you have helped save the world. The truth is like hiding fire under dry leaves.
Do you still believe White Americans are superior to Black Americans? Or do you think all of this is just in the mind?
(((your inner voice.com)))