You know, I was reading some articles about how Black people and so-called Black leaders are not supporting Obama. I don't really support him either, but Blacks in general are overwhelmingly Democrats. I have a sneaking suspicion that the problem is this: if a Black president were elected, it would show how far this country has actually come. That would negate the victim status that Black people often use to excuse the myriad problems that plague us, and it would put the leaders who constantly foster that victim mentality out of a job.
Discuss.