Will ‘Woke-ism’ Be the Death of Hollywood?

Some people believe inclusivity will be the death of Hollywood, and fingers are pointing at white males who feel pushed out as women and people of color are sought after to fill powerful positions.
