This is the BIG reason why corporate America has gone woke (plus 4 more reasons)
A gender-neutral bathroom is seen at the University of California, Irvine © Reuters / Lucy Nichol

Why have the biggest and most profitable American corporations embraced leftist politics, as seen in their woke advertising and social-justice activism? Hint: It’s not because they’ve become non-profits and taken up philanthropy.
summary via R3publicans: