Sometimes Hollywood is Right, not Left

It is a truth universally acknowledged that Hollywood and the entire U.S. entertainment industry teem with people who are described in the U.S. political vernacular as “liberals.” But it’s not a truth. It’s simply a lazy assumption. Many people in

