Political movies are not a hit with the public
Thursday, November 1, 2007
Hollywood - the land of loonies and liberals - has always leaned left in the world of politics. Its ultra-rich elite delight in producing entertainment that bashes the other ultra-rich elite, the ones whose political positions sit on the opposite side of the spectrum.