Nov 28, 2021
Hollywood has a liberal reputation, but I often wonder if that's just a facade. Movies and TV series keep perpetuating some of the most conservative beliefs in our society: poor or working-class characters pulling themselves up by their bootstraps and getting rich, gay and transgender characters turning out to be the villains, childfree female characters changing their minds and finding real meaning in life only through kids.