Quote:
Originally Posted by MaDalton
I think the US must be the only place where being "liberal" (which in my understanding means something like "live and let live") is considered a bad thing, whereas the US was founded especially on those values - or did I get that wrong? Please, someone enlighten my stupid Euro ass. Thanks.
Yeah, this is the only place I know of where liberal is used as a pejorative.
However, recent elections have shown that the majority of the country doesn't have a problem with "liberals". The right wing had just managed to delude itself into thinking that the majority of the country was "conservative" because it had won a few elections. Turns out that wasn't the case.