I think the US must be the only place where being "liberal" (which to my understanding means something like "live and let live") is considered a bad thing. And yet the US was founded especially on those values, or did I get that wrong? Please, someone enlighten my stupid Euro ass. Thanks.