Has anyone else noticed the change in America?
It used to be said that the liberals hated America.
Anyone else notice that it is now the right that hates America? Talk of secession, refusal to contribute their fair share to the Treasury, and now opposition to "socialism" (even though every government service is, by definition, socialism).
The right always cites the Constitution, which starts with three simple words, "We the People." If they had their way, "the people" would include only registered Republicans. As it is, they just pick out the bits and pieces that support their arguments.
I could go on, but for the people who most need to hear it, the message will fall on deaf ears.
Love your country or get the hell out. United we stand, divided we fall. You sure were all about chanting that when we invaded Iraq...
