So, for those of you who think it's been as bad or worse before: do you think it's okay to just continue on this way forevermore?
I'm not trying to offend you, just trying to understand the reason for bringing it up. I understand that you think America isn't really doomed if it's been this way for so long, but do you think it should remain this way or change? Would that change be better or worse for America?
PS. I don't mean "change" as in Obama's campaign drivel; I mean change as in having a country that's more united than divided.