The only way to fix this country's healthcare problems WOULD be a government takeover! The health insurance companies should NOT be between you and your care. Why should they be in the middle? Why should they profit by getting in our way?
Those of you who think we do not need a new healthcare system are just fucking stupid!
BTW, here's how the insurance companies that some of you want to fight for are planning to handle mandatory coverage for your children...
Major health insurance companies in California and other states have decided to stop selling policies for children rather than comply with a new federal healthcare law that bars them from rejecting youngsters with preexisting medical conditions.
Source:
http://www.latimes.com/health/la-fi-...0,799167.story