Under the Healthy Americans Act, you're in charge of your health care, not your employer. If you lose your job, change jobs, or just can't find a job, your health insurance is guaranteed to stay with you.
