
5 answers

Yes, they can even stop all health insurance benefits if they want to.

2007-06-17 09:15:54 · answer #1 · answered by Anonymous · 0 0

Unless you have a contract with your employer to provide specific benefits, they can end or change your benefits with little or no notice. Not a good business practice, but legal. (Ethics and law do not always play nice together.)
However, they need to treat you the same as all other employees. They cannot change your benefits unless they change them for all "like" employees. Salaried vs. non-salaried, managers vs. employees - these distinctions may be enough to allow differences between the two groups.

If your employer is playing games, sharpen your resume and hit the road. They may have financial issues (and this is one of the first signs), or they could just be &*(^%#@!'s. Either way, look out for yourself and move on if they are playing games.

Good Luck!

2007-06-17 23:54:41 · answer #2 · answered by JJ 5 · 0 0

Yes. Employers aren't required by law to provide you with ANY insurance benefits, except Workers' Compensation.

If you don't like it, look for another job - that's the only reason employers OFFER health insurance: to attract better workers.

2007-06-17 18:40:52 · answer #3 · answered by Anonymous 7 · 1 1

It happens all the time, usually at yearly renewal. They will either charge you more or offer fewer benefits - a higher deductible, more copay, things like that.

2007-06-17 14:44:50 · answer #4 · answered by swenjj 4 · 1 0

Of course they can. Be glad you still have something.

If you're really miffed about it, go find another job.

2007-06-18 09:23:35 · answer #5 · answered by aaron p 5 · 0 0
