
2 answers

no

do you shop at Wal-Mart?
know anyone who does?

why do they shop there?

low prices, right?

if Hillary or someone uses the government to force Wal-Mart to provide 'health insurance', won't Wal-Mart raise their prices? Of course they will.

And so will every other company that doesn't already provide health insurance to every employee.


Can you hear the jobs fleeing the United States for countries where health insurance isn't required if you do this?

Yup -- as many jobs as possible would leave.

And who would be most likely to lose their job to foreigners -- rich people or poor people?

ah -- poor people. Of course.


And poor people do not need any more competition for their jobs from Mexico or China -- they already have enough.

So, I conclude that companies should NOT be forced to provide health insurance to their employees.

They'll do it anyway if and when it makes sense in terms of hiring and keeping the good employees they need here in America.

2007-05-31 10:02:04 · answer #1 · answered by Spock (rhp) 7 · 0 0

No. Employing people in the US is expensive enough as it is. If you require corporations to provide health insurance, you'll drive more jobs overseas.

MOST people with skills have no problems getting jobs with employers that provide health insurance. It's UNSKILLED labor that has a problem.

The OBVIOUS answer here is to improve your skills and make yourself more marketable. As a bonus, not only will you get health benefits, but you'll make more money.

2007-05-31 11:30:27 · answer #2 · answered by Anonymous 7 · 0 0
