
This is for an essay I have to write to get a scholarship... please help.

2007-05-12 03:18:27 · 9 answers · asked by Anonymous in Business & Finance Insurance

9 answers

NO. The less governmental involvement there is in private businesses, the better off everyone is as a whole. The more it costs to employ one person, either the higher the price of goods goes, or the fewer the employees. Employers are NOT an unlimited source of funds.

The free market economy of the US is what has made this country the wealthiest - not just the rich, but the middle class and the poor. This country has the "richest" poor people IN THE WORLD. Whenever the government highly regulates and controls private businesses (which, by the way, is exactly what FASCISM is), it makes the businesses go broke, which means . . . NO jobs, and no benefits.

Free market economy works best. Let it work.

2007-05-12 05:20:34 · answer #1 · answered by Anonymous 7 · 0 0

Since this is an essay to help justify your scholarship, it would be wise for you to provide a balanced perspective on whether corporations should be required to provide health insurance to full-time employees. Present arguments for and against the topic, and write a paper that is slightly, but not too obviously, biased towards the position you favor.

Corporations, in fact all business owners, would be wise to provide health insurance to their full-time employees! Doing so portrays the business as one that cares for its employees and makes their medical expenses a priority. Furthermore, it may even reduce business costs through reduced medical leave. It also shows employees that the company is looking after their health concerns, which serves as an enticement for employee loyalty.

Of course, the flip side is the cost involved and the potential for abuse of the system by employees (e.g. getting medication for family members through the company doctors).

There are, of course, many other factors. You should cite a few examples for each factor to support your point.

Good luck!

2007-05-12 03:36:24 · answer #2 · answered by Kayceez 1 · 0 0

My opinion is no. I do not think employers should be "required" to provide benefits. However, most employers that want to attract and retain better employees will provide these benefits to keep those good employees from leaving. I do know that some employers simply cannot afford to provide health insurance, even if they only pay half of the employees' premiums. If they were required to do so, they could not stay in business.

2007-05-13 03:56:53 · answer #3 · answered by nurse ratchet 6 · 0 0

No. It would only feed into the 'entitlement' mindset that many people are afflicted with today.

Most people are raised being told to 'get good grades, so you can grow up and get a secure job with benefits'. So they live their lives expecting some company or government job to take care of them while they trade 40+ years of service, hoping that at the end of it their retirement and benefits will support them. Of course, baby boomers are learning that isn't the case as more and more of them retire.

Bottom line. Your employer wants as much work out of you as it can get for the least amount of pay. That's business, period.

It's an individual's responsibility to plan out their own financial future, not anyone else's.

Of course, those subscribing to the 'entitlement' line of thought will not agree with this.

2007-05-12 03:25:36 · answer #4 · answered by kb 2 · 0 0

We must be doing the same essay, because I have to write an essay on this as well. I think I am going with the affirmative approach. I really think that employees, if loyal and deserving, should be able to receive health insurance, especially if the corporation wants to show loyalty and love towards its employees.

2007-05-14 15:08:41 · answer #5 · answered by karmelcoloredprincess47 1 · 0 0

All these folks saying NO already got theirs and do not care about you or anyone else. Fact is, health care is a matter of international law. Basic health care and equal access to health care is a right we should all be entitled to, but are not.

Most of these people are probably union school teachers, etc. "I got mine, go screw yourself" is their attitude. And they have the gall to speak of the "entitlement" attitude in America today.

Try going 20 years with health care, then lose your job. You own a home and a retirement from working hard all your life. At age 48, your company closes down and you lose your medical coverage. You have a few assets, so you qualify for no aid. You get cancer and your entire life's work is wiped out overnight. This exact scenario happens to 400 Americans each and every day!

2007-05-12 11:38:20 · answer #6 · answered by Anonymous · 0 2

No. Employees are not entitled to benefits - they just sweeten the pot so you want to work there. Besides, if all employers offered the same benefits, nobody would follow their dreams in search of something better.

2007-05-12 10:27:05 · answer #7 · answered by zippythejessi 7 · 0 0

In the U.S.? They could be required to soon in CA - there are three pending pieces of legislation that have to do with providing minimum coverage for all workers. And depending on who our next President is, universal health care could be a far higher priority than it is under our current President.

2016-10-15 11:19:47 · answer #8 · answered by doloris 4 · 0 0

I think No

2007-05-12 03:26:48 · answer #9 · answered by Anonymous · 0 1
