
6 answers

Yes it is.

I work in healthcare in the UK and am amazed at healthcare in the USA. Almost every other western country has better health outcomes than the USA; look at the figures for life expectancy and infant mortality. How is that right for the richest and most powerful country in the world?

Do not get me wrong, the healthcare system we have in the UK is not perfect, but it is better than that in the US!

2007-12-13 01:25:19 · answer #1 · answered by The Patriot 7 · 0 0

Yes it is a business unless it is a non-profit institution.
And yes, it is a disturbing thought that Americans care more about profit than they do about other Americans.

2007-12-10 23:18:37 · answer #2 · answered by Anonymous · 1 1

Yes, if it is a business it becomes all about the bottom line and people's health care goes right out the window. We are seeing it now.

2007-12-10 23:18:15 · answer #3 · answered by slykitty62 7 · 1 1

I don't mind healthcare being a business.

But health insurance being a business bothers me.

2007-12-10 23:13:45 · answer #4 · answered by Steve 6 · 0 0

Yes, even more disturbing is that the democrat party wants the government to run it with our tax dollars!

2007-12-10 23:15:16 · answer #5 · answered by Dan K 5 · 1 4

No more than auto insurance being a business. Why? What's the problem?

2007-12-11 00:56:03 · answer #6 · answered by Barry auh2o 7 · 0 2
