
There are currently two main domestic automakers owned and operated in the US. Not too long ago there were three; however, Chrysler was bought by Daimler. With recent setbacks to the "big two," Toyota and Honda have been making a play to become the top automakers in the US. Do you think this is a long-term negative for the US? The United States dominated the auto industry, and many other industries, for decades. Is that dominance being lost? What if the remaining US automakers go bankrupt and the US is left with Toyota, Honda, and Daimler? Is America stronger when its citizens work in factories for foreign companies? Or is America stronger when Americans work for American owned and operated companies?

2006-07-02 08:04:20 · 3 answers · asked by Anonymous in Politics & Government Politics

3 answers

There is no changing the direction of the auto industry. As far as I know, there haven't been any fully American-made cars in the USA for years; the parts are made in various countries around the world. The only hope for the auto workers is retooling the factories for the different needs of the country. This is what globalization, NAFTA, and CAFTA have done to the ole USA: good for the corporations, bad for our skilled workers who lose their jobs. Greed is what life has come down to.

2006-07-02 08:16:47 · answer #1 · answered by Anonymous · 1 1

I think it's gonna suck, but the writing's been on the wall for a while now. You can't be a real estate company AND a car company AND a bank, etc...

2006-07-02 08:08:09 · answer #2 · answered by gokart121 6 · 0 0

Doom and gloom.

2006-07-02 08:13:16 · answer #3 · answered by Balthor 5 · 0 0
