
When did it become the government's job to advance Capitalism?

What happened to the "duty" to the people?

2007-06-09 05:38:44 · 14 answers · asked by Anonymous in Politics & Government Politics

14 answers

The Federal government's responsibility to the people is to only provide an atmosphere in which people have the opportunity to succeed.

2007-06-09 05:42:44 · answer #1 · answered by Brian 7 · 5 1

That was THE essential fight when the Constitution was being hammered out over the first 27 years of the USA. It was Hamilton and the New York bankers vs. Jefferson and the people (farmers, shopkeepers... you know, US). The bankers won the first round: the Constitution offers NO personal protection from the government, banks, or corporations. That is why Jefferson's election of 1800 was called "the second revolution"... with a majority in Congress he was able to champion the Bill of Rights, those 10 amendments to the Constitution that actually give you your rights and protection from the government.

The bankers went to work immediately to void each of the 10 amendments, and they took a big step in 1913 with the Federal Reserve Act and the 16th Amendment, which created the income tax. The Federal Reserve Act put America's money system back into the hands of the New York/English bankers. The income tax amendment gives the government absolute power over anyone trying to make a living (look up Ed Brown).

The bankers' plan was complete and running the show by 1961, when President Eisenhower warned us all, in his farewell speech, that this hobgoblin he called the "military-industrial complex" was about to swallow America whole.

They are on their last bite and about to swallow... and the people they have sheepleized with their corporate media and "education" system don't even care.

2007-06-09 13:01:06 · answer #2 · answered by Perry L 5 · 1 0

This has been a point of contention almost since the creation of the U.S.
It wasn't until FDR's New Deal programs to help combat the Depression that people really started to see that the market wasn't an organic or natural entity - and that people were what made the market fluctuate.
The government, even at the times when it has appeared to be helping "the little guy," has always been interested in keeping big business going, because big businesses often donate a lot of campaign money and can put pressure on other countries, etc...
Typically, a more liberal, leftist, or democratic government will try to take care of people more than a right-wing Republican one - this is a gross generalization, though...

2007-06-09 12:44:18 · answer #3 · answered by FIGJAM 6 · 2 0

The dream of America has mutated over the years. Big business, though, has been at the top of Washington insiders' priorities for some time now. I would say that when Rockefeller and Carnegie reached power, that is when big business began to be more important. It is called the New World Order, and Bush II is very much a fan of it.

2007-06-09 12:50:53 · answer #4 · answered by WVU girl 2 · 1 0

Officially ?

January 20, 1981

2007-06-09 16:50:14 · answer #5 · answered by Peace Warrior 4 · 0 0

With Newt Gingrich and his contract on America. Money, so it's said, is the root of all evil, yet there is one party in America that strives to collect as much as possible above all moral and ethical means. Our President is now trying his best to create a North American Union that would further advance this cause against the majority of Americans. We seem to have lost the ability to care about those Americans less fortunate in our society in the never-ending race to collect Ben Franklins. Just as an aside, I find it rather funny that we print "In God We Trust" on money. Maybe it helps to ease the slimy business practices of the more unethical among us?

2007-06-09 12:53:18 · answer #6 · answered by Anonymous · 0 1

Business has always been important to our government, but I think the turning point was when corporations were declared to have the rights of citizens, which happened in a court case in, I think, the late 1800's.

2007-06-09 12:44:59 · answer #7 · answered by ash 7 · 0 1

Corporations are publicly owned businesses. They are owned by the shareholders.

Capitalism is the basis of freedom. Only by allowing people to own their own lives can we truly live in liberty. The flip side is that there are no guarantees.

There is a duty for everyone to care for the disabled or orphans but with freedom comes both opportunity and responsibility.

We all seem to be able to list what rights we possess, but when you ask what responsibilities accompany those rights, most seem to be strangely silent.


2007-06-09 12:56:18 · answer #8 · answered by Jacob W 7 · 0 2

Late 1940's or early 1950's, when we started the Cold War against communism. Even Lucille Ball (I Love Lucy) was accused of being a communist.

2007-06-09 12:43:55 · answer #9 · answered by awake 4 · 0 0

It seems everywhere you turn you find a conglomerate. "Make profit" and "be number one" every minute of the day. It makes me sick to see that the "Mom and Pop" operation is practically considered taboo and is being phased out.

2007-06-09 16:27:32 · answer #10 · answered by Anonymous · 0 0
