Just general information about it.
Why? When? How?
Thanks!

2007-12-18 18:54:04 · 5 answers · asked by Anonymous in Arts & Humanities History

OK, how about Western colonization of Japan in the 19th century?

2007-12-18 19:10:36 · update #1

5 answers

???

I think the Japanese would be very surprised to learn they had been "colonized."

Perhaps you meant the Allied occupation, which began with Japan's surrender in 1945. During this time Japan was under military control. Emperor Hirohito was allowed to remain in his palace, but with very little power.

The occupation lasted until about 1952 and during that time the Japanese were encouraged to draw up a constitution, which was adopted and formed the basis for the current government. Under this constitution Japan renounced war and its military services were authorized for defensive purposes only.

The Ryukyu Islands--including Okinawa--remained under U.S. administration for another two decades. During that time they were governed much like a colony, but they reverted to Japan in 1972.

A few U.S. military bases remain in Japan, but Japan allows them largely because the U.S. has agreed to defend Japan in case it is attacked.

Japan remains politically and economically an independent nation.

2007-12-18 19:14:56 · answer #1 · answered by Warren D 7 · 2 0

No country has ever colonized Japan. It only looks as though Japan was overtaken by the imperialism of other countries because Commodore Matthew Perry was able to force Japan to open its ports to the Americans. Japan was westernized but never colonized. Even after Japan's defeat in the world war it was not colonized; the Western powers simply administered the country while the treaties that would formally end the war were worked out.

2007-12-19 04:36:46 · answer #2 · answered by pao d historian 6 · 1 0

Japan was a closed nation that allowed almost no outsiders in. Then, in 1854, Commodore Perry steamed into Japan and signed a treaty that opened the country to international trade. The irony is that the Japanese were so impressed by the steam-powered ships that they began building a navy of their own. That was the step that led to modern Japan. There was never a colony.

2007-12-19 03:26:26 · answer #3 · answered by Songbyrd JPA ✡ 7 · 1 0

The US never colonized Japan; it never happened.

The United States is unique in the world in that, after defeating an enemy in war, it has never taken the defeated nation's land as its own.

The United States has instead helped the defeated rebuild and renew.

You asked about Japan.

Look at the after World War II years in Japan and the economic rebuilding and expansion that took place.

We gave Japan an open opportunity to trade equally with us and we both have benefited from the trade.

My opinion is that the United States gave Japan the opportunity to grow beyond its citizens' expectations. The political climate changed from an authoritarian (fascist) empire to a democracy, a change that enabled capitalism to grow and flourish.

2007-12-19 03:19:47 · answer #4 · answered by Wrenched 7 · 1 1

Japan was a colony of the US??? I haven't heard of that. The US may have some influence on Japan's economy, military, and other matters, but I have never heard that it colonized Japan.

2007-12-19 03:03:46 · answer #5 · answered by Anonymous · 1 1
