Japan in the 1950s
Alan Taylor
March 12, 2014
39 Photos
In Focus

After Japan surrendered in 1945, ending World War II, Allied forces led by the United States occupied the nation, bringing drastic changes. Japan was disarmed, its empire dissolved, its form of government changed to a democracy, and its economy and education system reorganized and rebuilt. Years of reconstruction were required to rec