IMPERIALISM: UNITED STATES IN JAPAN

There was a period in world history when many countries practiced imperialism. These powers dominated other nations by taking control of their economies, governments, and religions. One specific act of imperialism was carried out against Japan by the United States.