Imperialism and the United States

Imperialism is a recurring theme in world history. Stronger countries see themselves as superior to other societies and believe their ways are right, forcing their religion, government, and practices on countless foreign lands. At the very end of the nineteenth century, the outcome of the Spanish-American War divided Americans into those for and against the annexation of the Philippines. Supporters saw the islands as a strong foothold for the country in Chinese markets, while the opposing minority believed that a nation founded to escape oppression should not inflict the same on others. America's emergence as an imperial power shifted the world stage and opened opportunities for trade on the other side of the world. Annexing the Philippines changed how other countries saw the United States, but more importantly it changed how Americans felt about their own; citizens rallied around the flag after defeating Spain because surpassing the empire's navy gave them something positive to dwell on after a century of turmoil, turmoil that continued at the time over working and living conditions and inequality between races and genders. It gave the public a reason to be patriotic. If the United States had not annexed the Philippines when it did, it would not have been able to become the strongest power in the world and expand its exports throughout Asia, but at what cost? American leaders decided it would be a good power move, but did
