Was the United States of America a Colonial Empire in the late 19th and early 20th Centuries? - Alex Egan

The United States of America is consistently depicted as a republic and a staunchly anti-imperialist country. Ever since its birth it has fought empires, from the British Empire in 1776 to the Nazi Empire and the ‘Evil Empire’ of the USSR. The Monroe Doctrine of 1823 clearly documents the USA’s opposition to European imperialism in the Americas. This anticolonial thread in the USA’s history may, however, place a façade over the reality of its own imperialism. An empire is an aggregate of many separate states or territories under supreme rule; it contrasts with a federation, an extensive country voluntarily composed of states. This article will not argue that the USA is currently a hegemonic empire (one that dominates other countries through influence rather than direct rule), but it will argue that during the late 1800s and early 1900s the USA was a territorial, colonial empire akin to the European empires of the time.