Tuesday, May 17, 2011

The American Empire

I've been too busy lately to post frequently.  However, I thought I would share an excerpt from an email I wrote to someone in response to the claim that the U.S. was traditionally "isolationist" (or, in the words of Pat Buchanan, "a republic, not an empire"):

I would have to say, the reason that America was more "isolationist" in the beginning was that it was essentially a dependent satellite of the British empire, benefiting from British imperialism and trans-Atlantic trade, with no real need for imperial actions of its own.  However, things started to change with the Great Depression of the late 19th century, which began to erode British economic hegemony and simultaneously created a need for cheaper raw materials (procured through colonization of the rest of the world), thus opening the way for an emerging U.S. imperialism inaugurated by the Spanish-American War.  U.S. imperialism really took off, though, after WW2, when European hegemony was dealt a fatal blow and the colonized world gradually became independent... or maybe a better word than "independent" is "available for re-colonization and exploitation by the U.S. and Soviet empires."  The U.S., of course, has been more subtle in the form of colonization it has employed... but even if control is exerted through transnational organizations and covert support of coups and destabilization of governments, it is control nonetheless!


The U.S. may not have become an empire until a century into its existence, but it became one precisely when there was room for it to do so. (Besides, does taking land from the Native Americans and Mexico not count as imperialism?) The fact of the matter is, for better or worse, U.S. growth has always been fueled by imperialism, whether of its own undertaking or that of its economic allies.
