By Mike Marion
It is difficult to pinpoint exactly when it happened, but clear evidence can be seen at least in the aftermath of World War II. Some trace the origins back to 1898 and the Spanish-American War, or even earlier to the War of 1812. Still others would say that imperial ambitions were on the minds of some of the Founding Fathers. Regardless, there can be no doubt that today the United States of America is an empire.
It is probably safe to assume that most Americans do not think of their country as an empire. As a conservative in my younger years, I might have even labeled the suggestion as anti-American, rationalizing to myself: Sure, we may have strategic military bases around the world and we may use force at times, but it is only for benevolent purposes. We get the bad guys, give the country…