Thanks to their schooling and to the broader culture that surrounds them, American children become familiar with a rather contradictory characterization of their country. They're encouraged to see it as both a benevolent, freedom-loving nation and as a "superpower." What in the world, some kids might wonder, does America do with all of that "power"?
Some of these children grow up to realize that America does a lot with that power, much of it in its own, abusive self-interest. However, despite the awareness that some Americans have of their country's global abuses, very few think of the United States as an "empire." In my experience, using that word to describe their country strikes most Americans as bizarre.
American children usually gather that the word "empire" has old-fashioned connotations, bringing to mind such former empires as those of Rome, Spain, and England. Empires, we're told, turn other countries into colonies, and their people into slaves, all for the selfish, resource-grabbing benefit of those in the central, ruling country, the "seat of empire."
On the other hand, teachers and textbooks say, America began its very existence in resistance to colonial subjugation. And when America later exerted its military might overseas, it did so primarily to defeat the empire-building efforts of other nations--such as Germany, Japan, and the Soviet Union--and not to further its own imperial ambitions.
Few Americans now doubt the claims that their leaders have always made on this subject, that when America intervenes overseas, it does so reluctantly, and because it's a democratic, "freedom-loving" nation. Most hear nothing wrong with comments like this one, by former Secretary of Defense Donald Rumsfeld: "We don't seek empires. We're not imperialistic. We never have been."
American children do usually learn that America has a long and steady history of aggressive involvement, military and otherwise, in the affairs of other nations, especially in two World Wars. However, the general impression they gather is that because America is a benign force in the world, these interventions are always driven by the purely altruistic motive of spreading freedom and democracy.
More recently, America is thought to have been drawn into its first war with Iraq, led by the first George Bush, because Kuwait needed our help after Saddam Hussein invaded it. George the Second was thought to have justifiably invaded Afghanistan because it was sheltering the mastermind of the 9/11 attack, Osama bin Laden, and the second war with Iraq supposedly started because Hussein was somehow threatening his neighbors and America with hidden weapons of mass destruction and plans for nuclear weapons.
Many Americans now realize that the recent claims about Iraq were false, and that their country acts in its own interests in that "region," especially energy interests. But since Saddam was so easy to portray as a dictator who "gassed his own people," the current conflict in Iraq is still seen by many as a defensible demonstration of America's freedom-loving, democracy-spreading intentions. And as with the ongoing American efforts in Afghanistan, that conflict is certainly not widely understood as the racist, resource-grabbing attempt of an empire to permanently occupy and control another country.
That American efforts in both countries aren't going well is rarely attributed to its most obvious cause: general, justifiable resistance to U.S. occupation and control. Never mind, we're led to think, that American forces toppled the governments of those two (and many other) countries, and are currently building permanent bases in both, and are thus "occupying" them rather than fighting a war with them. Those two conflicts remain, in the minds of most Americans, "wars," and in the minds of many, noble wars, because again, we're spreading freedom and democracy.
Aside from this common delusion regarding American beneficence, two other elements in the general American identity help to explain both the spread of American empire, and its general denial--white supremacy and Christianity. From at least as far back as the Spanish conquests of indigenous people, to the current so-called War on (Arab/Muslim) Terror, leaders have justified their conquests of other people by convincing their populations (and often themselves) that darker people with desirable land, resources, and labor were inherently inferior.
Today's American children are routinely taught that their country's vague, almost organic movement westward in the mid-1800s was called "Manifest Destiny," but they're rarely taught that achieving dominion from coast to coast was openly described at the time as the manifest destiny of the superior white, "Anglo-Saxon" race. That racist national identity made it easier to steal enormous amounts of land from Native Americans, and from Mexico. Despite the country's ever-increasing racial diversity, the word "American" has continued to register as "white"; as a result, the deaths of, by some estimates, a million or more Iraqis and Afghans have been easy for most Americans to stomach, because the non-white dead can be dismissed as somehow less than fully "civilized" and human.
Regarding the Christian influence on American empire, and on its general denial, Robert Jensen provides a useful phrase (in his book Citizens of the Empire), "the pathology of the anointed." As Jensen explains,
The story we tell ourselves goes something like this: Other nations throughout history have acted out of greed and self-interest, seeking territory, wealth, and power. They often did bad things in the world. Then came the United States, touched by God, a shining city on the hill, whose leaders created the first real democracy and went on to become the beacon of freedom for the rest of the world. Unlike the rest of the world, we act out of a cause nobler than greed; we are both the model of, and the vehicle for, peace, freedom, and democracy in the world.
Many people outside of America, and a few of those within it, recognize the common American conception of their country's place and actions in the world as not only childishly simple, but also flat-out wrong. As Jensen also writes, "This is a story that can be believed only in the United States by people sufficiently insulated from the reality of U.S. actions abroad to maintain such illusions."
I think an empire can be defined quite simply: a country that controls other countries for its own gain. A country needn't openly declare itself an empire in order to be one. Aside from using various covert and overt methods to install foreign leaders who are friendly to American "interests," the U.S. exerts various levels of self-interested military control over other countries. America currently maintains over 730 foreign military bases and some 2,500,000 overseas personnel in more than 130 countries, and the U.S. has bombed 22 other countries since the Second World War alone. It also manipulates the politicians and economies of many countries, in many ways. Knowing such things about the U.S. makes it tough to think of it as merely one country among many, instead of as an empire.
Nevertheless, Americans continue to delude their children by teaching them that their country is exceptional, and only in good ways. They're led to believe that "Number One!" means the best, instead of the most dominant and abusive. By teaching our children this way, we not only render them delusional; we also rob them of a realistic understanding of their place in the world, and especially of their direct, abusive position in relation to other, less fortunate people.
By the way, if you know a child (or an adult) who could use a more realistic view of America's place in the world, I have a book to recommend. The following video offers excerpts from a great graphic/comic book, Howard Zinn's A People's History of American Empire. The video also offers (as does the book) a concise, accessible summary of America's long imperial history.