In the spring of 1774, two members of the Shawnee
tribe allegedly robbed and murdered a Virginia settler. As Thomas Jefferson
recounts in Notes on the State of
Virginia (1787), “The neighboring whites, according to their
custom, undertook to punish this outrage in a summary way.”
In their
quest for vengeance, the white settlers ambushed the first canoe they saw
coming up the river, killing the one unarmed man as well as all of the women
and children inside. This happened to be the family of Logan, a Mingo chief,
Jefferson says, “who had long been distinguished as a friend of the whites,”
but who now took sides in the war that ensued. The Mingos fought—and
lost—alongside the Shawnees and Delawares against the Virginia militia that
fall, and Logan’s letter to Lord Dunmore after the decisive battle is,
according to Jefferson, a speech superior to “the whole orations of Demosthenes
and Cicero.”
“There
runs not a drop of my blood in the veins of any living creature,” Logan says of
his decision to fight the white men. “This called on me for revenge. I have
sought it: I have killed many: I have fully glutted my vengeance. For my
country, I rejoice at the beams of peace. But do not harbor a thought that mine
is the joy of fear. Logan never felt fear. He will not turn on his heel to save
his life. Who is there to mourn for Logan? Not one.”
Logan’s
speech went viral by eighteenth-century standards; it was reprinted in
newspapers across the country and admired for its tragic eloquence. Its
popularity and resonance among white colonialists illustrate a defining aspect
of settler storytelling: an acknowledgement of the injustice of Indian killing
alongside an affirmation of its inevitability and salience as a guide to
action. In their authenticity, Logan’s words validated a structuring precept of
the white settler colony: that those who are violently displaced and eliminated
are distinct from kin, whose passing should be mourned, and also opaque to
posterity because they are sundered from webs of social relatedness.
Through
this sleight of hand, the settlers achieved a unique perspective—one that
justified violence because it afforded them a certain freedom, the productive
freedom of a blank slate. As historian Patrick Wolfe famously described it,
settler colonialism is thus a “structure, not an event.” Its mindset is not
backward but forward looking as it consciously blurs the lines between
preemption and self-defense, allegation and retribution, dispossession and
property right.
Consider
Jefferson’s words in the Declaration of Independence. A defining feature of
life in the “free and independent” states, he wrote, was constant warfare with
the denizens of a vast territorial frontier, “the merciless Indian Savages,
whose known rule of warfare, is an undistinguished destruction, of all ages,
sexes and conditions.” From its inauguration, then, American freedom was
founded on this unrelenting vision of a frontier populated by unjust enemies.
Jefferson’s founding brief for continuous expansionary warfare in the name of
collective freedom has animated the country’s sense of itself ever since. It
is, as political theorist Aziz Rana has noted, a foundational yet unexamined
precept within U.S. accounts of political liberty—one that continues to define
practices, institutions, and American ways of living that exact a violent toll.
Not
least, the Indian wars bequeathed a lasting military orientation—one that
extended and codified ethical, legal, and vernacular distinctions between
civilized and savage war as a core national experience and conceit. Settler
militias invested expansive police power in ordinary citizens as a corollary of
collective security, and justified practices of extirpative war focused on
populations and infrastructures, without distinction between combatants and civilians.
Through the more than two centuries of frontier and counter-insurgency wars
that the United States has fought (and continues to fight) the world over, the
elimination or sequestration of “savages” has been represented, Jodi Byrd
argues, as integral to the transit and development of American security, power,
and prosperity.
At the
end of the Civil War, Abraham Lincoln’s Secretary of State, William Seward,
claimed that “control of this continent is to be, in a very few years, the
controlling influence in the world.” Following the last Indian wars and the
closing of the territorial frontier at the end of the nineteenth century,
President Theodore Roosevelt romanticized “the winning of the west” as the arc
of progressive history. He ridiculed the anti-imperialists of his day who
criticized brutal U.S. counter-insurgencies in the Philippines and Cuba as
sentimental dreamers who would give Arizona back to the Apaches. The settlers’ outlook and its understanding of freedom pose the question “would you want to give it back?” only to demonstrate an absurd proposition. Within that outlook, settler decolonization is not only impossible but unthinkable.
By
connecting the concept of democratic self-rule with a continual project of
expansion, the settler narrative shaped collective institutions, ways of war,
visions of growth and prosperity, and conceptions of political membership that
still run deep. Indeed, our own period has not been immune. Describing the
supposedly unmatched achievements of liberal-democratic society at “the end of
history” in 1992, Francis Fukuyama reached back to the frontier allegory:
“mankind will come to seem like a long wagon train strung out along a road. . .
. Several wagons, attacked by Indians, will have been set aflame and abandoned
along the way. . . . But the great majority of wagons will be making the slow
journey into town, and most will eventually arrive there.” A decade later,
following 9/11, the mood had shifted, but not the narrative reflex. As George
W. Bush put it on October 6, 2001, “Our nation is still somewhat sad, but we’re
angry. There’s a certain level of bloodlust, but we won’t let it drive our
reaction. We’re steady, clear-eyed, and patient, but pretty soon we’ll have to
start displaying scalps.”
Defending
the launching of the global War on Terror, U.S. diplomatic historian John
Gaddis gave scholarly imprimatur to the settler idiom: the borders of global
civil society were menaced by non-state actors in a manner similar to the
“native Americans, pirates and other marauders” that once menaced the
boundaries of an expanding U.S. nation-state. Foreign affairs writer Robert
Kaplan concurred: “The War on Terrorism was really about taming the frontier,”
as he heard U.S. troops in Afghanistan and Iraq repeat the refrain, “Welcome to
Injun Country.”
The
reference, Kaplan insists, echoing Jefferson’s homage to Logan, “was never
meant as a slight against Native North Americans.” It was merely a
“fascination,” or an allusion to history—indeed, one that fits nicely with our
aptly named Tomahawk missiles and Apache helicopters. But these comments and
this history reflect a deeper, more sinister truth about the American
dependence upon expansionary warfare as a measure of collective security and
economic well-being.
The
history of the American frontier is one of mounting casualties and ambiguous
boundaries, of lives and fortunes gained and lost. In the settler narrative,
“collective security” never meant just the existential kind of safety, that is,
situations where material survival and self-defense were mainly at stake.
Freedom is essential to the equation, and freedom in this conception is built
once again upon dreams of a blank slate—this time cheap, empty, exploitable
lands and resources that must be cleared of any competing presence. Indeed, the
settlers’ conception of freedom belies the commercial interests in protecting
an investment prospectus: the speculative value of the land itself—what
surrounds it and what lies beneath it—is of paramount importance.
The main
colonial enterprise, after all, was risky and speculative land merchandising.
Early American governance was arguably more preoccupied with mundane
simplifications of deed and title, mapping, parceling, and recordkeeping than
it was with Indian fighting. From inception, the U.S. founders envisioned the
land west of the Alleghenies as a great commercial estuary, one that was
gradually emptied of any other human claimant. As George Washington, the land
speculator turned general, wrote upon resigning his command of the victorious Continental Army in 1783 (a command that had included organizing a campaign of ethnic cleansing against the Iroquois), “The Citizens of America, placed in the most
enviable condition, as the sole Lords and Proprietors of a vast Tract of
Continent, comprehending all the various soils and climates of the World, and
abounding with all the necessaries and conveniences of life, are now by the
late satisfactory pacification, acknowledged to be possessed of absolute
freedom and Independency.”
Geographer
Thomas Hutchins echoed Washington’s sense of America as a world brimming with
valuable resources and directing human enterprise toward uncertain boundaries.
In An Historical Narrative and Topographical Description of Louisiana and West
Florida (1785), he takes stock of the land’s bounty: grapes, oranges, lemons,
cotton, sassafras, saffron, rhubarb, hemp, flax, tobacco, and indigo. Although
enslaved Africans, the producers of most of these agricultural commodities, go
unmentioned, Hutchins pauses impassively every few pages to observe a curious
feature he also attributes to the landscape: this or that “once considerable”
nation of Indians “reduced to about twenty-five warriors,” or “only about a
dozen warriors.” Indigenous expiry is thus quietly inscribed as necessary to
the continent’s supposedly inexhaustible riches.
U.S.
military pacification was only one tool for the diminution of Indian
sovereignty and the subsequent sequestration and marginalization of tribal
remnants. Extensions of federal plenary power, the legal recasting of Indian
political life as a peculiar subordinated status of domestic dependency, and
redefinitions of indigenous resistance and counter-violence as crime were also
central. Woven throughout was the settlers’ forward-looking framework: there is
no alternative. In his 1835 letter to the Cherokee people, for example,
President Andrew Jackson framed Indian removal as an essential by-product of
commercial growth. “Circumstances that cannot be controlled and which are beyond
the reach of human laws render it impossible that you can flourish in the midst
of a civilized community.” The true nature of those circumstances was revealed
five years prior in an address Jackson made to Congress: “what good man would
prefer a country covered with forests and ranged by a few thousand savages to
our extensive Republic, studded with cities, towns, and prosperous farms,
embellished with all the improvements which art can devise or industry execute,
occupied by more than 12,000,000 happy people?”
As the Indian wars drew to a close in the late nineteenth century, the North American territorial frontier closed as well. Understood as an event of world significance, this closing forced settler thinking to confront new challenges.
Diplomat Paul Reinsch, who was a student of Frederick Jackson Turner and an
early theorist of U.S. global reach, observed that expansion through overseas
colonialism would be uniquely difficult: “we have to deal with a fixed element,
the native population, long settled in certain localities and exhibiting deeply
engrained characteristics; a population . . . that cannot be swept away before
the advancing tide of Caucasian immigration as were the North American
Indians.”
In the
ensuing decades, then, a host of morbid symptoms arose from similar perceptions
that while expansion was necessary, it would never again be so easy and
unproblematic. For thinkers such as Madison Grant and Lothrop Stoddard (both
prominent eugenicists) the limitation of territories for future white
settlement and a “rising tide of color” threatened the supremacy, even
survival, of Western civilization. This meant that the United States itself
needed to seal its borders against unwanted detritus from the outer world. The
Chinese Exclusion Act of 1882, along with the subsequent Immigration Act of
1917 (which established an “Asiatic barred zone”), conjured fears of a “yellow
peril” that threatened to reverse the ordering virtue of white settlement in
western lands. Ruling on Chinese exclusion in 1889, the U.S. Supreme Court was
explicit, describing “foreigners of a different race” as “potentially dangerous
to peace” even in the absence of “actual hostilities with the nation of which
the foreigners are subject.”
In the
lead up to World War I, the challenge of how to continue a dynamic of economic
expansion in the wider world without becoming corrupted politically by
proximity to savage and inferior, non-white subjects was a central
preoccupation of U.S. thinkers. John Carter Vincent, a confidant of the Roosevelt family and later a U.S. foreign service officer, suggested a vision of the western hemisphere as the model for a “painless imperialism,” where nominal sovereignty and separation from mestizo populations were underwritten by strategically placed Marine barracks. This would ensure the smooth passage of
commerce and security for propertied interests and what Woodrow Wilson called
the election of “good men.”
In May
of 1942, after the United States had entered World War II, the editors of
Fortune, Time, and Life magazines published a joint statement titled “An
American Proposal” that echoed Vincent and Wilson. They observed that the
United States was not “afraid to help build up industrial rivals,” which they
saw as a virtue: “American ‘imperialism,’ if it is to be called that” is “very
abstemious and high minded . . . because friendship, not food, is what we need
most from the rest of the world.” These
leading business ideologues laid their cards on the table: an age governed by
aviation and “the logic of the air,” Fortune’s editors observed, would need an
extensive network of strategic bases and technical facilities similar to “the
colonies and dominions” that supported imperial Britain during its age of
maritime power. “In the world-to-be,” they warned, “a dozen or more equivalents
of Pearl Harbor may be simultaneously possible. . . . Our problem, therefore,
is not to restore the status quo ante, but to break out.”
Settler
colonial narratives thus needed to be rewritten to suit extra-territorial and global
purposes. To be clear, rising U.S. globalism and imperialism were not simply an
extension of settler freedom, but nor should we lose sight of how they were
intertwined with it. As Fortune’s writers insisted: “The U.S. economy has never
proved that it can operate without the periodic injection of new and real
wealth. The whole frontier saga, indeed, centered around this economic
imperative.” As such, “The analogy between the domestic frontier in 1787 when
the Constitution was formed and the present international frontier is perhaps
not an idle one.” Franklin Delano Roosevelt himself viewed the 1940 “destroyers
for bases” agreement with Great Britain—which saw the exchange of U.S. naval
ships for land rights on British possessions—as the most important action in
“the reinforcement of our national defense . . . since the Louisiana Purchase.”
A decade
later, as historian Megan Black has recently shown, engineers from the U.S.
Department of the Interior—with longstanding expertise charting Indian
reservation lands for hidden energy and mineral resources—were dispatched the
world over to survey sources of strategic minerals required to defend “the free
world.” In short order, U.S. military forces were calling Vietnam “Indian
Country,” forcibly sequestering its peasants on reservations, while fighting to
ensure its reserves of tungsten and tin didn’t fall to the red tide of
international communism.
U.S.
imperialism abroad, however, did not erase the influence of settler ethics and
practices closer to home. As Time Magazine magnate Henry Luce suggested, even
as non-interventionist sentiment ran high in the run up to World War II,
“Americans had to learn how to hate Germans, but hating Japs comes natural—as
natural as fighting Indians once was.” In turn, few events evoked the Indian
removal of the 1830s more than the 1940s herding of 100,000 Japanese and
Japanese-Americans into camps in the Western interior while many of their white
neighbors avidly claimed their farmlands and possessions.
An
expansive and celebratory vision of white settlement also retained its
purchase: by the 1950s, Andrew Jackson’s studded republic was remade through
the promise of homeownership on “the crabgrass frontier.” Working in
conjunction with real estate and banking industries, federal housing
authorities drew up “residential security maps” that identified with stark
red lines where the valued property, credit, and people needed to go—and where untrustworthy denizens should remain fixed.
By the
late 1960s, as sharply racialized contests over public space and civic
belonging gave way to the “wars” on crime and drugs, sociologist Sidney
Willhelm foresaw that urban blacks in particular, who were no longer required
for industrial labor, were “going the way of the American Indian” into carceral
warehouses. It is hardly incidental that Michigan’s Oakland County Executive
Brooks Patterson thought it apt, quite recently, to characterize inner city
Detroit as a “reservation, where we herd all the Indians into the city, build a
fence around it, and then throw in the blankets and the corn.”
This
push and pull of U.S. settler ethics, narratives, and corollary institutions of
violence in the name of freedom has yielded a distinctive and multi-layered
carceral history and geography, at once domestic and transnational: a global
archipelago of prisons, internment camps, and detention centers. In recent years, at Standing Rock, its raw circuitry of indigenous sequestration and citizen
protection was once again laid bare as state police and U.S. military forces
had tense stand-offs with thousands of Sioux and supporters who were blocking
construction of the Dakota Access oil pipeline through Indian reservation
lands.
Here, we
might observe how settler ethics and practices continue to create liberated
citizens and subordinated subjects together; the former are defined by
democratic, formally egalitarian claims to nationhood, legal status, consumer
choice and protection, while the latter are defined as atavistic, backward, passively
disappearing, slated for elimination, subject to sequestration, or bound by
what is thought to be permanent inferior status. “Savagery,” in short, has been
a fungible and centrifugal construct, with fears of the native fueling racism
as well as nativism, while a recursive, blank-slate conception of settler
primacy and preeminence animates movements, programs, and policies for
eliminating or warding off alien or foreign presence.
The
inceptive structuring of indigenous elimination as a condition of the settlers’
freedom has yielded an enduring tendency among American officials, and among
the publics they conscript, to think of democratic self-rule as interdependent
with expansive and coercive rule over alien subjects. After 9/11, this
historical subtext returned to the foreground as Americans were told not only
that fighting terrorists overseas meant not having to fight them at home, but
also that continuing to shop and spend at home was no less the duty of a
civilized and prosperous people. The term “enemy combatant” itself was a
neologism invented for “unlawful” fighters, those deserving no legal standing
or status—those who could be detained (and tortured) with impunity—those
subject to an unlimited deprivation of freedom whose avowed legal precedent once again referred back to the Indian wars.
For the inhabitants of a finite and ecologically stressed planet, the stakes of undoing settler ethics—its ways of war, its presumptions about a need for limitless growth, its hostile vision of blank-slate autonomy without dependency, and its delimitations of social and political membership—have never been higher. More than simple racism or discrimination, the destructive premise at the core of the settler narrative is that freedom itself must be built upon eliminationism, and that growth therefore requires expiry.
And it is this temptation—to remain on the right side of might that makes right—that
stalks the future of a planet in the grips of climate destruction, secular
stagnation, and unevenly distributed misery. Earthly co-existence, material subsistence,
and ecological sustainability demand nothing less than a new dispensation of
human freedom. Otherwise, there truly will be none left to mourn.
On September 21, 1945—five months after Franklin
Roosevelt’s death—President Harry Truman assembled his cabinet for a meeting
that one historian has called “a turning point in the American century.” The
purpose of the meeting was to discuss Secretary of War Henry Stimson’s proposal
to share atomic bomb information with the Soviets. Stimson, who had overseen the Manhattan Project, maintained that the only way to make the Soviets
trustworthy was to trust them. In his proposal to Truman, he wrote that not
sharing the bomb with the Soviets would “almost certainly stimulate feverish
activity on the part of the Soviets . . . in what will in effect be a secret
armament race of a rather desperate character.”
Henry Wallace, the secretary of commerce and former
vice president, agreed with Stimson, as did Undersecretary of State Dean
Acheson (though he later changed his position), but Secretary of the Navy James
Forrestal laid down the definitive opposition. “The Russians, like the
Japanese,” he argued, “are essentially Oriental in their thinking, and until we
have a longer record of experience with them . . . it seems doubtful that we
should endeavor to buy their understanding and sympathy. We tried that once
with Hitler. There are no returns on appeasement.” Forrestal, a skilled
bureaucratic infighter, had made his fortune on Wall Street and frequently
framed his arguments in economic terms. The bomb and the knowledge that
produced it, Forrestal argued, was “the property of the American
people”—control over it, like the U.S. seizure of Japan’s former Pacific Island
bases, needed to be governed by the concept of “sole Trusteeship.”
Truman sided with Forrestal. Stimson retired that very
same day, his swan song ignored, and Wallace, soon to be forced out of the
Truman administration for his left-wing views, described the meeting as “one of
the most dramatic of all cabinet meetings in my fourteen years of Washington
experience.” Forrestal, meanwhile, went on to be the country’s first secretary
of defense in 1947 and is the man who illustrates perhaps more than anyone else
how Cold War militarism achieved its own coherence and legitimacy by adopting
economic logic and criteria—that is, by envisioning military power as an
independent domain of capital expenditure in the service of a political economy
of freedom. From his pivotal work in logistics and procurement during World War
II, to his assiduously cultivated relationships with anti–New Deal congressmen
and regional business leaders sympathetic to the military, Forrestal both
helped to fashion and occupied the nexus of an emerging corporate-military
order. He served as defense secretary for only eighteen months (he committed
suicide under suspicious circumstances in 1949), but on the day of that fateful
cabinet meeting, he won the decisive battle, advocating for what he once called
a state of ongoing “semi-war.” The post–World War II rise of a U.S.
military-industrial complex is well understood, but it still remains hidden in
plain sight. Today warnings about Donald Trump’s assault on the “liberal
international order” are commonplace, while less examined is how we arrived at a
point where democratic and “peacetime” governance entails a global military
infrastructure of 800 U.S. military bases in more than 70 countries.
Moreover, this infrastructure is under the command of
one person, supported by a labor force numbering in the millions, and oriented
to a more-or-less permanent state of war. If a politics of threat inflation and
fear is one part of the answer, the other, more prosaic component is that the
system itself is modeled after the scope of business and finance. By managing a
diverse portfolio of assets and liabilities and identifying investment
opportunities, it envisions a preeminently destructive enterprise as a series
of returns calibrated to discretionary assessment of threats and a
preponderance of force. This was Forrestal’s bailiwick.
A little-known anecdote about Truman’s 1947 call to
Congress for decisive intervention in the Greek civil war—generally viewed as
the official declaration of the Cold War—illustrates this point. Truman’s
speech is famous for its emphasis on political freedom, particularly the idea
of protecting peoples’ rights to self-determination against “armed
minorities”—“the terrorist activities of several thousand armed men, led by
communists.” “One of the primary objectives of the foreign policy of the United
States,” Truman said, establishing the characteristic linkage between World War
II and the Cold War, “is the creation of conditions in which we and other
nations will be able to work out a way of life free from coercion. Our victory
was won over countries which sought to impose their will, and their way of
life, upon other nations.” When presented with an early State Department draft of the speech, however, Truman reportedly objected that it read too much like an investment prospectus and asked for a stronger statement of principles.
Truman’s delivered address, by contrast, made use of
the words “free” and “freedom” twenty-four times in a few minutes, as if
talismanic repetition were enough to hitch the defense of private capital
accumulation to the maintenance of popular democracy the world over. Yet,
despite the inflated rhetoric, economic considerations remained the skeletal
core of the Truman Doctrine. Buried inside the address was the acknowledged
collapse of British imperial policy in the region, along with an “invitation”
from a dubiously democratic, right-wing Greek government for “financial and
other assistance” in support of “better public administration.” The imperatives
of democracy and self-government—preeminent political values understood by the
U.S. public—were subordinated to building “an economy in which a healthy
democracy can flourish.” In a final nod to the bean counters, Truman noted that
the amount he was requesting was a mere fraction of what the United States
spent during World War II, and no less justified as “an investment in world
freedom and world peace.”
The following year, for example, George Kennan, author
of the “containment” doctrine, a protégé of Forrestal, and the single most
influential strategic foreign policy thinker of the moment, offered a
strikingly candid version of the task at hand, in a classified memo that
consciously punctured the universalist ambit of the Truman Doctrine:
“We have about 50% of the world’s wealth but only 6.3% of its
population. This disparity is particularly great as between ourselves and the
peoples of Asia. In this situation, we cannot fail to be the object of envy and
resentment. Our real task in the coming period is to devise a pattern of
relationships which will permit us to maintain this position of disparity
without positive detriment to our security. To do so, we will have to dispense with
all sentimentality and day-dreaming; and our attention will have to be
concentrated everywhere on our immediate national objectives. We need not
deceive ourselves that we can afford today the luxury of altruism and
world-benefaction.” (emphasis added)
When thinking about nations and peoples, particularly
those outside of Europe, Kennan again foregrounded a logic of investment and
risk management, and he advised restraint and limitation of liability, especially with respect to “the peoples of Asia . . . [who] are going to go ahead,
whatever we do, with the development of their political forms and mutual
interrelationships in their own way.” Kennan warned that the coming period
would be neither “liberal” nor “peaceful,” and that such countries were likely
to “fall, for varying periods, under the influence of Moscow, whose ideology
has a greater lure for such peoples, and probably greater reality, than
anything we could oppose to it . . . [or that] our people would ever willingly
concede to such a purpose.” In this light, he concluded that the United States
needed to dispense with commitments, rhetorical and otherwise, to “unreal
objectives such as human rights, the raising of living standards, and
democratization. The day is not far off when we are going to have to deal in
straight power concepts.”
This view is sometimes depicted as an exemplary
instance of realism—wiser and more in tune with the messy, uneven world that
emerged from World War II—and a point of view that, had it been heeded, might
have prevented the costly overreach of global cold war, especially “blunders”
such as the Vietnam War (which Kennan, long retired to academia, opposed). The
concept of realism, however, fails to grasp the functional logic of risk and
threat assessment—the insistent and anxious hedging and speculation that made
the careers and fortunes of Kennan, Forrestal, and many that followed them.
Forrestal fretted obsessively in his diary along these lines: “I am more
impressed than ever as things develop in the world today that policy may be
frequently shaped by events unless someone has a strong and clear mental grasp
of events; strong enough and clear enough so that he is able to shape policy
rather than letting it be developed by accidents.” This recurrent epistemic
anxiety initiated an insistent demand for anticipatory policy, abiding
mistrust, and the maintenance of a preponderance of force. As Forrestal bluntly
put it, “Power is needed until we are sure of the reign of law.”
Forrestal framed his own deference to hierarchy in
terms of the prerogatives of corporate capitalism—the idea that practical men
of business, rather than reformers and intellectuals, had won World War II and
needed to be running the world going forward. Among his more forceful
conclusions was that liberal globalism would be disastrous if it were not
steeled with counterrevolutionary animus. As he confided to diplomat Stanton
Griffiths:
“Between
Hitler, your friends to the east, and the intellectual muddlers who have had
the throttle for the last ten years, the practical people are going to have a
hell of a time getting the world out of receivership, and when the miracles are
not produced the crackpots may demand another chance in which to really finish
the job. At that time, it will be of greatest importance that the Democratic
Party speaks for the liberals, but not for the revolutionaries.”
For these realists, even more than the wooly moralists
they sometimes ridiculed, it was the credibility of U.S. threats of force that
ensured the freedom and mobility of productive capital and supported its
resource needs and allied interests across an ever-widening sphere. Of a more
aristocratic and consciously anti-democratic mien, Kennan likewise recognized
that the animating logic was not strictly anti-communist but
counterrevolutionary—indeed even racial. The inevitable dissolution of the
colonial system meant that the challenge of U.S. policy in the coming period
was broader than the struggle with Soviet communism, as “all persons with
grievances, whether economic or racial will be urged to seek redress not in
mediation and compromise, but in defiant, violent struggle.” Inspired by
communist appeals, “poor will be set against rich, black against white, young
against old, newcomers against established residents.”
By conflating Soviet designs with those of heterogeneous
movements demanding effective sovereignty and challenging material deprivation,
Forrestal and his colleagues contributed to a perverse recasting of the dynamic
of European colonial disintegration as the field of Soviet imperial expansion.
This rhetorical and ideological frame practically demanded the militarization
of U.S. foreign policy, with U.S. “counterforce” the only alternative to a
world ruled by force. As such, along with Arthur Radford, Forrestal was
instrumental in developing the Central Intelligence Agency (CIA), and that
agency’s work soon echoed his. In 1948, for instance, a CIA document entitled
“The Break-Up of Colonial Empires and its Implications for US Security” defined
expressions of “economic nationalism” and “racial antagonism” as primary
sources of “friction between the colonial powers and the US on the one hand,
and the states of the Near and Far East on the other.”
The CIA’s analysts suggested that poverty and a legacy
of anti-colonial grievances rendered colonized and formerly colonized peoples
“peculiarly susceptible to Soviet penetration” and warned that the “gravest
danger” facing the United States was that decolonizing nations might fall into
alignment with the USSR. At the same time, they faulted Europe’s colonial
powers for their failure to satisfy “the aspirations of their dependent areas”
and advised them to “devise formulae that will retain their good will as
emergent or independent states.” Envisioning U.S. responsibility to author such
formulae in the future, the classified brief concluded that the United States
should adopt “a more positive and sympathetic attitude toward the national
aspirations of these areas,” including policy that “at least partially meets
their demands for economic assistance.” Otherwise “it will risk their becoming
actively antagonistic toward the US,” including loss of access to previously
“assured sources of raw materials, markets, and military bases.”
While the emerging U.S. foreign policy clearly
accepted the unresolvable antagonism toward the Soviet Union, the challenge of
the future, as the CIA argued, was how the United States should address the
“increasing fragmentation of the non-Soviet world,” or, in a word,
decolonization. The means for assessing risk and reward in this expansive and
heterogeneous terrain of imperial disintegration were by no means clear. But it
is revealing that the possibility of potential alignments between decolonizing
nations and Soviet power was far less concrete and worrisome to the United
States than the more definite and delineated material losses faced by the
United States and the colonial powers with which it had aligned itself—namely,
being deprived access to formerly “assured sources of raw materials, markets
and military bases.” In other words, the challenge of the future, as Kennan had
underlined, was to devise “formulae” to buttress the forms of political
authority that sustained economic inequality (at a world scale) in the face of
inevitable revolt and revolution against such authority and the social
conditions it supported.
Despite his later misgivings, Kennan had authored the
concept whose rhetorical elasticity and ideological indeterminacy proved
crucial to fashioning a nemesis that suited this consciously expansionist
vision of U.S. economic and military power. With the creation of the CIA, the
National Security Council, and Forrestal’s own new position of secretary of
defense, these years saw the growth of a national security bureaucracy that was
divorced from meaningful oversight and public accountability for its actions,
including myriad moral failures and calamities. A covert anti-Soviet
destabilization campaign in Eastern Europe, for example, greenlit by Forrestal
and Kennan, enlisted Ukrainian partisans who had worked with the Nazis. This
type of activity would become routine in Latin America, Asia, and Africa, where
Kennan derided respect for the “delicate fiction of sovereignty” that
undeserving, “unprepared peoples” had been allowed to extend over the resources
of the earth.
Over the next quarter century, fewer than 400
individuals operated the national security bureaucracy, with some of them
enjoying decades of influence. That the top tier was dominated by white men who
were Ivy League–educated lawyers, bankers, and corporate executives (often with
ties to armament-related industries) lends irony to official fearmongering
about armed conspiracies mounted by small groups, let alone the idea that the
role of the United States was to defend free choice against coercion imposed by
nonrepresentative minorities. This fact, perhaps more than any other, suggests
that, as much as the Cold War represented a competition between incompatible,
if by no means coeval or equally powerful systems of rule (i.e., communist and
capitalist), it was marked by convergences too. The Soviet “empire of justice”
and the U.S. “empire of liberty” engaged in mimetic, cross-national
interventions, clandestine, counter-subversive maneuvers, and forms of
clientelism that were all dictated by elite, ideologically cohesive national
security bureaucracies immune from popular scrutiny and democratic oversight.
Those charged with governing the controlling seat of
U.S. globalism consistently doubted the compatibility of normative democratic requirements with the security challenges they envisioned, and they harbored a distrust that often bordered on contempt for the publics in whose name they claimed to
act. “We are today in the midst of a cold war, our enemies are to be found
abroad and at home,” remarked Bernard Baruch, coining the term that names this
era. In this context, “the survival of the state is not a matter of law,”
Acheson famously declared, an argument similar to one being advanced by former
Nazi jurist Carl Schmitt. Senator Arthur Vandenberg, echoing defenders of Roosevelt’s accretive accumulation of war powers, was positively wistful in lamenting “the
heavy handicap” that the United States faced “when imperiled by an autocracy
like Russia where decisions require nothing but a narrow Executive mandate.”
For Forrestal, “the most dangerous spot is our own country because the people
are so eager for peace and have such a distaste for war that they will grasp
for any sign of a solution of a problem that has had them deeply worried.”
Forrestal felt that the danger at home manifested
itself most frustratingly in the threat that congressional budgeting posed to
military requirements. The preservation of a state of peace was a costly
proposition when it revolved around open-ended threat prevention the world
over. Upholding the permanent preponderance of U.S. military power at a global
scale required a new type of fiscal imagination, one that had to be funded by
the future promise of tax receipts. During his final year in office,
Forrestal’s diary records in mind-numbing detail his worries about acquiring
Pentagon funding adequate to his projections for global military reach. In Forrestal’s
view, budgetary considerations were captive to the wrong baseline of “peak of
war danger” and combatting “aggression” rather than to “maintenance of a
permanent state of adequate military preparation.”
A fascinating aspect of these budget wrangles is Forrestal’s manic effort to translate future-oriented geostrategic needs into precise
dollar values. Just months before his forced retirement and eventual suicide,
he confided to Walter G. Andrews:
“Our biggest headache at the
moment, of course, is the budget. The President has set the ceiling at 14
billion 4 against the pared down requirements that we put in of 16 billion 9. I
am frank to say, however, I have the greatest sympathy with him because he is
determined not to spend more than we take in in taxes. He is a hard-money man
if ever I saw one.”
Despite his grudging admiration for the stolid Truman,
Forrestal’s Wall Street background had left him at ease in a more speculative
or liquid universe; at that precise moment, he was devising accounting gimmicks
to offset near billion-dollar costs of stockpiling raw materials as a “capital
item” that could be “removed from the budget.” The important point to emphasize
is the relationship between two interrelated forms of speculation and
accounting—economic and military—in which an absolute inflation of threats
tempted a final break with lingering hard-money orthodoxies and a turn to
deficit spending. Forrestal did not live to see the breakthrough, but his work
paid off.
As Acheson described it, the Korean War—the first hot
war of the Cold War era—“saved” the fledgling national security state. With its
outbreak, the dream of eternal military liquidity was realized when Leon
Keyserling, the liberal economist serving as Truman’s chairman of the Council
of Economic Advisors, argued that military expenditures functioned as an
economic growth engine. That theory then underpinned NSC 68, the document that
justified massive U.S. defense outlays for the foreseeable future and which was
authored by another Forrestal protégé, Paul Nitze. By yoking dramatically
increased federal spending to security prerogatives, military Keynesianism thus
achieved a permanent augmentation of U.S. state capacity no longer achievable
under appeals to Keynesianism alone.
The embedding of the global priorities of a national
security state, which sometimes appears inevitable in retrospect, was by no
means assured in the years leading up to the Korean War. It was challenged by
uncooperative allies, a war-weary or recalcitrant U.S. public, and politicians
who were willing to cede U.S. military primacy and security prerogatives in the
name of international cooperation. But by 1947, men such as Forrestal had laid
the groundwork for rejecting the Rooseveltian internationalist inheritance,
arguing it was necessary to “accept the fact that the concept of one world upon
which the United Nations was based is no longer valid and that we are in political
fact facing a division into two worlds.” Although the militarization of U.S.
policy is often understood to have been reactive and conditioned by threats
from the outside, his ruminations illustrate how militarized globalism was
actively conceived as anticipatory policy (in advance of direct confrontations
with the Soviet Union) by just a few architects and defense intellectuals—men
under whose sway we continue to live and die.
Ultimately, the declaration of the Cold War says more
about how these U.S. elites represented and imagined their “freedom” and
envisioned the wider world as a domain for their own discretionary action and
accumulation than it did about enabling other people to be free, let alone
shaping the terms of a durable and peaceful international order. As early as
1946, Forrestal began taking important businessmen on tours of the wreckage of
Pacific Island battles, which also happened to be future sites for U.S. nuclear
testing. Forrestal described these ventures as “an effort to provide long-term
insurance against the disarmament wave, the shadows of which I can already see
peeping over the horizon.” The future of the bomb and the empire of bases were
already on his mind.
Forrestal recognized that force and threat are always
fungible things to be leveraged in the service of the reality that truly
interested him, the reality made by men who own the future. For those of his
cast of mind, “international order” was never more than the fig leaf of wealth
and power. As he noted in a 1948 letter to Hanson Baldwin of the New York
Times: “It has long been one of my strongly held beliefs that the word
‘security’ ought to be stricken from the language, and the word ‘risk’ substituted.
I came to that conclusion out of my own business experience.” It was the job,
after all, of these East Coast lawyers and moneymen to make sure all bets were
hedged, and Forrestal knew that speculation could turn into “an investment gone
bad.” As a leading investor in the Cold War project, he wanted a guaranteed
return, even if the rule of law never arrived and even when the price was ruin.
Banking on the Cold War. By Nikhil Pal Singh. Boston Review, March 14, 2019.
Nikhil
Pal Singh is Professor of Social and Cultural Analysis and History at New York
University and Faculty Director of the NYU Prison Education Program.