Monthly Archives: October 2013

Tricks, Treats, and Shopping: America’s Halloween History

Americans have turned Halloween into a consumerist Goliath, because that’s what they do.

Halloween. It’s a holiday anticipated and embraced with equal fervor by kids craving an unmitigated sugar rush, by adults looking for an excuse to dress up like creepily eroticized pop-culture characters, and by dentists eyeing the sugar-induced spike in business coming their way.

Halloween is a big deal in America today. For a hyper-materialistic society that long ago replaced agricultural rhythms with consumer totems as markers of the seasonal cycles, the first appearance of Halloween paraphernalia in shopping centers signals the transition from summer to fall. Moreover, American society is rife with contradictions created by major disconnects between ideals and reality on issues ranging from marriage, to sex education, to economic mobility. Halloween’s emphasis on duality and the inversion of traditional social customs, therefore, appeals to Americans caught up in these webs of contradictions because it effectively sanctions misbehavior and the inversion of “traditional” norms. In this respect, Halloween — at least temporarily — validates Milton’s famous line that it’s “better to reign in Hell than to serve in Heaven.”

Beyond the sanctioning of revelry, however, Halloween’s popularity in America also stems from its sheer marketability: it provides super-charged fuel for the capitalist engine. Bloomberg Businessweek recently reported that a whopping 66% of Americans — nearly 158 million people — celebrate Halloween, and they’ll spend an impressive $6.9 billion in the process. Americans, more than any previous world civilization, have demonstrated a remarkable talent for turning even the most culturally rich celebrations into a series of mundane monetary exchanges. So they have done with Halloween, turning an ancient pagan ritual into an excuse to buy mountains of costumes, candy, and decorations. By providing a limited time period for both the controlled inversion of social norms and the relentless stoking of the capitalist marketplace’s fires, Halloween has assumed a hallowed (see what I did there?!) role in American culture.

Of course, you can’t blame Americans entirely for commercializing Halloween. The holiday’s history made it ripe for this type of cultural appropriation by providing an excuse to let humankind’s many demons run wild once a year. Halloween’s roots can be traced back to the British Isles and the ancient Celtic celebration of Samhain, the New Year’s Day on the Celtic calendar.

Like modern-day Halloween, Samhain corresponded with the harvest, and thus served as a major yearly transition between the seasons that acknowledged the coming of winter. Samhain’s association with the death of crops and encroaching darkness made it rife with the symbolism of life and death. As folklorist Jack Santino observes, Samhain “associated the fruits of the harvest with ideas of the afterlife and the otherworld.”* Samhain was a time of transition, when the veil between earth and the spirit world was thinnest. On Samhain Eve, the Celts lit bonfires and laid out harvest gifts for the travelling souls of the dead passing through the corporeal plane on their way to the next realm. The association of Samhain with the dead lives on in Halloween’s celebration of ghosts and ghouls.

Ancient legends associated with Samhain also provided the template for the trick-or-treating that came to define Americans’ consumerist approach to Halloween. One such story described a hero named Nera who, while begging from door to door on Samhain, discovered a cave leading into the fairy realm. This story established the idea that Samhain was a time that permitted access to the otherworld. In another tale, a supernatural race called the “Fomorians” demanded tribute from Celtic mortals, who obliged by offering harvest fruits to these gods at Samhain.

As Santino notes, paying tribute to the gods echoed the folk custom of leaving out gifts for wandering spirits, a practice that was, in turn, recreated via the custom of mumming (stemming from the Danish word mumme: to parade in masks). In the practice of mumming, patrons gave food and drink to wanderers disguised to imitate spirits. Santino notes that “the ideas of the dead wandering the earth begging food and the giving of food and drink in tribute and as payment to wandering spirits” created the template for contemporary trick-or-treating.*

A traditional Irish Samhain turnip jack-o’-lantern. Creepy, ain’t it?

Early in the fifth century, Christian missionaries came to the British Isles and tried to transform the pagan ritual of Samhain into a Christian holiday. Missionaries branded Samhain’s supernatural entities as elements of the Devil. Fairies became fallen angels; the wandering dead became more malicious; the Celtic underworld became the Christian Hell, and followers of pagan beliefs were branded as witches. Yet, even after the Catholic Church established November 1 as All Saints Day (also known as All Hallows, or sanctification) and November 2 as All Souls Day, the old pagan traditions lived on. People continued to pay tribute to the wandering dead on All Hallows Eve by setting out food and drink. All Hallows Eve was also referred to as All Hallow Even or Hallowe’en, and is still celebrated with many of the old customs intact.*

The All Hallows Eve tradition of masquerading as spirits, when combined with old Celtic traditions of hollowing out fall vegetables and illuminating them with candles to ward off the dead, provided the right combination that allowed Americans to transform Halloween into a pageant of mischievous masked revelry and orgiastic consumerism.

In the nineteenth century, the development of a more urbanized market economy facilitated a growing urban/rural divide that still exists today. As more Americans moved off of the farm, harvest fruits increasingly came to serve as consumed representations of a lost rural past to be displayed in urban and suburban built environments. Santino cites perennial American fall trips to the countryside to buy pumpkins to carve into jack-o’-lanterns as symbolic of Americans’ transforming natural objects into modified ones. “Once transformed,” Santino writes, harvest fruits like pumpkins “are not strictly tied to the organic base and can be rendered in other media.”* In modern America, the rendering of Halloween harvest fruits into other media occurs in the form of the cavalcade of Halloween decorations, candy, and mass-produced costumes. Halloween is highly suited to American capitalism because it provides an irresistible mixture of seasonal nostalgia and pagan masquerade traditions.

These masquerade traditions, which date back to the ancient Celts, fuel an endless push to produce increasingly elaborate Halloween costumes for an American public that simply can’t wait to buy them. It’s no coincidence that Halloween’s explosion into a commercial holiday largely corresponded with the American post-World War II economic boom: Americans had more money to spend on holiday frivolities. Costumes, of course, are big among kids looking to maximize both their glucose intake and their dental bills as trick-or-treaters. But costumes are also popular among American adults looking to use Halloween’s traditional blurring of the realms of light and dark — of the living and the dead — to dress up in all manner of ridiculous outfits and carry on in carnival-style revelry each October. Halloween has become so popular among adults that Forbes magazine accused grown-ups of “hijacking” the holiday from kids.

At Halloween, costumed adults can embrace a range of normally taboo subjects such as death, sexual freedom, and every imaginable type of hedonism. Thus, Halloween, however fleetingly, creates an environment welcoming to would-be American libertines, and these normally constrained adults are willing to spend big money to achieve such temporary moments of costumed euphoria that symbolically invert traditional social norms.

Halloween’s many dualistic traditions (the contrasts between living and dead, mortal and immortal, summer and fall, wicked and angelic, rural and urban) have, in many ways, turned it into the quintessential American holiday. It mixes the ingredients of nostalgia, repressed urges, and hedonism into a potent witches’ brew that fuels American consumerism every October. Perhaps it’s unfortunate that Americans have turned the ancient tradition of Samhain into yet another excuse to go shopping, but such is the way of the modern world. So whatever your age, go carve a pumpkin, dress up like a ghoul, and say “hello” to the dead while you’re at it: after all, Halloween is the only time of year when the dead are the life of the party.

 * See Jack Santino, “Halloween in America: Contemporary Customs and Performances,” Western Folklore 42 (Jan., 1983): 5-8, 15-16. 

The Ugly History of “Makers vs. Takers” Rhetoric

This is not a good way to debate human social organization. It’s just not.

During the 2012 presidential election, Republican nominee Mitt Romney made some remarks that may have sunk his candidacy. This was nothing new for the perennial presidential candidate. After all, the guy is about as charismatic as a brick wall and has changed his political positions so often over the course of his public career that “foot in mouth disease” likely runs in his bloodline. But the comments to which I’m specifically referring were his infamous “47 percent remarks,” delivered on May 17, 2012 in Boca Raton, Florida to a table of chair-straining plutocrat donors. The remarks were, of course, captured on hidden camera by bartender Scott Prouty.

Romney’s remarks effectively divided the U.S. into two populations: the supposedly hard-working, usually rich, and always self-unaware “makers,” and the 47 percent of welfare-addicted “takers” who allegedly rely on government redistributive policies to siphon wealth from the “makers.” The full text of Romney’s remarks can be read here, but the “47 percent” spiel went as follows:

“There are 47 percent of the people who will vote for the president no matter what,” Romney says in the video. “All right, there are 47 percent who are with him, who are dependent upon government, who believe that they are victims, who believe the government has a responsibility to care for them, who believe that they are entitled to health care, to food, to housing, to you-name-it. That that’s an entitlement. And the government should give it to them. And they will vote for this president no matter what … These are people who pay no income tax.”

Besides the unmitigated chutzpah of a son of a multimillionaire arbitrarily chastising some amorphous mass of U.S. citizens for not working hard enough while claiming that he himself had “inherited nothing,” Romney’s comments echoed a familiar idea, popular among the American Right, that crudely divides human society into camps of either productive workers or useless parasites. In recent years, this idea has been promoted in pseudoscientific right-wing literature, is routinely promulgated by utopian-craving Libertarian circle-jerk centers like Reason.com, and is spewed out by columnists like Wall Street Journal fungus-sprout, and privileged son of the affluent Chicago suburbs, Stephen Moore.

Such a simplistic division of humans into opposing “productive” and “worthless” camps, however, is nothing new. In fact, this odious approach to social organization is rooted in 19th century pseudoscientific racial thinking. The idea of “makers vs. takers” influenced the social trajectory of modern western history and, when taken to its extremes, it has provided the intellectual justification for slavery, eugenics, and, in the worst case scenario, the Holocaust. Lest that last point strike you as hyperbolic, I thought I’d take some time in this post to highlight some past examples of “makers vs. takers” arguments as revealed in some good ol’ fashioned primary source documents. These texts can help demonstrate why the “makers vs. takers” argument is despicable and dangerous.

19th century philosopher Herbert Spencer. At least his impressive chops were the fittest.

Let’s begin with Herbert Spencer, the 19th century English philosopher, anthropologist, and all-around tool whose unscientific application of Darwinian natural selection to human societies led him to coin the term “survival of the fittest.” Spencer adamantly opposed 19th century “poor laws,” early types of state welfare, because he believed such laws took from the “strong” to give to the “weak.” Take, for example, this excerpt from Spencer’s work Social Statics (1851):

The poverty of the incapable, the distresses that come upon the imprudent, the starvation of the idle, and those shoulderings aside of the weak by the strong, which leave so many “in shallows and in miseries,” are the decrees of a large, far-seeing benevolence.

It seems hard that an unskilfulness which with all his efforts he cannot overcome, should entail hunger upon the artizan. It seems hard that a labourer incapacitated by sickness from competing with his stronger fellows, should have to bear the resulting privations. It seems hard that widows and orphans should be left to struggle for life or death. Nevertheless, when regarded not separately, but in connection with the interests of universal humanity, these harsh fatalities are seen to be full of the highest beneficence—the same beneficence which brings to early graves the children of diseased parents, and singles out the low-spirited, the intemperate, and the debilitated as the victims of an epidemic.

Spewing the racialist thought popular at the time, Spencer believed that some humans, like European whites, were inherently superior to others, like black Africans, whom he deemed naturally inferior. He thus divided humans into “weak” and “strong” camps, and justified the disease, death, suffering, and poverty experienced by millions as natural retribution for their inherent weaknesses. Spencer claimed that the good of greater humanity depended on such “harsh fatalities,” which were, in fact, of the “highest beneficence” to humanity in general. He opposed poor laws and welfare because he believed that such laws propped up weak, inferior takers at the superior makers’ expense.

Spencer, an early supporter of eugenics, advocated sterilization to eliminate the “unfit” parasites from the earth. In the 19th and early 20th centuries, eugenics was popular among both progressive and conservative thinkers, but Spencer’s Social Darwinian theories are still popular within contemporary right-wing circles, where his division of the human race into “fit” and “unfit” categories appeals to those inclined towards a “makers vs. takers” worldview. Indeed, it’s no coincidence that nearly all of Spencer’s writings can be accessed for free at the website of Liberty Fund, an Indiana-based Libertarian foundation.

Moving along from Spencer, let’s visit the antebellum South, where we’ll examine the famous 1858 “Mudsill Speech” delivered by pro-slavery apologist, and South Carolina senator, James Henry Hammond. Hammond was unquestionably one of the great scumbags of American history. In 1831, at age 23, he married a wealthy heiress named Catherine Fitzsimons, from whom he gained ownership of over 100 slaves. Hammond not only sexually abused his female slaves on multiple occasions, but also molested his own nieces, a practice he bragged about in detail in his own journal!

These actions stemmed from Hammond’s domineering worldview that saw women and blacks as tools for his pleasure. This idea informed his “Mudsill Speech,” through which he defended southern slavery against northern criticism by dividing society into a racial hierarchy of peon laborers and dominating owners:

In all social systems there must be a class to do the menial duties, to perform the drudgery of life. That is, a class requiring but a low order of intellect and but little skill. Its requisites are vigor, docility, fidelity. Such a class you must have, or you would not have that other class which leads progress, civilization, and refinement. It constitutes the very mud-sill of society and of political government; and you might as well attempt to build a house in the air, as to build either the one or the other, except on this mud-sill. Fortunately for the South, she found a race adapted to that purpose to her hand. A race inferior to her own, but eminently qualified in temper, in vigor, in docility, in capacity to stand the climate, to answer all her purposes. We use them for our purpose, and call them slaves.

Hammond emphasized that without a laboring “mudsill” class to do hard, manual labor without complaint, and for little compensation, civilization itself could not flourish. The existence of a permanent laboring class freed up enlightened geniuses like himself to marry rich women and pursue intellectual stimulation that would lead to cultural “refinement.” In Hammond’s racist time, better to have enslaved “inferior” blacks do the dirty work. For him and his ilk, “equality” was anathema to freedom, since the natural order of free society supposedly necessitated an “inferior” (read: black) class to provide for the economic and political security of a ruling (read: white) class. For men like Hammond, abolishing slavery entailed foisting a vast “taker” class of African-Americans onto the ruling “makers” who were busy building civilization.

James Henry Hammond: he really was a total jerk.

Echoes of Hammond’s “Mudsill Theory” reverberate in modern conservative ideas about “makers and takers.” This view of society provides those who identify themselves among the “makers” with a feeling of superiority over an allegedly idle class that refuses to pull up its collective bootstraps and embrace good old fashioned labor. No matter the pittance of compensation or carefully constructed barriers to economic advancement that may come with such labor, conservatives, like those at the Heritage Foundation, bemoan the supposed “erosion of our culture of work” because such an erosion allegedly creates a parasitic class that refuses to be “mudsills,” and instead leeches from the noble upholders of American civilized culture.

But by dividing society into “makers and takers,” conservatives come eerily close to consigning the human population into “worthy” and “unworthy” classes, a type of social division that provided the ideological justification for slavery’s domination of one human group by another. Pro-slavery ideologues like Hammond dehumanized blacks as unworthy of participation in the American experiment, and modern conservatives who blithely dismiss half the population as parasites by extension deny millions of their fellow citizens their basic human dignity as legitimate members of the body politic.

The repugnant, and possibly dangerous, consequences of viewing human society as made up of “makers” and “takers” stem from a long tradition of historical beliefs that sought to categorize the human race into various types of “worthy” and “unworthy” groups along lines of genetics, race, gender, and class. The Social Darwinism of Herbert Spencer still echoes in modern conservatives’ characterization of “welfare” as undeserved “handouts” to those unwilling to work on their own. Moreover, the notion of hard-working “worthy” and idle “unworthy” classes underpinned pro-slavery arguments that inequality was essential to the upholding of freedom for the “civilized” classes.

When you arbitrarily divide human beings into “productive” and “unproductive” groups, you inherently deem the so-called “unproductive” classes undeserving of social acceptance. Historically, this has been the first ideological step taken by those wishing to dominate other humans by controlling their labor or, in the worst case scenario, eliminating them altogether. Those who label their fellow humans as “takers” equate them to parasites, and parasites must be exterminated.

Historically, this view, taken to its utmost extremes, has resulted in the genocide of Native peoples in America, Jews in Nazi Germany, and the Tutsi people in Rwanda, to name but a few examples. It’s no coincidence that 19th century American Indian Bureau officials sought to “make labor honorable and idleness dishonorable” among Indians who would otherwise starve to death following the confiscation of their hunting grounds by whites. It’s also no coincidence that the Third Reich railed against “parasitical Jews,” and that Rwandan Hutu death squads viewed Tutsis as “treacherous speculators and parasites.”

Dividing human beings into simplistic camps of “makers” and “takers” implicitly dehumanizes one group while empowering the other. The historical baggage that comes with such divisions is not something that should be exhumed from the graveyard of discarded human ideologies. Political differences are fine, and should be recognized, but let’s not lose sight of basic human dignity in the process.

Liberty and Security Forever? The Very Long Surveillance State Debate

Rogue NSA whistle-blower Edward Snowden: whether hero or villain, he’s a symbol of over 200 years of American debate over balancing liberty with security.

Are you free? No, seriously, are you really free? Do you feel that your government is protecting you from terrorists? If so, how much power should the government have to protect you from harm? Furthermore, how much power should the government have to protect itself from harm, and should you play a role in defending the state that affords you liberty and protection? These types of questions are actually quite difficult to answer if you really dig into the details of what duties and obligations come with being an American citizen.

Maintaining the supposed existential balance between liberty and security has been a hot topic in U.S. political discourse in the 12 years since the September 11, 2001 terrorist attacks, but it flared up significantly in the summer of 2013 when former National Security Agency (NSA) contractor Edward Snowden leaked classified documents to the media that detailed the NSA’s PRISM program. PRISM collects information from the servers of internet companies such as Facebook, Google, and YouTube. Combined with the agency’s other methods of collecting signals intelligence from internet companies and various communications infrastructure, PRISM effectively gave the NSA the power to conduct warrantless surveillance on American citizens in the name of rooting out potential terrorists.

Snowden’s revelations caused an uproar among civil liberties advocates from all sides of the political spectrum, who claimed that the NSA’s programs directly violated constitutional rights to privacy. In response, most of the reigning political class, regardless of party, defended the NSA programs as necessary to protect American interests against terrorism. Sen. Dianne Feinstein (D-CA), chairman of the Senate Select Committee on Intelligence, claimed that “it’s called protecting America,” while House Republicans, including the chairman of the House Committee on Intelligence, Mike Rogers (R-MI), held a hearing in June to defend the NSA programs as essential to national security interests. President Barack Obama has also defended the NSA, leading to accusations that he was carrying out “George W. Bush’s Fourth Term.”

The debate between liberty and security, however, is as old as the American republic itself, and presumes a possible balance between the two ideals that may be impossible to achieve. Supporters of the right to privacy against government intrusion often invoke Benjamin Franklin’s famous quote: “Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.” The quote has been floating through the internet ether for years, but it’s never contextualized. A while back, Brookings Institution fellow Benjamin Wittes researched the origins and context of Franklin’s quote. He found that it stems from a 1755 letter likely written on behalf of the Pennsylvania Assembly to protest the colonial governor’s refusal to let the Assembly tax Penn family lands to raise cash to defend the Pennsylvania frontier from French and Indian attacks. Thus, the colonial governor was a little too cozy with the Penns, and, as Wittes observes:

Franklin was writing not as a subject being asked to cede his liberty to government, but in his capacity as a legislator being asked to renounce his power to tax lands notionally under his jurisdiction. In other words, the “essential liberty” to which Franklin referred was thus not what we would think of today as civil liberties but, rather, the right of self-governance of a legislature in the interests of collective security.

So Franklin was complaining about the governor denying the PA Assembly’s rights as representatives of the people to raise money to defend the people’s interests. This is, of course, quite different from how the quote is used to comment on current “liberty vs. security” debates. You can read Wittes’ longer paper for Brookings, which elaborates a bit more on Franklin’s quote, here. I’m addressing the quote in this post because it’s generally used to support a black-and-white idea positing an attainable, harmonious balance between liberty and security.

Alexander Hamilton’s Federalist Party says you are an enemy of the state, you Jeffersonian scum!

Wittes thinks that no such balance is possible, and suggests instead that we consider liberty and security as “existing in a kind of ‘hostile symbiosis’ with one another—that is, mutually dependent and yet also, under certain circumstances, mutually threatening.” I think that this idea makes a bit more sense with regards to the liberty/security debate, especially when we consider two incidents in U.S. history, the Alien and Sedition Acts of 1798 and military conscription during World War I, in which government power was controversially used in the name of protecting the country. Both incidents demonstrate that regardless of the reasons for alleged excessive government power in the name of security, such acts will always be simultaneously accused of inhibiting and upholding American freedoms.

One of the first major public flare-ups of the liberty/security debate occurred in 1798 when, amidst an undeclared naval war with revolutionary-era France, the Federalist-controlled Congress passed four wartime laws collectively known as the Alien and Sedition Acts. These laws were designed both to thwart French influence in the U.S. and to smite the Federalists’ political enemies, the Jeffersonian-Republicans, whom Federalist leaders like Alexander Hamilton accused of harboring French anarchist sympathies. The first three laws empowered the president to detain and deport any suspected enemy aliens during wartime and extended the U.S. naturalization process from 5 to 14 years. The fourth law, the Sedition Act, threatened jail time and fines for anyone caught writing, publishing, or possessing “any false, scandalous and malicious writing or writings against the government of the United States.”

President John Adams never made use of his power to deport suspected aliens, but under the Sedition Act, 14 Republicans, primarily journalists, were prosecuted for criticizing the government. Republicans howled, with good reason, that the Alien and Sedition Acts violated the First Amendment. In response, Thomas Jefferson and James Madison wrote up the Virginia and Kentucky Resolutions, which argued that the Federalists’ laws gave the federal government unconstitutionally excessive powers, reaffirmed the role of states’ rights, and asserted the right of states to nullify federal laws.

Although the controversy over the Alien and Sedition Acts helped the Jeffersonian-Republicans defeat the Federalists in the 1800 election, the Virginia-Kentucky Resolutions proved controversial in their own right when southerners later used them to defend states’ rights to uphold slavery and multiple courts declared the theory of “nullification” to be unconstitutional.

More than a century after the Alien and Sedition Acts, another long-controversial issue, conscription, again aroused debates over the balance between liberty and security. In order to field troops for American entry into World War I, Congress passed the Selective Service Act in 1917 to raise a national army via the draft. Like previous American drafts, the 1917 draft proved controversial because it often disproportionately affected the poor, who were considered more expendable as soldiers. Proponents of the draft in both the Democratic and Republican parties defended it as necessary to make those who benefitted from American freedoms defend those freedoms abroad. Opponents of the draft, by contrast, criticized it as an unconstitutional affront to personal liberty.

A U.S. draft poster from World War I. Millions of Americans resisted the draft despite the imminent threat posed by European ape men.

Resistance to the draft was particularly strong in the South, where a generation of southerners raised on Populist agrarian radicalism rejected the Great War as a tool to serve the financial and industrial interests of northeastern elites. As Jeanette Keith observes in her excellent book Rich Man’s War, Poor Man’s Fight: Race, Class, and Power in the Rural South during the First World War, “28 percent of the nation’s deserters came from the states of the former Confederacy,” and, “if southern men refused to even register at a rate that reflected their proportion of the national population,” then a half-million never even signed up for the draft.*

Keith calls World War I “the birthplace of the American surveillance state.” Through laws like the Espionage Act, the Sedition Act, and the Trading with the Enemy Act, “Congress effectively criminalized antiwar speech.”* Southerners’ evasion of the World War I draft spurred the federal government to implement widespread domestic surveillance in the South in the name of rooting out anti-war traitors. The Bureau of Investigation’s spy network targeted regular southerners in addition to pacifists, leftists, Wobblies (Industrial Workers of the World union members), and African-Americans. Keith explains that the Bureau of Investigation’s agents in the rural South “spent a lot of time tracing down antiwar talk and threatening dissenters until they promised to shut up. The bureau used fear to suppress dissent.”*

In 1798 and 1917 the federal government obviously lacked the infrastructural capability to reach modern NSA-level spying capabilities, but the delicate balance between liberty and security was the core issue that arose with the Alien and Sedition Acts and the WWI draft.

Debates in 2013 over the NSA’s spy network continue this tradition of trying to find some kind of balance between liberty and security, but such a balance may be, by nature, unattainable. As Benjamin Wittes notes, “some surveillance…is destructive of freedom. But sometimes, the relationship between surveillance and liberty is symbiotic—that is, increasing government surveillance powers can actually be liberty-enhancing.”* The Alien and Sedition Acts and the WWI surveillance laws were, for the most part, unconstitutional. But when the draft has been implemented for “good wars” like World War II, Americans have reinterpreted it as a “liberty-enhancing” part of patriotic duty. After all, we now call the World War II generation the “Greatest Generation,” even though they were a generation conscripted to fight for the nation’s greater good at their own expense.

Like the draft, how Americans feel about NSA surveillance largely depends on where we draw the line with regard to the invasion of privacy, the reach of federal government power, and the legitimacy of the stated security goal. Many supported the World War II and early Cold War drafts in the name of defeating the twin evils of Fascism and Communism. Attitudes toward the draft turned sour, however, when it was employed to wage a perceived “unjust war” in Vietnam. Likewise, government power to spy on U.S. citizens has been criticized when used by megalomaniacs like J. Edgar Hoover, but defended in the name of upholding national security interests against international terrorism.

Whether or not the NSA’s surveillance powers are an unconstitutional affront to Americans’ freedoms will be a major subject of debate for years to come. Although the NSA programs have been credited with foiling multiple terrorist attacks, the NSA has come under deserved scrutiny for spying on American citizens in violation of its stated rules against doing so without justified suspicion.

Government power should always be viewed with a measure of suspicion, but government power can also serve a purpose: after all, the modern United States is a vast country, with a vast population and massive financial, industrial, and military interests that simply cannot be adequately protected by a technologically neutered state. Despite my seemingly infinite wisdom, I don’t have an easy answer for where the line between liberty and security should be drawn, but perhaps that’s because looking for such a line is a futile exercise. Better to recognize that the two interests are instead “mutually dependent” and “mutually threatening,” depending on the (always complicated) circumstances.

* See Jeanette Keith, Rich Man’s War, Poor Man’s Fight: Race, Class, and Power in the Rural South during the First World War (Chapel Hill: University of North Carolina Press, 2004), 2, 11-12.

* See Benjamin Wittes, “Against a Crude Balance: Platform Security and the Hostile Symbiosis Between Liberty and Security,” Brookings Institution Project on Law and Security, 4.

Why Third Parties Just Don’t Work in America

A 1904 campaign poster for candidate Tom Watson of the “People’s Party,” also known as the “Populists.” They didn’t last long, though some of their policies did. Also, Watson turned into a xenophobic, racist nutball.

Why can’t the United States muster the will to create a viable third party to challenge the calcified, shame-immune, institutional bureaucrat incubation pits known as the Democrats and the Republicans? Throughout American history many idealistic souls have longed for a third-party alternative to the ensconced two-party system, and, despite a few fleeting exceptions, they have been sorely disappointed. The American tradition of mass democratic politics has historically combined with structural limitations within the country’s governing institutions to make third-party movements akin to knocking on Mordor’s gates and hoping to be let in with a wink and a smile. Yes, one does not simply start a third party in America.

These facts, however, have never stopped Americans of all backgrounds and political persuasions from advocating for a third party. Over at Time’s Swampland blog, Joe Klein is merely the most recent Prospero conjuring a political tempest in hopes that a third party will wreck onto American shores and shake up the system for the better, and he seems to think such a party is possible in 2014. Citing the candidacy of New Age guru Marianne Williamson, who is running to unseat California’s longtime incumbent Democratic congressman, Henry Waxman, Klein sees a third-party image on the horizon that may prove to be more than a mirage:

Could Williamson be the harbinger of a wave of Independent candidacies in 2014? Are people so sick of the two existing parties that they’re ready to go shopping for something new? “We’re seeing this all over our polling,” says Peter Hart, who does surveys for NBC and the Wall Street Journal. “People are sick of the status quo: 60% believe that the entire Congress should be replaced. They’re looking for alternatives.”

Klein is right to point out that Americans really seem to want a third party. The Gallup poll he cites led the Washington Times to recently declare the rise of “third-party fever,” claiming that more than ever, Americans want more political options. I have no doubt that they do. Heck, I’m one of those who want to move beyond the bifurcated nest of incumbent Morlocks currently clogging up the political pipes. But ideals do not a reality make. Americans have always wanted more party representation, but, in general, they never get it. Klein himself recognizes this fact, admitting that “I’ve been skeptical about 3rd parties in the past. The best of them–the Populists, Ross Perot (at least when it came to budgetary matters)–tend to have their hot ideas co-opted by the Democrats or Republicans.” As he notes, the idea of “co-option” explains America’s historical dearth of third parties, and why that dearth will likely continue.

America’s small “r” republican tradition of mass politics — especially since the early nineteenth century — created an environment through which various political platforms, ideas, and concepts could be introduced freely into public discourse and, therefore, be easily co-opted and absorbed by different political players. When taken in tandem with the basic mechanics of how the American political system is structured, you get a recipe for two-party blandness. As sociologist G. William Domhoff meticulously explains, America’s political system is based on districts and pluralities, rather than on proportional representation. This limits the ability of multiple parties to compete for representation and discourages the type of party coalitions common in parliamentary democracies. The election of American presidents via a separate national vote, as opposed to the parliamentary system of a victorious party choosing its leader, further dilutes third-party options.

Americans wishing to change the system to reflect proportional representation, Domhoff writes, will run smack into Article V of the Constitution, which states emphatically that “no State, without its consent, shall be deprived of its equal suffrage in the Senate.” Americans’ need to ensure that less populous states receive equal — often greater — representational clout means that a parliamentary system and, by extension, greater party variety, isn’t going to happen.

But, as I already noted, it isn’t just the American system’s design that has constantly thwarted third parties; it’s also America’s culture of mass democratic politics, which has allowed third-party ideas to be absorbed, co-opted, and reclaimed in a system that already favors big-tent style political organizations, not fractured micro-movements. The fate of two famous American parties, the Whigs and the Populists, demonstrates why third-party movements just don’t gain much traction in a political culture as incestuous and consolidation-prone as that of the United States.

A Whig Party banner from 1848. Candidate Zachary Taylor won the presidency.

The Whigs were not a proper third party; in fact, they were, for a while, one of the two dominant American political parties, but their demise shows the power of American party consolidation. The Whigs’ political lineage dated back to Alexander Hamilton and the Federalists and reached its zenith under the stewardship of compromise-prone Kentucky politico Henry Clay. Under Clay’s “American System,” the Whigs touted a nationalistic platform via federally subsidized infrastructure development, a national bank, and economic protectionism. For their troubles, they put four presidents in the White House and popularized a political theory that remains a vital part of contemporary American discourse. But two primary developments, the debate over slavery and increased immigration, eventually killed off the Whigs by the mid-1850s and made way for the Republican Party’s rise to national prominence.

Originally a national party with strength in the North and the South, the Whigs began to fracture over the issue of slavery in the territories. In the decades following the passage of the Missouri Compromise in 1820, the party had gradually been splintering along pro- and anti-slavery lines. This divide came to a head following the Kansas-Nebraska Act of 1854, which repealed the Missouri Compromise line and opened up the western territories for possible pro-slavery settlement under the banner of popular sovereignty. Northern anti-slavery Whigs opposed Kansas-Nebraska, while southern pro-slavery Whigs, incensed at their northern party brethren’s stance on slavery, migrated to the Democratic Party.

Immigration, especially that of Irish Catholics who, by the 1840s, were arriving in waves in the Northeast’s major population centers, also contributed to the Whigs’ demise. Fears of a devious “Papist” invasion of the still largely Protestant U.S. gave rise to the nativist, anti-Catholic Know Nothing Party. The Know Nothings threatened to attract disgruntled Whigs infected with the fever of nativism until another incipient party, the Republicans, used fears of the southern “Slave Power” to create a coalition of anti-slavery Whigs, nativists, and Democrats that finally locked the door on the Whig mausoleum. As historian William Gienapp writes in his classic book The Origins of the Republican Party, “like the Slave Power, the Catholic Church seemed a threat to liberty, and Republican rhetoric often linked the two by warning of the dangers they posed to cherished American ideals.”* Thus, the Republican Party was able to co-opt multiple, fractured political movements into an effective big-tent party that exists to this day.

In contrast to the Whigs, the People’s Party, more commonly known as the Populists, was a true third party. The Populist movement grew out of late nineteenth century discontent among southern and western farmers who complained of the high cost of agricultural equipment, the practices of corrupt railroad companies that overcharged small farmers while coddling big businesses, and monetary policies that encouraged endless debt. In order to lobby the state to address their grievances, an alliance of farmers formed the Populist Party in 1892. Their “Omaha Platform” called for inflationary currency, government-backed subtreasuries, a graduated income tax, and state ownership of the railroads.

The Populists filled a vacuum that challenged the entrenched power of the Republicans and Democrats, but eventually fell prey to co-option by those very same parties. The white supremacist Democratic Party played on southern white farmers’ fears of racial integration to discourage any Populist alliance between blacks and whites. This racial demagoguery drew many farmers out of the Populist fold and into the Democrats’ bigoted arms. One of the most famous Populists, Georgia’s Tom Watson, advocated racial cooperation before succumbing to a delusional fit of bile-soaked race-baiting, leaving a legacy so rotten that a statue of him outside the Georgia state capitol is now being removed.

In addition, the Populists were internally divided over whether or not they should fuse with the two powerful major parties, which held the political clout to pass laws. The issue of “fusionism” eventually killed the Populist Party. In 1896, Democratic Party presidential candidate William Jennings Bryan co-opted much of the Populist platform before losing to Republican William McKinley. While the Populist movement was dead by the turn of the century, its legacy survived in the form of the federal income tax, a national bank, federal regulation of railroads and farm credit, and the direct election of senators — all former Populist positions that the two major parties eventually co-opted and made law. The Populists, like other American third-party movements, couldn’t survive being absorbed by the major party sponges.

Ross Perot: The independent candidate for president in 1992 who just couldn’t finish.

The co-option legacy that killed the Whigs and the Populists has resurfaced whenever third parties threaten to challenge the two-party system in America. Diminutive Texas billionaire Ross Perot, the independent presidential candidate who garnered 18.9 percent of the national popular vote in 1992 by offering voters a country-fried mish-mash of liberal and conservative positions, eventually watched Democrats and Republicans co-opt his anti-debt, balanced budget platform. In 2000, Green Party candidate Ralph Nader siphoned just enough votes away from human flag pole Al Gore to help put George W. Bush in the White House. This scared the hell out of Nader’s liberal supporters, thereby pushing them back into the corporate Democratic Party fold.

U.S. history shows that while there’s always potential for third-party movements to gain varying levels of steam among an electorate fed up with only two political options, the mass marketplace of American political discourse has consistently drawn third-party ideas into the major parties’ gaping maws. As the Whigs, the Populists, and Ross Perot discovered, American mass democracy, when combined with a political system that is structurally hostile to multiple-party growth, creates a perfect storm that assures the continued dominance of the very thing that most Americans say they just can’t stand. So dream all you want, folks; in the end, if you’re a Tea Partier, you’ll vote Republican, and if you’re a bleeding heart Hippie, you’ll vote Democratic. It’s the American way, unfortunately.

* William Gienapp, The Origins of the Republican Party, 1852-1856 (New York: Oxford University Press, 1987), 372.

Slavery’s Legacy: Why Race Matters in America

A protester at a Tea Party rally holds a sign demonstrating the continued importance of slavery’s legacy in U.S. political discourse. Note: this is how NOT to have a “conversation about race.”

What does it take for that contradictory, opinionated, but not always informed, ethnically amorphous mass of sputtering, super-sized humanity known collectively as the American public to have an honest conversation about race? Heck, what does the phrase “conversation about race” even mean? Henry Louis Gates, esteemed Harvard professor of African-American history, thinks it’s utterly meaningless, and that talking about race means recognizing how race is intertwined with U.S. history. In an interview for Salon, Gates emphatically states that “since slavery ended, all political movements have been about race.”

This is a statement that, on its face, seems provocative. Indeed, American conservatives have for years made hay out of the idea that since slavery ended a century and a half ago, Americans, especially American liberals, need to get over it, move on, and embrace what they see as a majestic, benign American exceptionalism. Meanwhile, liberals, a group admittedly known for their propensity towards excruciating self-analysis and hand-wringing neurosis, have, in recent years, attempted to justify the value of “white guilt.” White guilt constitutes the nagging feeling that modern enlightened white people need to somehow make amends, if only through the process of realization, for their ancestors’ racist treatment of African-Americans, Native-Americans, and other minorities. Whereas white conservatives often don’t worry about white guilt, given their tendency to intentionally simplify history to the point of obtuse neglect, white liberals seemingly need to qualify every discussion about race by apologizing for the sins of the past.

Contemporary Americans of all backgrounds still struggle with the issue of race — how to define it, how public policy should reflect it, etc. — because the country’s history has largely been a painful process of self-reflection and self-denial about how race affects all aspects of American life. In his Salon interview, Gates suggests that in order to meaningfully talk about race, honest reality needs to trump feel-good triumphalism:

I’m talking about the economic role of slavery in the creation of America. The fact that the richest cotton-growing soil happened to be inhabited by five civilized tribes, what they called themselves, and that had to be removed and/or exterminated for the greatest economic boom in American history to occur. The Trail of Tears, the cotton boom from 1820 to 1860. I’m not talking about politically correct history, I’m talking about correct history.

As Gates notes, the cotton boom, among the defining events that shaped 19th century American history, happened because of slavery. Cotton was valuable. Slaves were valuable. Cotton needed to be harvested. Slaves could harvest it. The southern cotton belt was inhabited by native tribes, so those tribes had to be expelled to make way for plantation slavery. Plantation slavery drove the demand for more slaves. The presence of these slaves caused the Civil War. These are neither sugar-coated apologetics for American exceptionalism, nor full-throated demonizations of the American past. Rather, they are plain historical facts, and Americans have been trying to deal with them ever since.

Harvard historian Henry Louis Gates understands that talking about race means learning about America’s often sordid racial past.

Gates gave the Salon interview in part to promote his new PBS series “Many Rivers to Cross,” which covers the 500-year-long black historical experience in America. Slavery, of course, played a tragic but all-important role in that experience, for both black and white Americans. When Gates says that “since slavery ended, all political movements have been about race,” he is not claiming that a literal debate over race and/or racism is the sole driving element of over a hundred years of American political debate. What Gates is suggesting, I think, is that slavery was such an integral institution in American life from the Constitutional Convention to the Civil War that it branded the burning legacy of racial tension and discord into the American body politic. The scar from this racial branding has yet to fully heal.

The central role of slavery in founding the American republic has been well-documented by historians. In his book Slavery and the Founders: Race and Liberty in the Age of Jefferson, noted constitutional historian Paul Finkelman documents the extensive role slavery played in shaping the Constitutional Convention and, therefore, nearly all of U.S. history. Five provisions in the Constitution explicitly protected slavery. The famous three-fifths clause counted three-fifths of all slaves for the purpose of boosting southern representation in Congress. Other provisions upheld the legality of the slave trade until 1808, when the Constitution could first be amended on the issue, ensured that slaves were taxed at three-fifths the rate of whites, and prohibited the states from emancipating fugitive slaves.*

As Finkelman documents, the Constitution was, for all intents and purposes, a pro-slavery document. Southern delegates at the Constitutional Convention demanded protections for slavery in exchange for its ratification, and the resulting compromise between northern and southern delegates sanctioned slavery in the country’s founding document. This compromise over slavery, however, haunted the United States for decades. By the mid-19th century, the rise of abolitionism spurred a stringent southern pro-slavery defensiveness, and arguments over the federal government’s constitutional role with regards to limiting the spread of slavery erupted into Confederate secession and Civil War.

The would-be Confederate States of America was founded on slavery. Mississippi’s Declaration of the Immediate Causes of Secession makes this abundantly clear:

Our position is thoroughly identified with the institution of slavery– the greatest material interest of the world. Its labor supplies the product which constitutes by far the largest and most important portions of commerce of the earth…a blow at slavery is a blow at commerce and civilization. That blow has been long aimed at the institution, and was at the point of reaching its consummation. There was no choice left us but submission to the mandates of abolition, or a dissolution of the Union, whose principles had been subverted to work out our ruin.

But if the Confederate South fought to preserve slavery, the Union North was often ambivalent — if not outright hostile — to ameliorating the issue of racial subjugation that underpinned slavery. Plenty of abolitionists saw no contradiction in simultaneously believing that slavery was immoral and that blacks were inherently inferior to whites. This fact only further highlights the tragedy of a four-year conflict that reduced half of the country to a husk of decimation, death, and rubble and claimed the lives of over 600,000 — while leaving the lingering issue of racial equality unresolved. The Civil War is the defining event that shaped modern American identity, and it is impossible to separate the issue of slavery, and by extension, race, from that event.

Slavery’s legacy runs so deep in the American psyche because it was a highly personal institution first, and a major political and economic system second. The essence of American slavery was one of domination: the notion that because of differences in skin color, whites had the inherent right to subjugate blacks, deny them human freedom and dignity, and, through violent coercion, use their laboring bodies as commodities unto themselves. This is why contemporary debates over just how “bad” slavery really was are so utterly loathsome and beside the point. At its core, it was a system that gave one group of humans total domination over another group. Slavery denied the very notion of individual autonomy, regardless of how many lashings slaves received. Given the role of social dominance psychology in human behavior, when one group of humans gains power over another, they don’t easily give up that power.

The loss of this system of economic and social domination in 1865 necessitated, in the minds of white supremacists, other methods of racial control. The rise of the Ku Klux Klan during the Reconstruction era sought to continue the practice of violent racial subjugation that had underpinned slavery. Likewise, the decades-long wave of brutal lynchings that swept the South and parts of the North during the era of Jim Crow combined with laws relegating blacks to second-class citizenship to assure white Americans that even with slavery long gone, African-Americans would still “know their place.” During the Civil Rights era, southern white supremacists carried on this crusade against “uppity blacks” via “massive resistance” to integration, while white northerners enacted de facto segregation through redlining and urban white flight. Despite these reactionary efforts, when blacks finally regained suffrage, it seemed that Americans might finally take steps to leave the ghost of slavery and race in the past.

Despite the real and positive social changes that have occurred in the thirty-plus years since the Civil Rights revolution, the smudge of race hasn’t been fully wiped from the American slate. Contemporary debates over gun control, affirmative action, welfare, voting rights, and, as the picture at the top of this post shows, tax policy continue to invoke racially charged feelings, even as the U.S. has elected its first African-American president and its national demographics are becoming more racially mixed. Such contradictions exist because even well-meaning Americans have difficulty reconciling idealistic concepts like “All men are created equal” with the system of American slavery and its legacy of racial conflict.

Many folks understandably wonder how the U.S. can be an exceptionally good country while harboring such an ugly past with regard to race relations, especially as embodied in the institution of slavery. That wonderment has inspired, for example, recent films that examine the memory of slavery from different standpoints.

Quentin Tarantino’s ultra-bloody, Django Unchained (2012) is one way to remember slavery…

Quentin Tarantino’s unabashedly ahistorical Django Unchained takes an eye-for-an-eye, revenge-fantasy approach to slavery. In Tarantino’s film, the titular slave Django (Jamie Foxx) rains down Dirty Harry-style retribution on white slaveholders in a truly gratuitous cinematic bloodbath, one that led commentator Hendrik Hertzberg to accuse Tarantino of using the memory of slavery as “an excuse for wallowing in sadism.” By contrast, Steve McQueen’s recent film 12 Years a Slave, based on the slave narrative of Solomon Northup, shows slavery’s horrors as everyday realities, not the stuff of action set pieces. PolicyMic’s Elena Sheppard even declared McQueen’s movie the “perfect answer to Tarantino”: a film that “takes the incomprehensible violence of slavery and personalizes it” in a way that is “gut-wrenching” and “palpable.”

These films embody two contrasting ways that many Americans try to reconcile the ugly history of American racism: it’s either something to be rejected forcefully, or it’s something that must be consistently examined and remembered in all of its injustice, violence, and degradation. Or, if you’re Rush Limbaugh, racism is something white people never engaged in. Nonetheless, the role of race in American society can’t be discussed in simple black and white terms (sorry, I couldn’t resist). Whether we like it or not, the legacy of slavery is central to American identity itself, even if its real effects remain controversial and painful to discuss.

How Americans of various political and social backgrounds remember and interpret their country’s history of slavery and racism should continue to be a major part of public discourse. This is why Henry Louis Gates, Jr. sees education about the legacy of America’s racial past as essential to achieving a society where “you could wear your ancestry, your sexual preference, your gender orientation, your religion, your color…without penalty.” As a Harvard professor who was wrongfully arrested for trying to enter his own house while black, Gates knows better than most that such a society may be a pipe dream. But then again, that’s what some Americans used to say about an America without slavery. The best way to talk about race is to acknowledge its importance in shaping American history, learn from the mistakes of the past — and, for Pete’s sake, try not to repeat those same damn mistakes.

* See Paul Finkelman, Slavery and the Founders: Race and Liberty in the Age of Jefferson (New York: M.E. Sharpe, 2001), 7.   

Ted Cruz, “Bourbon Dave,” and the Legacy of Border Ruffianism

Ted Cruz, the junior Republican senator from Texas, likes to smite his political foes by angrily faux-filibustering. Because freedom.

The two-week-long, Tea Party Republican-engineered shutdown of the federal government is finally over. This week the Senate reached a deal that a politically battered House GOP reluctantly endorsed because it kicked the can of U.S. fiscal and political dysfunction down the road until December and February, when Republicans can again wage scorched-earth politics against all things Obama.

Meanwhile, the horse-race-junkie American political media has been focusing on the “winners” and “losers” of the shutdown. Most media outlets, save the hand-wringing experts at the Center for American Progress, have declared the Tea Party Republicans the losers, slinking off with tails between their legs: the victims of ideological rot and political miscalculation. Except for Ted Cruz. Indeed, the junior Republican senator from Texas — his term in the Senate barely a year old — was almost universally dubbed a political winner even though his fellow partisans were left with egg on their reactionary white faces.

Cruz was essentially the guy who engineered the shutdown, but he’s seen as a “winner” because he knows how to play politics: his antics of late have been one part ideology, three parts right-wing populist grifterism, and the political press has lapped it all up like a wino at a brewery, declaring Cruz a 2016 presidential contender. Cruz’s rise to prominence among conservative activists has been swift because he consistently tosses meaty political turkey legs to the slobbering ogres of the Tea Party base, who return the favor with generous campaign donations. In September, for example, Cruz’s nearly 21-hour filibuster-that-wasn’t-really-a-filibuster against Obamacare tingled the Tea Party’s collective inner thighs, despite the fact that it was mere political grandstanding that couldn’t stop the implementation of the health care law.

But Cruz’s apparent disdain for the traditional machinations of party governance (how to get stuff done, in layman’s terms) has earned him the ire of senior party colleagues like Tennessee’s Bob Corker and Senate Minority Leader Mitch McConnell by turning what should have been a tactical Republican fight against Obamacare into a purity test for right-wing ideology. By convincing the bloc of Tea Party Oompa Loompas within the GOP House caucus to reject a plan to keep funding the government — a plan initially backed by Speaker John Boehner and the Senate Republican leadership — Cruz stoked party infighting at a time when the GOP needed unity. While his “more conservative than thou art” shenanigans haven’t played well with the party bigwigs, they have nonetheless given the Texas senator plenty of press coverage, and even earned him the title of de facto “leader of the Republican Party.”

Cruz’s constant pandering to the hard-line conservative Tea Party wing of the Republican base in the name of self-promotion and hard-right ideology is hardly unprecedented in U.S. history. In the 1850s, another conservative southern senator, Democrat David “Bourbon Dave” Atchison of Missouri, embraced Cruz-style, play-to-the-base tactics in the name of keeping the Kansas-Nebraska territory open to slavery. Like Cruz, Atchison became the de facto political leader and spokesman of a hard-right faction within his party: extreme pro-southern, pro-slavery settlers from Missouri known as “Border Ruffians.” Atchison, nicknamed “Bourbon Dave” for his preference for booze as strong as his temper, rallied his Border Ruffian followers via his shrewd maneuvering in the Senate and his use of swaggering, right-wing populist rhetoric that would make even Ted Cruz blush.

David "Bourbon Dave" Atchison, a staunch, pro-slavery, booze-whiskey-soaked Missouri senator who pioneered a Ted Cruz style, non-traditional, hard line right-wing approach to politics.

David “Bourbon Dave” Atchison, a staunch, pro-slavery, booze-soaked Missouri senator who pioneered a Ted Cruz-style, non-traditional, hard-line right-wing approach to politics.

By the mid-1850s, a storm was gathering along the western border of slaveholding Missouri, the line that separated U.S. states from unorganized territory. As early as 1853, land-hungry settlers were pushing well beyond that border, spurring Congress to organize the region into the Nebraska territory. This raised the issue of whether the new territory would be free soil or slaveholding.

“Bourbon Dave” Atchison represented the conservative, southern-rights (read: pro-slavery) faction of the Democratic Party, a position that made him the enemy of his one-time fellow Missouri senator and fellow Democrat, the anti-slavery Thomas Hart Benton. But Atchison, like Ted Cruz today, wasn’t afraid to alienate fellow party members to serve conservative interests. The bawdy and profane “Bourbon Dave” vowed to see Nebraska “sink in Hell” before having it become free soil. Using his position as president pro tem of the Senate, Atchison demanded major political concessions in exchange for southern support of a free-soil Nebraska.

“Bourbon Dave” wrestled with fellow Democrat Stephen Douglas, the squat, hard-drinking, pugnacious Illinois senator. To please his caustic southern colleague, Douglas agreed to repeal the old Missouri Compromise of 1820, which had barred slavery from the territory above the 36°30′ line, and to split the territory into two sections, resulting in the Kansas-Nebraska Act of 1854. Atchison was hell-bent on making at least one of those territories into a new slave state, so he supported Douglas’ concept of “Popular Sovereignty,” under which settlers of the territories would decide for themselves whether slavery would be permitted in their lands. “Bourbon Dave,” representing slave-holding Missouri, hoped that pro-slavery settlers would flood into Kansas, making it a new southern slave state.

He was severely disappointed in that regard: by mid-1854, rifle-armed Free Soil advocates known as “Jayhawkers” began pouring into Kansas to claim it for freedom. An outraged Atchison responded by calling on all pro-slavery Missourians — the “Border Ruffians” — to invade Kansas and claim it for slavery instead. The resulting outbreak of violence between Border Ruffians and Jayhawkers became known as “Bleeding Kansas,” and “Bourbon Dave” helped open its political arteries.

In an apocalyptic 1856 speech to a group of Border Ruffians, Atchison, possibly aided by booze, rejected any traditional political resolution to the Kansas problem and instead called on his Ruffian army to wage war against the Free Soil settlers:

Yess, ruffians, draw your revolvers & bowie knives, & cool them in the heart’s blood of all those damned dogs, that dare defend that damned breathing hole of hell. Tear down their boasted Free State Hotel, and if those Hellish lying free-soilers have left no port holes in it, with your unerring cannon make some, Yes, riddle it till it shall fall to the ground.

Yes, I know you will, the South has always proved itself ready for honorable fight, & you, who are noble sons of noble sires, I know you will never fail, but will burn, sack & destroy, until every vistage of these Norther Abolishionists is wiped out.

Rough and ready pro-slavery Border Ruffians invading Kansas at “Bourbon Dave’s” urging.

In this speech, “Bourbon Dave” encapsulated the essence of Border Ruffianism: when conservative, southern, pro-slavery forces failed to achieve their goals in Congress or at the ballot box, they resorted to non-traditional, even extra-legal methods in the name of a reactionary right-wing political ideology. Those methods included violence, and the violence of Bleeding Kansas raged for years before the bloodshed between pro- and anti-slavery forces finally exploded nationally into the Civil War. Atchison’s heir apparent, Ted Cruz, a fellow conservative southern senator, is not advocating violence. But he is taking up “Bourbon Dave’s” mantle of extreme political obstruction by pandering to a small but ideologically fanatical right-wing base, and he’s willing to smite his own party in the name of a reactionary stance against Obamacare.

In 1856 “Bourbon Dave” proudly proclaimed, “This is the day I am a border ruffian!” to assume the leadership of slaveholders who felt ignored by a federal government that refused to recognize their human property in the territories. In a similar tone, Cruz claimed in his “filibuster” to be the populist voice of an ignored segment of America whose rights Obamacare violated:

A great many Texans, a great many Americans feel they don’t have a voice. I hope to play some very small part in helping provide that voice for them. I intend to speak in opposition to ObamaCare, I intend to speak in support of defunding ObamaCare, until I am no longer able to stand.

An analogy I have used before is, if your home is on fire, you put out the fire first before building an addition to the house. Likewise, with ObamaCare, I think ObamaCare is such a train wreck, is such a disaster that the first imperative is to stop the damage from ObamaCare.

Just as anti-slavery forces posed a liberal threat to conservative southerners’ right to dominate slaves and, by extension, the federal government, so too does Obamacare threaten modern conservatives and their corporate allies’ right to dictate policy in Washington and dominate low-income and middle-class Americans by denying them health insurance. Ted Cruz has brought Ruffianism back to the forefront of American politics by demonstrating a willingness to take his reactionary ideology to the rougher edges of political discourse and maneuvering. Like Atchison and his Border Ruffians, who waged a bloody war for slavery when their cause failed at the federal level, Cruz and his Tea Party followers have waged verbal and procedural war against a federal government that no longer toes their political line.

It’s perhaps fitting, and certainly symbolic, that Ted Cruz recently spoke to a Tea Party crowd, among whom was a guy waving the Confederate flag, outside the World War II Memorial in Washington. “Bourbon Dave’s” pro-slavery Border Ruffians eventually became the Confederate soldiers who fought for slavery and southern independence. Old Dave Atchison likely wasn’t on Cruz’s mind that day, but the symbolism was powerful, as a new southern Ruffian stood by the Rebel flag while denouncing the federal government and the Obama presidency. Like Dave Atchison rallying his future Confederate Border Ruffians to wage war against all things Abolitionist, Cruz was rallying the Tea Party faithful, stoking their war against all things liberal. No doubt “Bourbon Dave” was looking up from his barstool in Hell, nodding approvingly.