Monthly Archives: November 2013

Abe Lincoln, cross-dressing and the American way: The real history of Thanksgiving


It’s American Thanksgiving today, so to celebrate, I wrote a piece for Salon. Go check it out!


The “Knockout” Game, Race, and Fears of Urban Crime in American History


A lineup of scary, urban, 19th century criminals, from Martin Scorsese’s Gangs of New York (2002).

Crime and cities have always been close bedfellows in America. The sense that cities, in contrast to the countryside, are havens of delinquency and debauchery populated by the worst kinds of morally depraved low-lifes is a longstanding notion in American culture that remains potent in the twenty-first century, even when urban crime rates are at their lowest point in some 40 years. But whatever the current level of crime in American cities, the denser populations of urban areas, when combined with the natural human proclivity towards delinquent behavior, have ensured that the cultural meme of “cities as havens of vice” has remained perennially popular.

The latest manifestation of urban crime fears is the viral panic over the supposed “knockout” trend that is currently sweeping the internet. Reports have emerged from cities such as Pittsburgh, New York, and Philadelphia of the growing popularity of a depraved new game called “knockout” among groups of urban teenagers. As the New York Times reported, this game allegedly involves “young assailants…randomly picking unlucky targets and trying to knock them out with just one punch.” Essentially, the knockout game amounts to little more than a random, dangerous assault, since no reports of actual theft have emerged from these attacks.

Knocking someone out cold can actually be done without excessive strength, provided that the attacker clubs their unsuspecting victim in just the right spot on the head, and some of these attacks have been fatal. A homeless man in New Jersey named Ralph Santiago died after being struck from behind by a group of teens, and other victims have been seriously injured. Disparate reports of similar attacks in different U.S. cities have spread through the media, igniting a heated debate over whether these attacks are an organized criminal trend spread through social media or simply unrelated incidents of random urban assault. The New York Times, for example, questions if the viral spread of these attacks has created a modern-day urban myth, while other publications such as Slate, the Daily Beast, and USA Today have variously brushed off the knockout game as a “phony panic” lacking in sufficient data to identify a trend.

So what’s the deal here? Is the knockout game a real, disturbing national urban crime epidemic? Or is it little more than overhyped fear in which pattern-seeking humans have pieced together similar news items detailing outbreaks of the kind of spontaneous assaults that sometimes happen in big cities? In all likelihood, the answer is a potent mix of both. The knockout game does exist, but the idea that it’s a widespread, coordinated trend has all of the hallmarks of the type of urban crime myth that has long been popular in American culture.

Further, as Emma Roller of Slate notes, the fact that most of the knockout incidents appear to have been perpetrated by black teenagers against white victims has lent a decidedly racist tone to the whole story. Indeed, the cultural idea of the dangerous, urban black criminal is a longstanding American trope that goes all the way back to the Civil War. Of course, the usual right-wing race-baiters have used the knockout attacks as an excuse to promulgate fears of race war. They know they’re stoking old race fires, and they get paid to shovel in the coal.

Since the dawn of the Jacksonian era in the early nineteenth century, which saw the beginning of America’s long transformation from a primarily rural to a predominantly urban society, the city and its vices have been a source of potent social worry. As urban historian Paul Boyer observes, early American social reformers feared the specter of “urban decay” that sprang from cities’ dynamic demographic structure. “From the early 1800s on,” Boyer writes, “observers commented on the impersonality and bustle of urban existence, the lack of human warmth, the heedless jostlings of the free-floating human atoms that endlessly surged through the streets.”* The city’s dynamic structure led concerned reformers to conclude that urban life was a “volatile…deviation from a familiar norm” of the close-knit, morally upright, church-going, neighborly setting of the traditional American small rural town.

In response to concerns about drunkenness, crime, and general depravity in American cities, a collection of various reform organizations — including Bible and Tract societies, charity groups, the Anti-Saloon League, settlement houses, and the YMCA — sought to remake the social structure of urban spaces in the image of the morally righteous small town. These reform organizations, Boyer notes, promulgated an idea, still prevalent in contemporary American society, that cities were “seething cauldrons of licentious, brutalized creatures, contemptuous of morality, responsible to no one, owning no master but the lustful dictates of their own wicked flesh.”*

Americans today continue to view cities as havens of crime inhabited by “brutalized creatures” that are in need of a serious moral compass. Moreover, race continues to play a large role in shaping perceptions of urban social decay and violence. As the Village Voice reported in 2011, the more uncouth elements of the right-wing media complex — including websites like Urban Grounds and the ever loathsome Drudge Report — have made a cottage industry out of blaming African-American populations for crime in cities while extolling small-town (white) values as antidotes to urban (black) ills. The right-wing fringe repeats a common theme that there is something inherent in “black culture” or the “black race” that predisposes African-American urban youths towards criminal behavior like the knockout game.

Such a charge amounts to little more than pseudoscientific racial reductionism — an idea best left in the heap of past discarded social theories. Now, dismissing right-wing race-baiting does not mean that African-Americans can’t be criminals, of course. To put it bluntly: black people do commit crimes; they just don’t commit crimes because they’re black. Yet the perception that urban black Americans are predisposed towards crime is historically woven into the fabric of American culture.

In his book The Condemnation of Blackness: Race, Crime, and the Making of Modern Urban America, historian Khalil Gibran Muhammad traces the linkage of blackness with crime back to the aftermath of the Civil War and Reconstruction. In a nation where slavery was dead but white supremacy remained potent, “African American freedom fueled far-ranging anxieties among many white Americans” that, in turn, materialized in the idea of black criminality. The notion of black criminality spread via “national debates about the fundamental racial and cultural differences between African Americans and native-born whites and European immigrants.”* The idea that crimes committed by individual blacks are somehow representative of an urban “black community” stems from this period.


Americans have been primed to view images like this as the face of urban crime, but believe me, the gangs of white boys in 19th century Philadelphia and New York would take offense at the idea that only black kids can be scary.

During the late nineteenth century, white Northerners and Southerners reconciled their post-war differences by emphasizing their shared whiteness in contrast to a newly freed black population that whites believed was prone to criminal behavior. In this sense, the idea of black urban criminality became a circular, self-fulfilling belief that “became one of the most widely accepted bases for justifying prejudicial thinking, discriminatory treatment, and/or acceptance of racial violence as an instrument of public safety.”* Indeed, because whites believed that blacks were natural criminals, blacks became criminals in whites’ minds, and the belief that blacks were criminals justified racially discriminatory laws and social customs.

The nineteenth century association of blackness with crime, Gibran Muhammad writes, remains a powerful idea in twenty-first century American culture, hence the fear of the “black thug”-driven knockout game. This process of “racial criminalization” resulted in the “stigmatization of crime as ‘black’ and the masking of crime among whites as individual failure.”* To understand how this type of thinking works, put on your prejudiced cap just for a moment and ask yourself: “If white kids were perpetrating the knockout game, would I describe them as ‘white thugs’ or just ‘thugs’?” Of course, the idea that “white thugs” could somehow represent some hypothetical community of white people is absurd, but “racial criminalization” has resulted in violence committed by “black thugs” being taken as indicative of African-American culture. In America, white people get to be individuals, but every black person still has to represent a “black community.”

Of course, the idea that cities are more dangerous now, populated as they are with African-Americans, is historical wishful thinking. Heck, nineteenth-century American cities were infinitely more dangerous — or at least were perceived as such by contemporary observers — because they were rife with criminality, and the face of this criminality was usually white. Consider a few reports from nineteenth century Philadelphia.

Following the murder of a pedestrian, the Philadelphia Inquirer ran a December 1, 1864 report on the “Carnival of Crime” in the city at the hands of a “gang of ruffians” that “embraced every crime known to the criminal laws.” Assaults and murder were daily occurrences in the City of Brotherly Love. In September 1864, a “gang of boys” attacked an elderly woman, dragged her along the ground and broke her leg. In March 1865, a pedestrian was “badly beaten” by a “gang of rowdies” without cause. In August 1869, a night watchman was “brutally beaten” by a “band of assassins” to the point that his recovery was “doubtful.”* I could go on, but suffice to say that cities like Philly were filled with violent criminals, and those criminals were, more often than not, white.

Things got so bad in Philadelphia that a February 1870 editorial lamented the loss of the city to “hordes of ruffians” who committed acts of violence day and night with impunity. “Crime of every sort has grown frightfully familiar,” the editorial fretted, “murders are done in our most public thoroughfares and the assassins are let go free. Hanging for murder is as much a thing of the past in Philadelphia as in New York.” In nearly all of these reports, the criminals were listed as white, suggesting that cities like Philadelphia were always dangerous, and that African-Americans, shockingly, were not the only ethnic group involved in violent crime.


A standard depiction of the type of riotous crime that erupted in 19th century American cities. Yep, those are white guys in that picture.

New York, like Philly, also had its share of violent crime issues. In his 1872 book The Nether Side of New York, the journalist Edward Crapsey described a Big Apple overwhelmed by immigration, poverty, and corruption that became “the prey of thievery and debauchery.”* Similarly, in 1886, William Howe and A.H. Hummel, the authors of Danger! A True History of a Great City’s Wiles and Temptations, warned that “in a great city like New York, the germs of evil in human life are developed into the rankest maturity.” Howe and Hummel described a New York infested since the early nineteenth century with murderous gangs of “toughs” and “rowdies” that wreaked havoc in areas like the notorious “Five Points,” a depraved neighborhood depicted in Martin Scorsese’s 2002 film Gangs of New York. Those gangs, by the way, were white. Talk about your angry “white thugs,” right?!*

Historically, Americans have long feared cities as being havens of violent criminal debauchery. The fear of urban crime goes back to the antebellum era, when the transition from a primarily rural to a primarily urban country unleashed widespread concerns that the cities were places where traditional moral values go to die. On one level, such a fear was justified: as the above sources note, cities did have lots of violent crime. But the greater population density of urban areas when compared to small towns lent credence to the idea that cities were inherently crime-prone. In fact, urban crime stemmed from the same dark side of human nature that affected every American, whether they were country bumpkins or metropolitan street rats.

Moreover, despite an American cultural tendency to associate urban criminality with blackness that has resurfaced in light of the alleged knockout trend, both whites and blacks have long contributed to urban crime. The idea that urban black criminals speak for the general “black community” is a ridiculous notion: as ridiculous as saying that rural white meth dealers in “America’s Heartland” represent the general “white community” or that Michael Corleone represents all Italian-Americans.

So if you find yourself viewing videos of the knockout game and wanting to decry the supposed degeneracy of “black thugs,” step back for a moment and consider whether crime actually has a color. Historically, urban crime has been as multi-cultural and multi-colored as American cities themselves. Recognizing that the propensity for violence lies in all humans will go a long way towards reducing crime throughout the U.S., whether that crime be urban, rural, or everything in between.

* See Paul Boyer, Urban Masses and Moral Order in America, 1820-1920 (Cambridge: Harvard University Press, 1992), 4-5.

* See Khalil Gibran Muhammad, The Condemnation of Blackness: Race, Crime, and the Making of Modern Urban America (Cambridge: Harvard University Press, 2010), 4.

* See the Philadelphia Inquirer, “The Carnival of Crime,” December 1, 1864. “Youthful Depravity,” September 30, 1864. “Attacked,” March 6, 1865. “Another Midnight Assault Near Chestnut Street,” August 2, 1869. “The Contest Between Order and Disorder,” February 3, 1870.

* See Edward Crapsey, The Nether Side of New York (New York: Sheldon & Company, 1872), 9.

* See William Howe and A.H. Hummel, Danger! A True History of a Great City’s Wiles and Temptations (Buffalo: The Courier Co., 1886), iv, 6-11.

   

Busting the Filibuster: Some History Behind the Senate’s Reactionary Procedure


Nevada Democratic Senate Goliath Harry Reid has hurt the Right’s feelings by limiting their capacity for throwing tantrums.

Last week, Harry Reid, the Senate’s mousy, soft-spoken, bespectacled Mormon Majority Leader from the land of perpetual vice colloquially known as the state of Nevada, unleashed his inner Incredible Hulk. The normally mild-mannered — but politically shrewd — Reid opened up the ultimate can of senatorial whoop ass by invoking the so-called “nuclear option,” a procedural act in the Senate that disregards a century of precedent by voting to end a filibuster with a simple majority rather than requiring the traditional votes of sixty senators. Reid justifiably dropped this bomb in order to overcome years of Republican filibustering of President Obama’s executive branch nominees.

While the drooling, milquetoast, Beltway punditocracy has decried Reid’s use of the nuclear option as an affront to non-existent D.C. civility, the Majority Leader’s move was entirely justified. After all, in an unprecedented show of political obstructionism, the grunting collective of curmudgeonly Uruk-hai known as the Republican Party has blocked well over 80 qualified nominees for various executive posts, most notably those Obama nominated for the federal appeals court.

The Republicans’ excessive use of the filibuster as a tool of reactionary obstructionism has given the old Senate procedure a bad name in the press, but really, the filibuster, and the idea behind it, has long been used by minority reactionaries to achieve their political goals in otherwise unfavorable circumstances. But what, exactly, is a “filibuster?” Taegan Goddard’s Political Dictionary offers a fairly precise answer, defining it as “an informal term for any attempt to block or delay U.S. Senate action on a bill or other matter by debating it at length, by offering numerous procedural motions, or by any other delaying or obstructive actions.” Essentially, filibusters serve to delay votes and confirmation on key Senate bills and nominations — they’re a stalling tactic.

In American popular culture, filibusters are most well known for being marathon, uninterrupted speeches delivered on the Senate floor with the goal of preventing a full vote on any given legislation. In these instances, the filibuster is a test of stamina, since the speaker can’t stop talking or even take a bathroom break during the procedure. This type of filibuster will forever be embodied by Frank Capra’s 1939 film Mr. Smith Goes to Washington, in which idealistic junior senator Jefferson Smith (played by Jimmy Stewart in full, warbly-voiced glory) talks for twenty-four hours to prevent a corrupt Senate colleague from building a dam on land designated for boys’ camps. As I’ll soon discuss, such marathon filibusters have occurred, but they’re the exception rather than the rule.

The current GOP, for example, has abused the filibuster precisely because, rather than staging talk-based delays of President Obama’s executive nominations — which are long, sweaty, and generally a pain-in-the-neck — it has instead gummed up the political system by merely refusing nominees a basic up or down vote. That’s right: there’s no long speechifying, just procedural chicanery employed by a minority party to ensure government gridlock and to prevent its opponents from implementing their agenda. Historically, different U.S. political parties have used the filibuster to their advantage, but conservatives have often employed this tool in an attempt to extract major concessions from unwilling parties or to delay political changes that threaten traditional social hierarchies. Indeed, the filibuster, regardless of who has employed it, has always been a reactionary tool.

The term “filibuster” stems from a Dutch word meaning “freebooter,” or someone who “took booty or loot”; essentially, a pirate. In American usage, the filibuster has been most often associated with the U.S. Senate, but in the nineteenth century, the word “filibuster” also referred to acts by conservative southern imperialists who sought domination of the Caribbean in the name of the Slaveholding South. As historian William Freehling notes, the most fantasy-prone of antebellum (pre-Civil War) southerners dreamed of expanding the pro-slavery South’s territory to include Cuba, Brazil, and the nations located around the Caribbean Sea, including Mexico, Nicaragua, Guatemala, Honduras, El Salvador, Costa Rica, and New Granada.*


Mississippi pro-slavery radical John Quitman. He really, really wanted to filibuster Cuba, but got…busted.

To achieve their dreams of an expanded southern empire, private adventurers known as “filibusterers” eschewed diplomatic talk in favor of launching a “filibuster,” an outright invasion of a sovereign nation with the goal of fomenting a revolution, overthrowing the existing government, and annexing the territory to the U.S.* These southern revolutionaries offered to launch filibusters into Latin America in the name of Dixie.

The most successful (at least in the short-term) of the southern filibusterers was a Tennessee lawyer named William Walker. In the spring of 1855, Walker and a band of 57 loyal “freebooters” landed in Nicaragua, then in the midst of civil war, where they were joined by local forces. The filibusterers soon captured the city of Granada, after which Walker formed a provisional government and declared himself military ruler. Walker then won the presidency in 1856, but his time as Nicaraguan strong-man was short-lived: in September 1860 he was executed by a firing squad in Honduras.

Walker was temperamentally meek and mild when compared to the pro-slavery Mississippi radical John Quitman. A wealthy Natchez planter who owned over 200 slaves, Quitman advocated for the annexation of slave-rich Cuba, a move that would add to southern wealth and southern political influence by providing Dixie with an additional slave state and its attendant U.S. senators and congressional representatives. But Quitman never matched Walker’s filibustering success. Despite raising a 1,000 man army and securing the financial backing of prominent southern planters, governors, and U.S. senators hell-bent on a Cuban invasion, Quitman ran afoul of federal U.S. neutrality laws. In May 1854, President Franklin Pierce declared that he would prosecute any southern filibustering expeditions. Quitman went to court, and his dream of conquering Cuba in the name of the South never materialized.*

The southern filibustering expeditions of the nineteenth century were thoroughly proactive in design, but they were also ideologically reactionary. Pro-slavery advocates’ dreams of expanding the Slaveholding South’s empire into Latin America grew out of fears of a growing northern anti-slavery movement that sought to limit the spread of slavery within the continental U.S. If northern radicals vowed to halt slavery’s expansion into the American West, pro-slavery southerners reasoned, then the South would secure its own land in Latin America to expand its slave empire. In this respect, the reactionary conservatism of pro-slavery southerners led to proactive schemes to dominate other nations in the name of racial slavery.

The reactionary conservative spirit of the nineteenth century southern filibusterers lived on in the most famous of all spoken filibusters: the epic, 24-hour, 18-minute 1957 filibuster against the Civil Rights Act delivered by conservative South Carolina Senator Strom Thurmond. While Thurmond never launched an invasion of a foreign nation, his use of the filibuster to thwart legislation aimed at eventually securing equal rights for African-Americans echoed the nineteenth century filibusterers’ aims to secure the permanent status of southern slavery — based as it was on the domination of blacks by whites. Both types of filibusters, then, involved reactionary attempts by conservatives to maintain current social and economic hierarchies. Thus, the term “filibuster,” which originally characterized a piratical conqueror of foreign territory, came to define a type of piratical legislative maneuver in which a senator attempted to defeat the passage of a bill, or at least run off with some legislative booty via concessions from the bill’s sponsors.

Thurmond’s 1957 filibuster remains the longest spoken filibuster on record. He held the Senate floor from 8:54 pm on August 28 until 9:12 pm the next day. To fill the time, he read the Declaration of Independence, the Bill of Rights, and other historical documents, and before his marathon talk, he took steam baths to dehydrate his body so as to avoid running to the John mid-speech. Thurmond’s filibuster failed to stop the passage of the Civil Rights Act of 1957, but he was more concerned with making a grand statement than he was with stopping the bill. And that statement was one of vociferous protest against racial equality, a theme that directly connected Thurmond’s bloviating with the nineteenth century pro-slavery filibusterers of the southern past. The South Carolina senator did not advocate slavery, but he supported the same racist social order that had underpinned slavery before the Civil War.


Strom Thurmond: the embodiment of twentieth-century filibustering just to be a jerk.

The historian Richard Hofstadter made this connection in the late 1940s when he wrote about the long theme of racial hierarchy that connected the antebellum South to Thurmond and his Dixiecrat allies. “What makes the situation in the mid-twentieth century most similar to that of a hundred years earlier is that the doctrine of white supremacy and the state of race relations in the nation at large once again have powerful critics,” Hofstadter wrote.* In the case of nineteenth century filibusterers like Walker and Quitman, those critics were northerners who would thwart southern designs to expand slavery. For the filibustering Thurmond, those critics were agents of the federal government who sought to force racial equality on the South.

In both cases, however, conservatives reacted against threats that might upend the South’s system of racial hierarchy. As historian Nadine Cohodas notes in her book Strom Thurmond and the Politics of Southern Change, “Thurmond stoked the fires of resistance…to his constituents. He was a cheerleader for segregation, even if the cheers he led were not always couched in racial terms but in the antiseptic rhetoric of states’ rights.”* The claim of states’ rights was, of course, the same rhetoric used by nineteenth century southern filibusterers looking to shirk U.S. neutrality laws by claiming that the federal government had no legal right to interfere with racial slavery at the state level. As an aside, Thurmond’s segregationist stances didn’t stop him from fathering an illegitimate child with a black servant, just as fears of “miscegenation” didn’t stop antebellum slaveholders from having offspring-producing trysts with their human property.

The abuse of the filibuster by the contemporary GOP, then, is in keeping with a long tradition in which conservative minority parties in the U.S. government sought to enforce their will, or at least protest change, by mounting reactionary displays either on the Senate floor or in the jungles of Central America. Of course, the GOP is not defending slavery or racial apartheid, but they are continuing a reactionary American tradition in which conservatives sought to gum up the governmental works in the name of protesting change that might advance historically liberal agendas.

That the GOP is now strongest in the South, where it receives the most intense support for using the filibuster to block any and every appointment by the first black President, only further highlights how the past is always present, even if it can’t prevent the march of social change. No wonder Harry Reid voted to kill the filibuster: some things have to change, regardless of what conservatives think. Besides, the GOP has already vowed to use the “nuclear option” for its own benefit when it inevitably regains control of the Senate. Ah, the good times march on.

* See William W. Freehling, The Road to Disunion: Vol. II, Secessionists Triumphant (New York: Oxford University Press, 2007), 145, 148, 165-66.

* See Richard Hofstadter, “From Calhoun to the Dixiecrats,” Social Research 16 (June, 1949): 141.

* See Nadine Cohodas, Strom Thurmond & the Politics of Southern Change (Macon, GA: Mercer University Press, 1993), 14.

On Liberalism: Its Faults and its Historical Necessity


Eugène Delacroix’s Liberty Leading the People (1830) has long symbolized both the triumphs and failures of modern liberal thought.

If you’ve read this blog before, you know that I’m a political liberal. I make no apologies for this stance, and have spent plenty of time on this blog critiquing conservatism as a political theory. Simply put, I think that an examination of modern history supplies sufficient evidence to prove that liberalism, despite its many flaws, remains the best hope for individual freedom and small “r” republicanism in the modern world. 

Liberalism, therefore, must be preserved and vigorously defended against the relentless conservative onslaught that, for decades, has sought to delegitimize it in the eyes of the American public. On many fronts, the Right has succeeded in doing just that, often with the unknowing aid of wishy-washy lefties who are quick to descend into hyperbolic pits of despair in moments when their ideas and policies falter. But this doesn’t mean that liberals shouldn’t critique their ideas in order to make them better and to justify why such ideas are superior to those of the Right in terms of extending freedom in America and across the globe.

In a recent article for Salon, noted liberal Andrew O’Hehir provides a brutal critique of modern liberalism via what he calls the “monumental catastrophe of the Obamacare rollout.” Crucial to O’Hehir’s critique is the notion that liberalism suffers when its ideas and policies are watered down in a futile effort to make them palatable to conservatives who are hostile to liberalism as a basic political philosophy. Hence, by eschewing any type of single-payer program and, instead, basing Obamacare on a conservative model of mandated individual private insurance, President Obama and Congress created a law that sought an unattainable balance between states’ rights and federal power. The result, O’Hehir observes, was “a complicated mishmash with dozens of brand-new moving parts.” The complexity of the health care law allowed for “red-state resistance and bureaucratic incompetence” that made the law’s initial rollout a qualified mess.

O’Hehir views Obamacare’s difficult rollout as indicative of a broader problem that reveals the still inherent weakness of liberalism in the 21st century:

[T]he fact that a Democratic president who’s perceived as a liberal and has been comfortably elected twice had to fight so hard for such a patchwork law testifies to the ideological weakness of his party, which has been dragged inexorably to the right ever since its historic schism between Cold War liberals and antiwar activists in 1968, and often appears to have no clear principles and no core constituency beyond New York lawyers and Hollywood celebrities.

This is a biting indictment of liberalism made all the more harsh given that it comes from one of its own. But O’Hehir’s critique rings true because it hits at what has always been liberalism’s strength, as well as its weakness: its commitment to equality. Unlike conservatism, which views freedom and equality as incompatible, liberalism seeks equality of opportunity in a world where hierarchies are the norm. Conservatism, at its core, is a political philosophy that defends hierarchies, whether earned or unearned, because it sees hierarchies as essential to maintaining social order. Thus, as O’Hehir notes, liberalism does itself no favors by “moving inexorably to the right,” thereby diluting its commitment to challenging hierarchical powers that pose a threat to human freedom.

Ideologically, liberalism’s commitment to equality has given it a moral edge over conservatism by providing a trenchant critique of the Right’s historical defense of social hierarchies as ends unto themselves. In the realm of policy, however, liberalism’s defense of equality has often required it to build much broader popular support for liberal programs. Conservatism is a philosophy predicated on hierarchical social divisions. Rather than seeking universal appeal, conservatism needs only to gain enough popular support to bolster the power of the ruling classes. Liberalism’s task is more difficult. Because it often seeks to challenge the power of entrenched ruling classes, liberalism must gain a broader base of support from various subordinate classes that, historically, have been far more willing to either side with the ruling classes, or divide amongst themselves. Such tendencies have made broad-based political challenges to conservatism difficult to sustain, a fact embodied in Will Rogers’ classic quip: “I’m not a member of any organized political party…I’m a Democrat.”

In the rest of this post, I’m going to further discuss liberalism’s flaws, but I’m also going to offer a firm defense of liberalism – especially in contrast to conservatism – as the political philosophy that offers the best hope for preserving both individual freedom and democracy in the 21st century and beyond.

Modern American liberalism reached the peak of its powers during the era of FDR, whose New Deal programs destroyed freedom for privileged jerks everywhere.

Before going any further, I should at least explain what liberalism is. In his book The Future of Liberalism, political scientist Alan Wolfe defines the basic, core principle of liberalism as this: “As many people as possible should have as much say as is feasible over the direction their lives will take.” A commitment to liberty and equality underlies liberalism. As Wolfe notes, liberals want equality to extend beyond the aristocratic class or the business elite via equality of opportunity, not equality of outcomes. “Liberals,” he writes, “believe that the freedom to live your life on terms you establish does not mean very much if society is organized in such a way as to deny large numbers of people the possibility of ever realizing that objective.”*

In contrast to conservative claims that liberty can best be achieved via free markets and the absence of state intervention, liberals believe in a “positive liberty,” which holds that human flourishing should not be reduced to a series of monetary exchanges. Thus, it is not enough for a free person to be merely “left alone” by the state; a free person should also have the capacity to realize her own personal goals, and liberals are “prepared to accept state intervention into the economy in order to give large numbers of people the sense of mastery that free market capitalism gives only to the few.”* In his book American Liberalism: An Interpretation for Our Time, an excellent complement to Wolfe’s study, political scientist John McGowan quotes philosopher Thomas Nagel to succinctly define liberalism as a philosophy favoring both ‘individual rights’ and ‘a form of distributive justice that combats poverty and large inequalities.’* Liberalism, then, recognizes that equality is absolutely essential to ensuring individual freedom and the functioning of democratic societies.

The value that liberals place on equality is what puts them at odds with conservatives, who view equality as, at best, an unachievable state, and at worst, an impediment to individual freedom because it rejects the role that organic social hierarchies play in maintaining social order. In a 1790 speech, one of conservatism’s towering thinkers, the Irish political theorist Edmund Burke, outlined conservatism’s preference for hierarchy when he described the implications of the French Revolution:

It was the case of common soldiers deserting from their officers, to join a furious, licentious populace. It was a desertion to a cause the real object of which was to level all those institutions, and to break all those connections, natural and civil, that regulate and hold together the community by a chain of subordination: to raise soldiers against their officers, servants against their masters, tradesmen against their customers, artificers against their employers, tenants against their landlords, curates against their bishops, and children against their parents. That this cause of theirs was not an enemy to servitude, but to society.*

Here, Burke outlined his core reasoning for why liberalism, as unleashed by the chaos of the French Revolution, was dangerous. Liberalism entailed the overturning of what Burke considered to be the “natural” hierarchies that held society together through a “chain of subordination.” And why did Burke consider this to be so dangerous? Because those in power, whether they be employers, landlords, or clergy, always view their rule as “natural.” They therefore view rule by the subordinate classes as an “unnatural” affront to social order. This is why political scientist Corey Robin characterizes conservatism as “a meditation on, and theoretical rendition of, the felt experience of having power, seeing it threatened, and trying to win it back.” Winning back power restores order, and conservatives view equality as a threat to order.

Edmund Burke still stands as a hero to conservatives who defend unearned privilege.

The need for order was a central tenet of Arizona senator Barry Goldwater’s book The Conscience of a Conservative, one of the core texts of modern American conservatism. “The Conservative looks upon politics as the art of achieving the maximum amount of freedom for individuals that is consistent with the maintenance of the social order,” Goldwater writes. He rightly observes that freedom is impossible if one man can deny to another “the exercise of his freedom,” but, like conservatives throughout history, Goldwater postulates that state intervention in the marketplace more often than not diminishes freedom by overturning orders that conservatives view as “natural.”* These so-called “natural” orders, however, tend to be defined by those who have the power in society and, by extension, have the most to gain and maintain by describing their rule as “natural.” It’s no surprise, then, that those “natural” rulers tend to be conservative.

Liberalism has fallen short of its goals, and witnessed its greatest failures, when it has failed to convince the majority of society that their interests are not synonymous with those of the ruling few. In some historical instances, such as the Democratic Party’s failure to unite its middle class, socially liberal wing with its more traditional working-class supporters, liberals are to blame for their own messaging failures. But liberals also face a more difficult task than do conservatives: they must consistently forge broad-based coalitions in order to maintain support for their cause.

As Eric Alterman and Kevin Mattson note in their study The Cause: The Fight for American Liberalism from Franklin Roosevelt to Barack Obama, liberal unity has historically fallen prey to ugly American divisions that strengthen conservatives’ political power. “Liberals’ inability to unite the poor and the middle classes in America,” Alterman and Mattson write, “is profoundly complicated by historical circumstances – specifically the divisions of race…that continue to define so many citizens’ identities.” Liberalism’s failures, they conclude, can often be explained by the fact that people “do not generally appreciate subsidizing, through tax and transfer policies, the lifestyles of those they deem to be different from themselves.”*

The less-than-successful rollout of the Obamacare insurance exchanges bears such hallmarks: it is a law that could never gain broad popular support beyond its component parts. Conservatives exploited widespread fears that Obamacare would redistribute wealth from whites to “lazy” minorities, leading the president and his Democratic Congress to compromise the bill into a complicated mess that attempted to please everyone but ended up pleasing almost no one.

By compromising with conservatives, who have little interest in a functioning government that uses its power to ensure greater freedom to individuals left at the mercy of insurance companies, liberals failed to fully embrace and defend their commitment to universal equality. This does not mean that Obamacare cannot work, or that the president should not make it work to the benefit of all Americans. Rather, Obamacare’s rocky introduction should push liberals to defend their ideas with greater confidence, and recognize that caving to the Right’s demands in the name of short-term political gain ultimately weakens liberalism’s overall political hand. Americans get behind leaders who are firm in their convictions, and if liberalism is to regain its once prominent stature in American society, it must first convince voters that it has backbone. In that respect, liberals have their work cut out for them.

* See Alan Wolfe, The Future of Liberalism (New York: Alfred A. Knopf, 2009), 10-13.

* See John McGowan, American Liberalism: An Interpretation for Our Time (Chapel Hill: University of North Carolina Press, 2007), 8.

* See Barry Goldwater, The Conscience of a Conservative (New York: Viking, 1960), 5, 3.

* See Eric Alterman and Kevin Mattson, The Cause: The Fight for American Liberalism from Franklin Roosevelt to Barack Obama (New York: Viking, 2012), 465.

Richard Cohen, Thomas Jefferson, and the Legacy of White Privilege in America

Washington Post columnist Richard Cohen. Even his beard is white.

Washington Post columnist Richard Cohen. Even his beard is white.

Richard Cohen, columnist for the Washington Post, understands something. He understands that white people have it rough. Or, at least they think that they have it rough. Some white people think that they’re losing their traditional privileges as the default ruling demographic in America. Their ensuing anger has, of late, once again lit the age-old fuse of white grievance in the United States, and numerous media outlets have spilled plenty of real and electronic ink trying to assess the implications of this anger for American culture.

Richard Cohen is, like me, a white person, and he wants to understand a particular brand of grievance that motivates other white people and manifests most potently in the form of that drooling, reactionary blob of grammatically challenged rage, the Tea Party. In a recent column, Cohen pissed off a large chunk of humanity by attributing Tea Party rage not to racism, but to fear of change. Despite devoting portions of his column to mocking Tea Party rodeo clowns like Sarah Palin, many readers saw a particular paragraph in Cohen’s column as evidence of the author’s apparent sympathy for conservative white cultural dominance. The offending paragraph claimed that:

Today’s GOP is not racist, as Harry Belafonte alleged about the tea party, but it is deeply troubled — about the expansion of government, about immigration, about secularism, about the mainstreaming of what used to be the avant-garde. People with conventional views must repress a gag reflex when considering the mayor-elect of New York — a white man married to a black woman and with two biracial children. (Should I mention that Bill de Blasio’s wife, Chirlane McCray, used to be a lesbian?) This family represents the cultural changes that have enveloped parts — but not all — of America. To cultural conservatives, this doesn’t look like their country at all.

Now, Cohen has had some nasty bouts of foot-in-mouth disease in the (recent) past. This is the same guy who, earlier this month, claimed to have just learned that American slavery “was not a benign institution in which mostly benevolent whites owned innocent and grateful blacks,” but was, in fact, “a lifetime’s condemnation to an often violent hell in which people were deprived of life, liberty and, too often, their own children.” That’s right, Cohen just figured this out in 2013, and only after watching Steve McQueen’s film “12 Years a Slave,” because book-larnin’ is hard work.

But seriously, Cohen’s column stirred up a whole mess of anger because it appeared to reveal a stunning obtuseness on his part about the changing demographic face of America. It’s been decades since the end of legal segregation, yet Cohen admits that some Americans still have a “gag reflex” when confronted with an interracial couple. Moreover, Cohen described the Tea Party as a group with “conventional” views, which, by default, seemed to suggest that non-Tea Partiers hold “unconventional” views. Cohen himself may or may not hold these views, though his history of writing oversimplified, bone-headed columns on the subject of race suggests that the former is possible. Plenty of people, for example, have labeled Cohen an “unreconstructed bigot” and a “racist.” But whatever Cohen’s own views, his column was, poor choice of words notwithstanding, an accurate description of Tea Party rage and the extremely potent source that fuels that rage: white privilege.

Simply dismissing Cohen as a good ol’ fashioned racist is not a particularly helpful way of discussing the kind of “indirect racism” (yeah, I just made up that term right now) that fuels modern white privilege. Liberals who call conservatives outright racists tend to get massive amounts of pushback from people aghast at being lumped together with the most theatrical and well-known symbols of American bigotry, such as the Old Confederacy, the Ku Klux Klan, and the southern lynch mob. Thus, the cycle merely repeats: liberals accuse conservatives of being racists, conservatives accuse liberals of playing the “race card;” wash, rinse, repeat.

The kind of indirect racism that animates the Tea Party, however, is less about outright hatred based on mere skin color (though it is a legacy of that idea) and more about how the truly domineering racism of the eighteenth and nineteenth centuries bequeathed the legacy of white privilege to modern-day Americans. For American whites, cultural, political, and economic dominance became so common as to feel like second nature.

Let’s unpack that idea a bit further, shall we? There are mounds of literature on the concept of white privilege, but let’s go with a straightforward definition: white privilege means that society affords you preferential treatment because you are white. Political scientist Linda Faye Williams helpfully expands on this idea in her book The Constraint of Race: Legacies of White Skin Privilege in America. Faye Williams writes that white privilege constitutes situations in which “whites display a sense of entitlement and make claims to social status and economic advantages, actively struggling to maintain both these privileges and their sense of themselves as superior.”*

In addition, white privilege tends to blind its beneficiaries to the very existence of their privilege. As Faye Williams notes, for many whites, “‘racism’ is a problem belonging to people of color, not to whites.”* Those who perceive their whiteness as the default, “normal” setting, and, by extension, equate whiteness with normality, often get defensive when others point out how such a stance could lead to the normalization of white racial dominance.

But, of course, such a normalization of white racial dominance is exactly what happened for much of U.S. history. Because America was a nation paradoxically founded on the principles of equality and racial slavery, every one of its major historical events — from constitutional debates over taxation, to geographical expansion, to the Civil War and Reconstruction, to the Civil Rights Movement, to welfare reform — has, in some way, involved debates over how the constructs of race afforded benefits to whites at the expense of non-whites.

Thomas Jefferson. Wearing a coat like that was totally a sign of privilege.

Perhaps no single figure better encapsulates the reckoning with the consequences of white privilege than the undisputed Grand Poobah of American Founding Fathers, Thomas Jefferson. So hallowed a figure is Jefferson in American culture that even his biographers — who should know better — are nonetheless loath to criticize the man for fear that recognition of Jefferson’s basic human faults would somehow negate his inherent genius and monumental accomplishments. The debate over Jefferson’s faults is at its most contentious when it comes to his views on slavery and race. Jefferson was, after all, an immensely wealthy slaveholding planter, but he also wrote about the detrimental aspects of slavery as an institution, most famously in his Notes on the State of Virginia (1785). Such writings have led many historians to claim that Jefferson was, in one form or another, anti-slavery.

However, as legal historian Paul Finkelman notes in his article “Thomas Jefferson and Antislavery: The Myth Goes On,” Jefferson’s reservations about slavery hinged less on concerns for the enslaved, and more on concerns about how slavery as an institution affected his status as a privileged white slaveholder. As evidence for this interpretation, Finkelman cites Jefferson’s famous statement about slavery: “[W]e have the wolf by the ears, and we can neither hold him, nor safely let him go. Justice is in one scale, and self-preservation in the other.”* Historians have traditionally interpreted this statement as a fear of slave revolts, but Finkelman observes that the “self-preservation” to which Jefferson alluded could also refer to his personal fortune. The labor of his slaves afforded Jefferson the good life, making the thought of losing that labor downright unpalatable.

Finkelman describes Jefferson as “compulsively acquisitive.” Indeed, on one trip to France, Jefferson bought over 60 oil paintings, over 40 luxury chairs, 7 busts by French sculptor Jean-Antoine Houdon, multiple full-length, gilt-framed mirrors, 4 marble-topped tables, and a vast assortment of ‘items of personal luxury.’* For Jefferson, Finkelman writes, “the wolf may also have been the wolf of gluttony and greed.” Slavery gave Jefferson his lavish lifestyle, and though he may not have liked being dependent on slaves, “he did not dislike it enough to do anything about it.”* Jefferson could own slaves because he was a white man and his slaves were black, and the wealth generated by his slaves allowed Jefferson to live an aristocratic life.

No wonder he couldn’t let the wolf go: slavery was predicated on the concept of white privilege — that whites were superior and blacks inferior. Jefferson was a great man, but a man nonetheless, and those men (or women) placed into positions of power by the normalization of dominance over others are seldom in a rush to give up such a privileged status.

Got white privilege, America? You betcha we do.

Jefferson’s struggles with the moral implications of white privilege echo in the contemporary musings of people like Richard Cohen, who run into trouble when they casually brush off the type of indirect racism created by centuries of American white privilege. To be sure, the Tea Party types about whom Cohen writes are not racist in the same vein as the cross-burning Klansmen or the angry lynch mobs of decades past. Rather, like Jefferson and millions of whites before them, segments of the Tea Party have been simmering in the soup of white privilege for so long that they don’t even recognize that an earlier form of racial dominance helped make the base of that soup. Thus, you don’t need to be a flaming racist to defend cultural norms that were forged in a far more racist past.

American conservatives genuinely fear the consequences of losing their white privilege. Slavery is obviously no longer the issue, but slavery’s legacy has, as Linda Faye Williams writes, long resulted in the “unequal allocation of educational resources, substantial insider networks that funnel good jobs largely to whites, and social policies that deliver more generous benefits to whites.”* These are the modern fruits of white privilege.

It’s no coincidence that, according to a recent Democracy Corps study, the Republican Party’s Tea Party base “are very conscious of being white in a country with growing minorities. Their party is losing to a Democratic Party of big government whose goal is to expand programs that mainly benefit minorities.” Just as Jefferson feared losing the white privilege that created the luxurious life of an eighteenth-century planter, the modern Tea Party fears losing the white privilege that has long directed the benefits of social programs and political power disproportionately into the hands of American whites at the expense of non-white minorities. Richard Cohen, I think, understands this fear, but he also, on some level, identifies with it, which helps explain the befuddlement that he and others express when charged with racism. To paraphrase a particularly plain-spoken white guy, “It’s the whiteness, stupid.”

* See Linda Faye Williams, The Constraint of Race: Legacies of White Skin Privilege in America (University Park, PA: The Pennsylvania State University Press, 2003), 10-11.

* See Paul Finkelman, “Thomas Jefferson and Antislavery: The Myth Goes On,” Virginia Magazine of History and Biography 102 (April 1994): 205.

The real history of the “war on Christmas”

Charlie Brown and Linus Van Pelt witness the commercialization of Christmas in the form of aluminum, mass-produced Christmas trees. In an attempt to push back against the sanctification of mass consumption, Charlie Brown opts for a small wooden tree, and gets called a “blockhead” for his troubles.

If you think that the idea of Christmas commercialism is something new, then you haven’t checked out the 19th century recently. Follow this link to Salon where I discuss why the “War on Christmas” is utterly bogus.