Monthly Archives: January 2014

Death, Technology, and the Rise of Steel: Why Workers Matter in American History


Workers in Ohio’s Mahoning Valley iron mills ranged from old guys to small children, as seen in this early 1870s photo of Brown, Bonnell & Co.’s nail mill crew.

Publisher’s Note: Today we’re doing something a bit different. This is a guest blogpost by Clayton Ruminski, a specialist in the rise of iron and steel production in Northeast Ohio’s Mahoning Valley. I grew up in the “Valley,” so this post is totally local history for me, but for those of you unfamiliar with the tragic story of the rise and fall of Ohio’s once glorious steel industry, this post will provide some much-needed context about how workers built America.  

Northeast Ohio’s Mahoning Valley, and in particular the city of Youngstown, is one of America’s poster children for de-industrialization, desolation, and economic collapse. Heck, it even inspired a Bruce Springsteen song. But there was once prosperity in this buckle of the American Rust Belt. The region was affectionately known as the “Steel Valley,” yet few now remember how it became one of the most dominant steel producers in the United States.

For one thing, the transition from iron to steel was tempered in the blood of immigrant laborers who suffered terrible fates in the name of capital and production values. So how did the ghastly deaths of “insignificant” and “expendable” Welsh, German, and Irish immigrants working in the Mahoning Valley’s small iron furnaces and mills shape the economic and industrial rise of the once prosperous “Steel Valley”? In short: their deaths spurred industrialists to invest in technological advancements that increased safety in the mills, but also diminished the need for skilled labor in the process. Steel workers built America, but at a steep personal cost.

Throughout the 19th century, a more suitable name for Ohio’s Mahoning Valley would have been the “Iron Valley.” In contrast to nearby Pittsburgh, where the magnate Andrew Carnegie began steel production as early as 1875, large-batch steel production did not occur in the Mahoning Valley until 1895. There is a major difference between iron and steel. Iron straight from the blast furnace – a tall, fire-brick-lined, steel-plated cylindrical structure that used temperatures of over 2,500 degrees F to smelt iron ore, coal, and limestone into molten iron – was brittle. This weak iron had to be refined into stronger wrought iron in a highly skilled process known as puddling, during which a puddler and his helper used a long iron prod to work 600 pounds of iron for over an hour in front of the intense heat of a puddling furnace.


A puddler and his helper remove a 150-pound, near molten ball of wrought iron from a puddling furnace at Youngstown Sheet & Tube’s Campbell Works in the 1920s.

Steel had different qualities. You needed iron to create steel, a much stronger metal that could be produced in mass quantities and at a cheaper price than skilled-labor-intensive wrought iron. Whereas a puddling furnace could make about a ton of wrought iron in a twenty-four-hour period, a Bessemer converter could make ten to twenty tons of steel in twenty minutes while using only semi-skilled and non-skilled labor. In 1892, Youngstown industrialist Henry Bonnell illustrated this disparity when he stated that “the Mahoning Valley contained 477 puddling furnaces employing 954 puddlers, 954 helpers, and 236 roll hands. Together, these furnaces and their workers produced 1,050 tons of wrought iron per day.”* The same year, Youngstown industrialists proposed a single steel plant that could produce 1,000 tons of steel billets per day with a workforce of only 200 men.

Thus, the stage was set for the elimination of skilled labor and iron production in the Mahoning Valley. Large-scale manufacturing of steel loomed on the horizon with premonitions of cavernous mills, massive blast furnaces, the glorious pyrotechnics of the Bessemer steel converter, and even a safety movement.

Before the age of steel, however, much of the Valley’s labor force endured horrifying working conditions. Toiling in these hellish antebellum and post-Civil War iron furnaces was a forbidding and often dangerous job. James Davis, a former puddler at the Sharon Iron Works and Secretary of Labor from 1921 to 1930, regarded these mills as a veritable hell: “Life in these mills is a terrible life; men are ground down to scrap and thrown out as wreckage,” he wrote. Former blast furnace operator Ralph Sweetser said it best in his 1938 book, Blast Furnace Practice: “many men were killed or maimed by blast furnace accidents, accidents that were terrific and horrible.”*

Yet despite these horrid conditions, the industrialists and those running the mills unsurprisingly cared less about immigrants’ welfare than about money and production values. In the mid-to-late 19th century, there were no unions, and attempts at unionization often resulted in immediate termination and replacement with another immigrant worker paid no more than $1.30 per day. Workers were expendable and turnover was high. In the Mahoning Valley, dreadful accidents at the mills occurred at an alarming rate. Boiler, furnace, and molten iron explosions were a regular occurrence, and falling objects, poisonous gas, faulty equipment, and the extreme temperatures around the furnaces claimed the lives of many laborers.

Plenty of cringe-worthy accidents occurred at iron companies in the 19th-century Mahoning Valley. One of the primary culprits was the boiler explosion. In September 1872, the newly installed boilers at Brown, Bonnell & Co.’s rolling mill in Youngstown – one of the largest iron mills in the country at the time – exploded. When the engineer stopped his engine, “the explosion took place with a terrible concussion, tearing the boiler house literally to pieces, and throwing immense pieces of boilers with terrible force in every direction.” The boiler tender was killed instantly, while a one-ton piece of boiler flew a quarter-mile, crashing into an unsuspecting family’s home and crushing a mother and her newborn baby.

The company immediately rebuilt the boilers and resumed production within a week. Six years later, an eight-year-old boy named James Cobb was “cooked alive” when he fell into a vat of scalding water at Brown, Bonnell & Co.’s Falcon blast furnace boilers. In the 1850s, a red-hot piece of iron burned through a twelve-year-old boy’s thigh, catching his clothes on fire and severely burning his entire body. In 1880, laborers David Evans and Frank Patton were killed by a boiler explosion at the Tod furnace in Youngstown, leaving five others severely injured.*


The deadly result of the 1872 Brown, Bonnell & Co. boiler explosion that killed an innocent mother and her child.

Other macabre accidents occurred all too regularly. In 1889, Charles Myers, a roller at the Youngstown Rolling Mill, was crippled for life when rain came into contact with exposed molten iron, causing a large explosion. In 1887, Griffith Phillips, an engineer at a rolling mill in Hubbard, Ohio, was oiling an ore-crushing machine when his clothes became entangled in the cog wheels, dragging him in. “He was mangled out of all semblance of humanity, the flesh adhering to the cogs,” a newspaper reported. Such horrific accidents were, for the most part, of little concern to the managers and industrialists who ran the mills.*

Many mill owners only shut down the facilities for a brief period to remove mangled and burned carcasses, and they often restarted operations within days or even hours. Take, for example, an instance at Andrews & Hitchcock Iron Co.’s furnaces in Hubbard, Ohio in 1899. A blast that resulted in a great expulsion of gas and flame from the top of the furnace caused Patrick Moore – a top filler who dumped iron ore, coal, and limestone into the top of the furnace via wheelbarrow – to be blown from the top, falling seventy-five feet onto a large iron pipe. In response to the accident, company president Frank Hitchcock stated, “it will be necessary to close down the furnace for a period of about 30 days, which will entail considerable additional loss, as we were very busy.” He later went on to mention that the company’s chief regret was the loss of life.

For most companies, of course, the chief regret was not so much the loss of life as the resulting downtime. At the turn of the 20th century, this became an increasing concern, as accident rates continued to climb due to faulty equipment, horrid working conditions, and old methods of labor that could have been replaced by safer, more mechanized procedures. Iron and steel also differed starkly in composition, which entailed different approaches to manufacturing the two metals. Iron was primarily worked by hand – hence the term wrought, meaning worked into shape by artistry or effort. Steel, on the other hand, was by necessity much more mechanized due to the large scale of its production.


The old method of casting iron at the Mabel blast furnace in Sharpsville, Pennsylvania, c. 1915. Workers faced intense heat, poisonous gasses, and sometimes explosions of molten iron.

As steel came to dominate the Mahoning Valley at the turn of the 20th century, newly formed companies such as Republic Iron and Steel and the United States Steel Corporation scaled back their iron production in favor of more modernized mills and large-batch steel manufacturing via the Bessemer converter. As a result, industrial technology progressed. Modern blast furnaces became safer through the brilliance of engineers such as James Gayley, Marvin Neeland, and others. Modern methods for rolling steel eliminated the old way of manually rolling the red-hot metal while also mechanizing the process, and Bessemer converters eliminated the old method of small-scale, labor-intensive puddling.

Deaths in the mill meant money lost, and by the first two decades of the 20th century, big steel corporations began implementing safety measures to reduce downtime and raise the morale of their workers through welfare capitalism. But these measures came slowly, and accidents continued at an alarming rate between 1900 and 1920, prompting twenty-four states to adopt workmen’s compensation laws by 1915. In Youngstown, Carnegie Steel Co. modernized the old Union Mills, which had begun operations following the Civil War, by installing fans, sanitary systems, steam cranes for handling heavy steel, and even a police force.

Due to steel’s late appearance in the Mahoning Valley, however, many older iron mills still remained. Liquidation of these dangerous and outdated plants occurred slowly, as companies gradually built modern steel mills to replace them. In 1905, a 20-ton ladle of molten iron at the Struthers, Ohio furnace gave way and fell into a pool of water, causing an explosion so immense it could be heard ten miles away. One Slovenian worker was never found; another laborer had his skull crushed by an unfastened pipe; and yet another was covered with molten iron from his feet to his waist, though he miraculously survived. Accidents like these occurred more frequently in the smaller, independent iron companies that remained as outgrowths of the 19th-century mills. Yet as Big Steel consolidated and further reduced competition from independent companies, 19th-century carry-overs such as the Struthers furnace steadily diminished in importance.

Companies such as Republic Iron and Steel and U.S. Steel looked to maximize profits and push their competitors to the wayside, and this necessitated a happy workforce that could go to work each day without the fear of, you know, dying. Imagine the mental toll of seeing a co-worker “literally roasted alive” in a pool of molten iron, as the Youngstown Vindicator described in 1899. Such experiences weighed on the minds of laborers who spent eight- to twelve-hour daily shifts trying to avoid such horrific accidents. Although most companies still refused to recognize unions in the early 20th century, Big Steel acknowledged that accidents and deaths in the workplace slowed profits and production. This recognition partially informed its decision to modernize the steel facilities, a process that also reduced the amount of skilled labor in the mill.


Rollers at Sheet & Tube’s Campbell, Ohio Works demonstrate the arduous job of continuously passing the puddle ball through rolls, shaping it into long, flat bars.

As unions gained strength in the early 20th century and demanded better working conditions and a proper living wage, Big Steel responded with welfare capitalism and improved working environments. But these gestures were made primarily out of self-interest. In retrospect, some have viewed the unions’ demands in the late 1930s and 1940s as leading to the eventual downfall of the steel industry in Youngstown, Ohio. A common criticism is that unions stifled the steel companies’ ability to spend the money needed to modernize and survive competition in a globalized economy, but these accusations are short-sighted.

Ironically, Big Steel spent the money to modernize its mills in the early 20th century in the face of competition and the demands from workers and government for better working conditions. In the 1970s, however, a time when union power was already significantly reduced and still waning, Big Steel failed to modernize in the face of even stiffer international and domestic competition, leaving the Mahoning Valley a mere shadow of its former self, a symbol of de-industrialization and the Rust Belt.

Despite the gruesome deaths of Welsh and German laborers in the 19th century, it was concern for maximizing profits, not concern for workers’ lives, that spurred the transition to steel production via modernized blast furnaces and steel plants. Industrialists were motivated by profits, and if a content workforce and modernization could lead to higher profit margins, then so be it.

It was America’s immigrant laborers, however, who paid the steepest price in the name of industrial progress: their maimed, torn, and roasted bodies turned steel plants into largely forgotten memorials to a labor force that often sacrificed their lives on steel’s molten altar. Their work transformed Ohio’s Mahoning Valley into a giant of steel production, and their efforts toiling in the old iron furnaces and puddling mills are a testament to the human price of industry. In modern America’s decidedly anti-worker cultural environment, the efforts of Ohio’s industrial laborers remind us that workers are America, and their efforts shouldn’t be forgotten.

* Clayton Ruminski has a master’s degree in Applied History and Historic Preservation from Youngstown State University. The majority of his research focuses on the effects of big steel corporations on independent iron mills through consolidation and technological advancements. He works as a library and archival specialist in Warren, Ohio.

* See James Davis, The Iron Puddler: My Life in the Rolling Mills and What Came of It (Indianapolis: The Bobbs-Merrill Company, 1922), 87.

* See Ralph Sweetser, Blast Furnace Practice (New York: McGraw-Hill Book Company, Inc., 1938), 278.

* See The Vindicator, May 30, 1892; October 25, 1905; The Evening Times, June 1, 1899; The Ohio Democrat, May 14, 1887; Somerset Herald, June 7, 1889; Perrysburg Journal, May 7, 1880; Weekly Register, November 22, 1882; Western Reserve Chronicle, September 4, 1872.

“Lone Survivor” and the Historical Legacy of Violence and American Militarism


Mark Wahlberg stars in “Lone Survivor:” a violent depiction of the Afghanistan War. This conflict has surpassed the Vietnam War in terms of sheer length and ambiguity.

Americans are a violent people. Whether in a wartime or civilian context, we like to shoot guns, and we are good at killing people with those guns. This is an indisputable fact. The U.S. has by far the highest rate of gun ownership in the industrialized world, and, as the Washington Post reported shortly after the brutal Sandy Hook massacre in late 2012, the U.S. is outranked in gun violence only by developing nations in Africa and South America.

Many Americans unfortunately view violence as the go-to solution for all kinds of vexing problems. This has always been the case, and the obsession with firearms shows no signs of letting up in the 21st century. Indeed, a good many Americans take gun worship to a bizarrely fetishistic level. You can almost picture any number of the country’s self-proclaimed gun nuts spending their Friday nights hung from ceiling chains while wrapped in shiny leather and stroking one of their 300 AR-15s with scented oils.

American gun-nuttery begets an entire culture of violence that affects both domestic and foreign affairs. By mixing a jingoistic belief in American cultural superiority with an already insane domestic devotion to the proliferation of firearms, the U.S. has created a Frankensteinian, militaristic cultural monster that has reaped much bloodshed over the decades.

The prime characteristics of American cultural militarism are its embrace of violence as a means to an end, its idolatrous bowing before anything with a trigger and ammunition, and its belief that America can do no wrong. Over the last few decades, American culture has become increasingly militarized on both the foreign and domestic levels. This militarization has become so entrenched that even sensible gun regulation fails to become law, and the American military is seen in some circles as an unassailable institution, rather than as a collection of individuals who are to be admired and respected, but not unconditionally worshipped.

Consider the recent snafu over the film “Lone Survivor,” a war epic starring Mark Wahlberg that’s based on a memoir by former Navy SEAL Marcus Luttrell, the lone survivor of an Afghanistan mission gone bad. L.A. Weekly film critic Amy Nicholson lambasted “Lone Survivor” as “a jingoistic snuff film” that drains all nuance from the Afghanistan conflict in order to create a “Rambo”-style war-porn spectacle that espouses simplistic notions of American Exceptionalism and military superiority. “These four men were heroes,” Nicholson writes, “but these heroes were also men. As the film portrays them, their attitudes to the incredibly complex War on Terror…were simple: Brown people bad, American people good.”

Part of Nicholson’s critique centered on the fact that Luttrell’s memoir was heavily ghostwritten by British novelist Patrick Robinson, who may have added more Taliban fighters to the story than were present during the actual events. This created additional scenes of violence that helped to spice up the film (Hollywood has long called this tactic dramatic license). Nicholson was criticizing the film rather than the real-life soldiers upon whom the movie was based. But in right-wing American circles, criticizing the military, in either a real or fictionalized context, is considered grounds for extreme chastisement. Hence, when radio slime-ball Glenn Beck got wind of Nicholson’s criticism of the film, he went on the air and called her “a vile, repugnant, and ignorant liar.”

Beck is nothing less than a shameless sycophant who built a multi-million dollar media empire by feeding gullible conservatives a steady diet of paranoia mixed with simmering white person resentment. So when his listeners heard that Nicholson had criticized “Lone Survivor,” they responded in a manner befitting today’s right-wing jerk menagerie. As Salon reports, Beck’s minions went on Twitter, the world’s preeminent outlet for conflict resolution, and called Nicholson, among other things, a “military hating bitch.” Over at Beck’s website, one of the many commenters claimed that Nicholson meant to “demean the service of our soldiers,” a move that said commenter found “beyond words for me.” This online brouhaha over a movie lays bare the danger inherent in American militarism: it sanctifies violence as the highest form of patriotic expression, and it demands, in true authoritarian style, that the military be above criticism.


Since the colonial era, gun violence has been intimately linked to American national identity, a connection that has cost millions of lives.

The idea that the American military should not be critiqued, lest critics face alarming accusations of treason and even death threats, is the byproduct of American militarism. On the domestic side, this trend manifests itself in a truly irrational cultural bias that favors the right to own and operate nearly any type of firearm without restriction. You don’t have to be a hippie pinko peacenik to support some gun limits: even most gun owners support background checks. But such has been the militarization of American society on all fronts that even basic gun regulations are viewed by the Gestapo/NRA as assaults on American freedom itself. The idea that guns and the military are above critique is a belief rooted in the regenerative power of violence — that violence can create rights out of wrongs. Hence, gun nuts think that only a “good” person with a gun can prevent a “bad” person with a gun from committing violence, and neo-conservatives think that American military force can “fix” foreign countries.

Unfortunately, the regenerative power of violence, and the type of gun-worshipping militarism that it produces, is an idea with deep historical roots. On the domestic side, blame the frontier. In a previous post I discussed how the frontier nurtured American gun culture, and its influence can’t be overstated.

In his book Regeneration Through Violence: The Mythology of the American Frontier, 1600-1860, historian Richard Slotkin identifies frontier violence as a key component of American national mythology. Early American culture was shaped by the notion that the New World, populated as it was by “savage” native peoples who didn’t know how to utilize its bounty, had to be “liberated from the dead hand of the past and become the scene of a new departure in human affairs.”* Guns were the preferred tools of this “liberation,” as the American frontier became a killing ground in which white Americans nearly obliterated the nation’s native past to bring about that “new departure” that became the United States. Slotkin reminds us that violence was integral to this transformation. The ideas that Americans “tore violently a nation from the implacable and opulent wilderness” and that the Indians they killed were “the special demonic personifications” of that wilderness are “the foundation stones” of American historical mythology.*

When the regenerating power of violence transformed the frontier from “savage” outpost to “civilized” America, it also shaped an American notion that frontiers of various kinds must constantly be subdued with violence in order for the U.S. to retain its supposed moral and cultural superiority. Way back in 1970, the late American historian Richard Hofstadter wrote that “there is far more violence in our national heritage than our proud, sometimes smug, national self-image admits of.” Hofstadter argued that recognizing the American propensity for violence was crucial: “In our singular position,” he observed, “uncontrolled domestic violence coincides with unparalleled power, and thus takes on a special significance for the world.”* This statement is nothing if not prescient today. The modern militarized culture creates new frontiers out of urban crime areas, sites of mass shootings, and pesky foreign countries where gun-carrying Americans must regenerate the U.S. through violence, both at home and abroad.

Consider our current cultural unwillingness to view American overseas military endeavors with a more critical eye. As historian Susan Brewer writes in Why America Fights: Patriotism and War Propaganda from the Philippines to Iraq, U.S. government leaders have consistently sold their war aims to the general public by packaging them as “official narratives” of propaganda. “The official narratives,” Brewer notes, “have presented conflict as a mighty clash between civilization and barbarism in the Philippines and World War I, democracy and dictatorship in World War II, freedom and communism in Korea and Vietnam, and…civilization and terrorism in Iraq.” These “official narratives” draw on a long tradition in which Americans have used violence to assert their alleged cultural superiority via “the message that what is good for America is good for the world,” and it is this type of militaristic thinking that has, over time, created America’s distinct culture of violence.*

The result has been the seeping of cultural militarism into all aspects of American life, to the point where it even influences reviews of war movies like “Lone Survivor.” U.S. soldiers, nay, the military itself must not be criticized, because to criticize the military is to criticize America, which is above criticism. Taken to its logical extreme, this type of thinking threatens to ideologically reshape the U.S. along the lines of a military junta, a type of government that has committed some of the worst atrocities in human history, from Argentina, to Chile, to Myanmar. Political scientist (and Vietnam veteran) Andrew Bacevich calls this development the “New American Militarism,” in which “misleading and dangerous conceptions of war, soldiers, and military institutions…have come to pervade the American consciousness.”*


U.S. soldiers do their best with the near impossible tasks they’ve been given in Iraq and Afghanistan. But while sometimes violence is the answer, more often than not it begets more violence in a never-ending cycle.

The militarization of American society, in both the military and civilian spheres, is a dangerous development that threatens the fundamental identity of the U.S. as a small-“r” republican nation. Militaries are, by their very nature, authoritarian, deeply hierarchical institutions. This is why they are good at protecting nations but bad at ruling them: authoritarianism and democracy don’t mix, which is why the U.S. (for now) has civilian control over its armed forces. But the problem runs deeper than mere soldier worship. A highly militarized society is also a paranoid society that will inevitably degenerate into an irrational orgy of circular violence in the name of regenerating its supposed previous greatness. Such societies are also intolerant of dissent, incapable of rational argument, and paralyzed by the limited options presented by itchy trigger fingers.

The U.S. today finds itself at this particular crossroads. After fostering a culture of gun violence born on the frontier and nurtured in countless wars, both domestic and foreign, official and unofficial, America in 2014 cannot seem to wrest itself from the idea that violence solves all problems. Thus, no matter how many school kids are blown away with assault weapons; no matter how many brown people are ripped to shreds overseas; and no matter how many American soldiers are killed or maimed in the name of the American empire, a militarized society ensures that there will always be those willing to defend to the death their right to own a bazooka and watch movies like “Lone Survivor” without criticism. So strap on your concealed-carry holsters, folks, ’cause it’s gonna be a bumpy ride.

* See Richard Slotkin, Regeneration Through Violence: The Mythology of the American Frontier, 1600-1860 (Norman: University of Oklahoma Press, 1973), 3-4.

* See Richard Hofstadter and Michael Wallace, eds., American Violence: A Documentary History (New York: Alfred A. Knopf, 1970), 4.

* See Susan A. Brewer, Why America Fights: Patriotism and War Propaganda from the Philippines to Iraq (New York: Oxford University Press, 2009), 4.

* See Andrew Bacevich, The New American Militarism: How Americans are Seduced by War (New York: Oxford University Press, 2005), xi.

“Job Creators” and the Echo of Slaveholding Republicanism


A nineteenth century southern “job creator” rests comfortably on his porch while one of his dutiful employees looks on with great reverence.

Greetings fellow plebeians. Have you done your patriotic duty lately and courteously genuflected before our great nation’s sacred bestowers of all things employment-based? Yes, I of course refer to that most noble, industrious, ultra-rich, and all-around better-than-you group of Americans referred to collectively by that oversized chamber pot known as the political pundit industry as “job creators.” If you have not yet shown due and expected deference to these money-swollen lords of society, then I suggest you do so quickly; for you see, the “job creators” are angry, and when they get angry, they refuse to cast their magical, job-creating spells, like so many disgruntled Hogwarts rejects.

Why are the “job creators” angry, you ask? Well, allow me to enlighten you. Some audacious lower members of society, especially Senator Elizabeth Warren (D-Soviet Union), have been unduly chastising our glorious plutocratic overlords. Warren has used her relatively brief Senate tenure to attack Wall Street with a combination of old-time progressive populist rhetoric and actual legislation that would rein in the casino-like excesses of the financial services industry. The “job creators” are not pleased.

Bloomberg’s Zac Bissonnette, for example, recently accused Warren of harboring “disdain for the private sector” for having the gall to propose legislation that curtails hiring discrimination based on credit scores. Moreover, a few years back, the ignominious trolls over at the Libertarian unicorn production factory claimed that Warren “wants America to be more like Communist China.” The People’s Republic of China, of course, has been an authoritarian capitalist society for a while now, but whatever. These types of accusations from American conservatives are bolstered by an idea that they believe to be fundamental to American society: that the wealthiest members of society must be respected, nay, worshipped by the smudge-faced commoners because they hold the power, through their control of capital, to create jobs for everyone else. In short: piss them off and they’ll “go Galt.”

At the heart of the “job creators” mentality is the idea that wealth in its own right amounts to virtue, and that virtue, by extension, must be acknowledged. Thus, the “lesser” members of society who aren’t wealthy should be all too happy to accept the scraps of benefit-free, low-wage Wal-Mart greeter and Wendy’s Frosty incubator jobs that the “job creators” create for the proletarian masses.

The term “job creators” itself is a concoction by right-wing marketing nematode Frank Luntz, a guy whose stock-in-trade is using gentle euphemisms to make conservative policies sound less odious than they actually are. To plenty of folks, “job creators” sounds better than terms like “oligarchs,” “leeches,” “tax cheats” — well, you get the idea. Luntz is also the guy who refashioned the positive-sounding “inheritance tax” into the negative-sounding “death tax” in order to convince Americans that not taxing the inherited wealth that spawns “job creators” like Paris Hilton actually benefits average people.

Remember back when President George W. Bush’s massive tax cuts caused the “job creators” to create jobs? Neither do I.

Plenty of Americans — though not a majority — have bought into the idea that the wealthy must be given constant policy tongue baths via lower taxes, less regulation, and exemptions from giving minimum wage and health benefits to employees, lest they pack up their money and leave the country. No matter that, despite the flagging economy, corporations have been sitting on huge piles of cash without hiring workers for the last few years. And no matter that the ultra-wealthy don’t really create jobs. The idea of “job creators” is a myth that won’t die because it has historical precedent. Looking back to the nineteenth century slave-holding South, the notion of due deference being shown to the wealthy in exchange for labor benefits was a key argument favored by pro-slavery ideologues.

Defenders of American slavery cast their arguments in terms designed to appeal to the broader white masses, in both the North and South. They argued that racial slavery ensured social harmony because it made all free white men equal in their shared superiority over enslaved blacks. In this respect, wealthy slaveholding planters demanded support for their “peculiar institution” from the majority of white southerners who didn’t own slaves on the basis that slaveholding created wealth that benefitted all of society — not just the rich. Not only were slaves valuable property in and of themselves, but their labor was valuable as well. Thus, wealthy planters and their defenders cast themselves as the “job creators” of the Old South. They told poor and even middle-class whites to support slavery because doing so always made them better off than black slaves, who did the dirtiest, hardest work in society.

This form of proslavery advocacy eventually coalesced into a complete ideology known as “proslavery republicanism.” While republicanism entailed a government based on the will of the people through equal popular representation, proslavery republicanism held that only white men were guaranteed participation in a republican society. As historian Stephanie McCurry notes, “the modern slave republic was defined above all else, as its defenders never tired of saying, by the boundary that separated the independent and enfranchised minority from the majority of dependent and excluded others.”* The “excluded others” were black slaves (and women). Proslavery republicanism assured all white men that even if they were poor and beholden to the South’s planter oligarchs, they were still, in comparison to black slaves, “equal” to the rich, and should therefore be content with their lot or else work to improve their condition.

Statistically speaking, the average white southerner had little hope of reaching the planter class via slave ownership, but the oligarchs reaped the rewards of wealth and power from keeping such a dream alive among the common white South. The lower classes were less likely to rebel if they believed that they could one day enter the planter class — even if that hope was mostly a pipe dream.

Proslavery republicanism was, therefore, the glue that held the old southern socio-economic order together, despite its many internal fractures that often revolved around issues of class division. Perhaps the most famous proslavery defense came in the form of the so-called “Mudsill Speech” delivered by South Carolina slaveholder, and all-around slime ball, James Henry Hammond. I wrote about Hammond’s speech in a previous post, but it bears repeating. Hammond, a guy who admitted in his journals (!) to molesting his own nieces, defended slavery on the grounds that:

In all social systems there must be a class to do the menial duties, to perform the drudgery of life. That is, a class requiring but a low order of intellect and but little skill…it constitutes the very mud-sill of society and of political government; and you might as well attempt to build a house in the air, as to build either the one or the other, except on this mud-sill.

The “mud-sill” class to whom Hammond referred were slaves, and he and other advocates of proslavery republicanism argued that even white Americans of lower classes should support slavery, and the planter oligarchies it created, because they benefitted from creating a permanent “mud-sill” class over which they could claim social and economic superiority. This was a form of trickle-down economics, in which wealthy planters courted, and often got, the support of less well-off groups based on the assumption that what was good for the rich was good for everyone (or at least all white men). Did it work, you may ask? May I point to exhibit A: the Confederate States of America, a short-lived proslavery republic in the name of which thousands of slaveholders and non-slaveholders alike died during the Civil War.

If James Henry Hammond were alive today, he’d tell you to support tax cuts for the rich on the basis that “hey, at least you ain’t a mudsill.” He’d also want to spend some time with your nieces.

Historian Larry Tise observes that proslavery republicanism was “a system of values and beliefs that reconciled for Americans the inevitable conflict between the nation’s revolutionary ideals and the facts of enslavement.”* So powerful were these contradictions that we’re still dealing with the fallout from slavery to this day. Hence, the major underlying theme of proslavery republicanism, its emphasis on due deference to the rich in the name of the greater good, survived the demise of slavery and continued to resurface in latter-day conservative defenses of all forms of social and economic privilege. This defense of privilege is the same notion that underpins the contemporary right-wing American obsession with worshipping wealthy “job creators.”

So the next time someone warns you about the supposed dangers of taxing the super-rich, or waxes apocalyptic about the alleged detrimental effects of raising the minimum wage, remember that in America, we used to claim that slavery ensured freedom. The maintenance of peasant-and-lord style mentalities like the “job creators” myth doesn’t help those who are unemployed in a nation swimming in wealth, and it bodes ill for the continued vitality of truly equal, small “r” republican governance. In the U.S., wealth has always equaled power, and if you think the so-called “job creators” are standing up for anyone’s interest but their own, then I suggest loading up your Confederate musket, ’cause you’re gonna need it.

* See Stephanie McCurry, “The Two Faces of Republicanism: Gender and Proslavery Politics in Antebellum South Carolina,” Journal of American History 78 (March 1992): 1246.

* See Larry E. Tise, Proslavery: A History of the Defense of Slavery in America, 1701-1840 (Athens: University of Georgia Press, 1987), 361.

Unemployment Insurance and the Southern Roots of Modern Conservatism

During the Great Depression of the 1930s, the unemployed occasionally received donuts and coffee. The GOP, of course, deemed them parasitic moochers.

If there’s one thing that characterizes the pit of drooling, addle-brained wampas known as the 113th United States Congress, it would be inactivity. Dominated as it is by the Republican Party faction of obnoxious brats known as the Tea Party, the so-called “Do-Nothing Congress” and its only mildly less insane Senate counterpart are once again engaged in the now traditional ritual that involves deciding whether or not long-term unemployment benefits should be extended.

Republicans in the House and Senate are, as in the recent past, opposed to unemployment insurance, and the welfare state in general, on ideological grounds. For example, arch-conservative Wisconsin rep. and failed vice-presidential candidate Paul Ryan claimed on the 2012 campaign trail that welfare policies of all kinds had “created and perpetuated a debilitating culture of dependency, wrecking families and communities.” Indeed, the idea that millions of Americans take advantage of welfare as an incentive to simply not work is standard dogma on the American Right.

Lest they be seen as stingy, stone-hearted Scrooges, however, Republicans have fallen back on their tried-and-true defense of slashing unemployment benefits by claiming that they only care about saving money. As the Washington Post’s Greg Sargent reports, “Republicans want to reframe this as a battle over how to pay for extending benefits…as a fight over fiscal responsibility, not over whether to preserve the safety net amid mass unemployment.”

Of course, conservatives’ claims to fiscal hawkitude are belied by the long history of atomically exploding deficits under their watch. Salon’s Brian Beutler rightly observes that Republicans are instead invoking fiscal restraint to conceal their “opposition to or reluctance to support the benefits themselves without obtaining some kind of conservative policy concession in return.” Thus, Republicans’ current whining about the cost of extending unemployment insurance is more smoke and mirrors designed to conceal their beliefs that welfare is detrimental to the very fabric of society.

As with most of their antics in the age of Obama, however, modern Republicans’ opposition to unemployment insurance is in keeping with a tradition of conservative thought that, in an American context, has deep roots in the antebellum (pre-Civil War) South. In his book The Southern Tradition: The Achievement and Limitations of an American Conservatism, Eugene Genovese, the sometimes brilliant, and often controversial, late historian of the antebellum South identified the crucial elements of order and hierarchy that were central to an American conservatism forged in a 19th century slave society. According to Genovese, true southern conservatism was not defined by any slavish devotion to rampant individualism and the unfettered capitalist free market, but by “a belief in a transcendent order or natural law in society as well as nature, so that political problems are revealed as essentially religious and moral.”*

This is not to say that conservatives don’t actually like capitalism. Indeed, Genovese makes no such claim. But he does suggest that conservatives support capitalism only insofar as it successfully perpetuates the hierarchies and unequal power relations that they believe are essential to maintaining true order in society. Hierarchies, whether in the home or in the marketplace, are defined by power relations; by those who dominate and those who are dominated. Conservatives, unsurprisingly, have defended ordered hierarchies because they are usually the ones dominating. The American right wing views this set of social and economic hierarchies as “tradition” that must be preserved at all costs. Hierarchies are preserved by order, and maintaining order is essential to conservatives’ preservation of power.

Simon Legree, the sadistic southern slaveholder in Harriet Beecher Stowe’s Uncle Tom’s Cabin, strove to preserve “traditional” hierarchies.

As Genovese explained, “‘Tradition’ is…understood as an embodiment of ‘givens’ that must constantly be fought for, recovered in each generation, and adjusted to new conditions.”* These “givens” that conservatives view as constituting tradition are almost always characterized by the maintenance of power by a ruling few over more numerous subordinate groups. When the subordinate groups threaten the ruling few’s power by demanding agency over their own conditions, conservatives, the beneficiaries of “traditional” power structures, get angry and fight back like spoiled toddlers.

The slave society of the antebellum South, in which one (white) ruling group held unlimited power, sanctioned by the state in both private and public spheres over another (black) laboring class, was the perfect breeding ground for this type of “traditional” conservative power structure. And even after southern conservatives fought and lost the Civil War in the name of preserving the slave system, the core hierarchical tradition over which they battled continues to be the primary motivator of American conservatives in the 21st century.

Historian Leo Ribuffo notes this continuity by explaining how the New Deal energized a conservatism that had steadily been losing ground in almost all matters except race since the turn of the century. The New Deal, Ribuffo writes, “undermined ‘traditional values’ by incorporating working-class Catholics, Jews, cosmopolitan intellectuals and occasionally African-Americans.” In response to the New Deal’s challenge to traditional social and economic orders that favored employers and white southern men, northern Republicans and southern Democrats formed an “anti-New Deal ‘conservative coalition'” that laid the groundwork for the conservatism of the modern GOP.*

This is why the GOP’s current opposition to extending unemployment benefits in the name of deficit hawkishness rings utterly hollow. They couldn’t care less about deficits, but they do care deeply about preserving “traditional” hierarchies that grant full power to employers over their workers.

Libertarian historian Thomas Woods, Jr. makes this point abundantly clear when he invokes Irish political philosopher, and conservative Grand Poobah, Edmund Burke to defend traditional social orders. “Traditionalist political and social thought focused primarily on preserving what Edmund Burke called the ‘little platoons’ of civilization, all those associations – e.g., family, church, town, civic group – that gave people social identities and prevented them from dissolving into an undifferentiated mass,” he writes.* By insisting that the “little platoons” of civilization must be maintained, Woods Jr. is, by extension, demonstrating the conservative proclivity towards hierarchical orders that subordinate some members of society to rulers who, not coincidentally, are usually conservatives.

The Congressional GOP, along with their Senate allies, don’t want to extend unemployment benefits because they’re jerks.

Burke’s “little platoons” have historically been sites of unequal power structures. Especially in an American context, families have traditionally subordinated women to men, while churches, towns, and civic groups have been ruled by white men who dominated everyone from women, to blacks, to the poorer classes. After all, the most vociferous advocates for slavery in the Old South were preachers, local politicians, and businessmen – all of whom were white – and all of whom ruled over lower groups. What Woods Jr. and other conservatives advocate via “little platoons” are multiple public and private hierarchies dominated by – you guessed it – conservatives. Tradition indeed.

Since the Civil War, capitalism has done a good job of preserving hierarchical traditions that favor conservative rule. But when the lower orders attempt to challenge, or at least mitigate, the inequality-breeding tendencies of capitalism, as in the case of issuing unemployment insurance in the midst of the Great Recession, the Right Wing strikes back…hard. When conservatives wax nostalgic about “simpler” times and invoke “tradition,” they are really yearning for a past during which their ilk held greater power over society’s lower orders, which included workers, women, children, and non-white minorities.

Unemployment benefits threaten conservatives’ vision of “traditional” social order by providing relief and agency to workers who would be otherwise left to the brutal whims of a business-favoring economic system that demands the total subordination of employees to employers. Conservatives’ opposition to unemployment benefits stems from their paternalistic worldview, in which the rulers must maintain order over the less-deserving masses in society.

Hence, as the Washington Post recently noted, a Republican memo is making the rounds for the purpose of coaching House GOP members on how to be empathetic to the “personal crisis” experienced by unemployed Americans. Tellingly, the memo doesn’t acknowledge the legitimacy of unemployment insurance. Rather, it only states that Republicans should feign concern for the unemployed masses. This is paternalism at its most loathsome. That conservatives need to be coached on how to show empathy for workers speaks volumes about how they view society. By opposing unemployment benefits, and any form of welfare in general, Republicans are keeping alive a conservative tradition, nurtured in the Old South, that seeks to preserve social hierarchies – and conservative rule – in the name of social order.

* See Eugene Genovese, The Southern Tradition: The Achievement and Limitations of an American Conservatism (Cambridge: Harvard University Press, 1996), 22, 4-5.

* See Leo P. Ribuffo, “The Discovery and Rediscovery of American Conservatism, Broadly Conceived,” OAH Magazine of History 17 (January 2003): 5.

* See Thomas E. Woods, Jr., “Defending the ‘Little Platoons’: Communitarianism in American Conservatism,” American Studies 40 (Fall 1999): 128.

Phil Robertson, Duck Dynasty, and the Historical Legacy of Southern Manhood

“Duck Dynasty’s” Phil Robertson trades in southern identity tropes.

A few weeks back, Phil Robertson, the hirsute, camo-sporting, duck-pelting patriarch of the hit A&E “reality” series “Duck Dynasty,” nearly gave the internet a pulmonary aneurysm when he expressed, shall we say, less-than-enlightened views about gays and African-Americans.

In a rather revealing interview for GQ, Robertson, a self-identified “bible thumper” who “just happened to end up on television,” claimed that the so-called normalization of homosexuality nurtures a culture in which “sin becomes fine.” Robertson claimed that when you “start with homosexual behavior,” a host of other vile forms of sexual immorality follows suit, including bestiality and rampant polyamory. Robertson even paraphrased Corinthians to assert that “the adulterers, the idolaters, the male prostitutes, the homosexual offenders, the greedy, the drunkards, the slanderers, the swindlers” wouldn’t “inherit the kingdom of God.”

Robertson’s comments elicited a predictable and entirely justified pushback from LGBT organizations and other groups. His remarks proved so controversial that A&E briefly suspended Robertson from his own program before reinstating the bearded celeb following an outcry from right-wing couch potatoes who view “Duck Dynasty,” as I noted in an article for Salon, as a reassuring beacon of religious conservative values in an entertainment wilderness populated by alleged Godless liberal hedonism.

Robertson’s views on gays are hardly surprising — coming as they do from an old, white, male, southern religious fundamentalist. After all, Deliverance aside, Bubbas have never been outwardly comfortable with buggering. But in the same interview, Robertson also made some dumbass comments about African-Americans. As the Atlantic’s Jonathan Merritt noted, Robertson expressed what amounts to a mind-blowing ignorance of the horrors of Jim Crow: the South’s historical apartheid system that relegated blacks to second-class citizenship for roughly a century following the Civil War.

“I never, with my eyes, saw the mistreatment of any black person. Not once,” Ole’ Phil claimed. “The blacks worked for the farmers. I hoed cotton with them. I’m with the blacks, because we’re white trash. We’re going across the field…. They’re singing and happy,” Robertson continued. He concluded by affirming that he “never heard one of them, one black person, say, ‘I tell you what: These doggone white people’—not a word!… Pre-entitlement, pre-welfare, you say: Were they happy? They were godly; they were happy; no one was singing the blues.”

The Atlantic’s Ta-Nehisi Coates appropriately summed up Robertson’s comments as evidence of the lingering American belief “that black people were at their best when they were being hunted down like dogs for the sin of insisting on citizenship.” Indeed, Robertson’s combined contempt for gay people and apparent ding-batted belief that blacks were “happy” under American apartheid echoes a long tradition linking white southern manhood to the concept of “mastery” that dates back to the nineteenth century and still reverberates today.

In their now classic collection Southern Manhood: Perspectives on Masculinity in the Old South, historians Craig Thompson Friend and Lorri Glover note that mastery involved southern men’s internalization of “a sense of manliness through relationships to wives, children and slaves by subverting challenges to white male authority leveled by these dependents and by heading autonomous, self-sufficient households.” This type of “masculine mastery” was also known as “paternalism” or “patriarchy,” and the maintenance of mastery depended on white southern males’ socially sanctioned dominance over less powerful groups, especially blacks.* The idea that blacks were “happy” under Jim Crow is rooted in the old concept of mastery because such a sentiment rests on the assumption that any deviation from the model of blacks as happy workers and whites as benevolent rulers challenged long-established southern hierarchies.

Although mastery was most internalized by the elite planter class, common white southerners, including “rednecks,” “crackers,” and “po’ white trash” of all kinds also subscribed to the notion of mastery. Doing so allowed them to claim, via their whiteness and domination over blacks, a shared kinship with wealthy white southerners in much the same way contemporary non-rich conservatives worship wealthy “job creators” out of a discredited hope that some of the modern oligarchs’ riches will trickle down to the obedient plebeians.

“Duck Dynasty” represents the mainstreaming of the commercial redneck brand.

The idea of mastery as a hallmark of white male southern identity largely, but not entirely, fell by the wayside after the Civil War, when the demise of slavery meant that mastery in its most literal form was no longer a hallmark of Dixie’s culture. But the concept of mastery has retained a stubborn influence — albeit reshaped by changing historical circumstances — on the construction of white southern male identity in the twenty-first century. In the contemporary world, homosexuality is gaining increasingly mainstream acceptance and an uppity black has reached the plateau of uppityness by becoming President. Thus, the old concept of mastery has adapted to the times to forge a non-politically correct creature; a creature who stands in proud defiance against cultural liberalism and creeping secularism: the modern commercial redneck. Phil Robertson is that redneck, and millions of Americans sympathize with his plight.

Scholars have been documenting the mainstreaming of commercial conservative redneckness for some time now. In his excellent study White Masculinity in the Recent South, historian Trent Watts writes that in still conservative twenty-first century America, “national audiences eagerly consume the redneck and good old boy repackaged as a blue collar man who is familiarly southern” in addition to being “hard-working, pragmatic, patriotic, and good-humored” while simultaneously eschewing the explicit, outward trappings of racism that defined white male mastery in the Old South. “No longer marginalized as either a rustic clown or savage hillbilly,” Watts observes, “the ‘blue collar’ man has become in the eyes of millions the most solid and patriotic of Americans.”*

But the media-created “blue collar” man, as evidenced by Phil Robertson and his cohort of modern commercial rednecks on “Duck Dynasty,” is less an organic creation and more a pre-packaged southern good ol’ boy brand. The commercial redneck is portrayed in mass media not by actual working-class people, but by millionaires like Phil Robertson, Larry the Cable Guy, and others. The modern commercial redneck became an icon by taking the desire for mastery that defined white manhood in the Old South and reshaping it into a weapon to wield in the contemporary culture wars. Media-generated conservative rednecks like Phil Robertson are therefore less threatening and more mainstream than the patriarchs of the southern past, but they’re still interested in mastery of some sort. Indeed, the modern commercial redneck is deeply concerned about retaining mastery over popular culture.

Millions of conservative Americans planned to rally in support of Phil Robertson by eating deep-fried, mechanically-separated chicken parts, because freedom!

By targeting gays and criticizing supposed black welfare fraud rather than calling for outright segregation, the redneck as portrayed by Phil Robertson offers a last stand in defense of mastery and the hierarchies created by white male privilege. Millions of Christian Conservatives, despite the fact that most Americans subscribe to Christianity in some form, feel persecuted by an onslaught of gayness and secularism, and they look to Phil Robertson to defend their way of life.

Robertson can retain cultural mastery over minority groups like gays and Democratic-voting blacks by verbally disparaging them in a mainstream publication, thereby diminishing their claims to mainstream cultural acceptance. His many Republican-voting, megachurch-going, Chick-fil-A-patronizing followers then vindicate his attempts at mastery by lining up lock-step in support of their bearded, reality TV Moses, to whom they look to lead them out of the Egypt that persecutes Christian Conservatives and into the Promised Land of shredded safety nets, low taxation, dinner-table prayers, and firearm proliferation.

Historically, this Promised Land has been the American South, and while the South as a region has never been immune to change, for white southerners especially, change has entailed a loss of mastery over various minority groups. They’ve therefore embraced change only with a fair amount of kicking, screaming, or ranting in GQ.

* See Craig Thompson Friend and Lorri Glover, eds., Southern Manhood: Perspectives on Masculinity in the Old South (Athens: University of Georgia Press, 2004), ix.

* See Trent Watts, ed., White Masculinity in the Recent South (Baton Rouge: Louisiana State University Press, 2008), 6.