Apr 19 2018
 

Well before he became a mass murderer, Timothy McVeigh was a boy who liked guns. He would shoot with his grandfather as a youngster and joined the National Rifle Association in the mid-’80s when he got his hunting license. Later, he joined the Army, where he had access to many more weapons. Those who served with him described him as obsessed with guns – this from a group with a generally higher interest in guns than most of the population. McVeigh had begun collecting guns by then and, reportedly, was subsequently in the habit of sleeping with one. He stashed them about wherever he lived and smuggled them on base while serving. A potential love interest dropped him because she found his all-consuming focus on guns tiresome. Guns were his passion, and he told fellow soldiers that he was concerned that the government would take them away.

After leaving the Army, McVeigh continued to work with guns – as a security guard and, then, selling them on the gun show circuit. He became active in gun-rights advocacy, sharing reading materials with those he knew, mixing with ideological extremists, and writing letters to the editor. He sent Rep. John LaFalce of his home state of New York a letter in an envelope stamped with the advertising slogan: “I am the NRA.” That was in 1992; later, he would disavow the group because he thought it hadn’t done enough to stop gun legislation that passed soon after.

In 1989, a man named Patrick Purdy had turned an AK-47 on schoolchildren at Cleveland Elementary in Stockton, CA. He killed five of them – all under the age of ten – and wounded thirty-two others. Purdy’s gun was a Chinese make, and President George H.W. Bush signed an Executive Order banning the import of assault weapons from China thereafter. Five years later, President Clinton would sign another Executive Order limiting imports of guns and ammunition from China. By then, Congress had responded to public safety concerns by mandating background checks through the Brady Act (1993) and passing the Federal Assault Weapons Ban (1994). The NRA was unable to stop passage of these bills, succeeding only in getting a ten-year expiration inserted in the prohibition legislation.

McVeigh was already riled then by the disastrous enforcement actions at Ruby Ridge, ID (1992) and Waco, TX (1993), which he saw as more than excessive enforcement leading to an inexcusable loss of lives. His rants against government tyranny weren’t about the 4th Amendment or Due Process. His concern was the threat to gun sales and possession. The assault weapons ban, he told his comrades, was the last straw. It was time to “take the fight to the enemy” with a retaliatory strike against ATF agents and others engaged in gun law enforcement. He saw himself as a crusader for the sacrosanct ownership of guns.

Of course McVeigh had a weapon on him as he fled the scene of the bombing. It was not then legal in Oklahoma to carry a concealed handgun, which prompted his arrest when a patrolman pulled him over for a motor vehicle violation. “My gun is loaded,” he warned the officer. “So is mine,” Trooper Charlie Hanger responded. In the end, gun regulation did get McVeigh. He missed the effective date of legislation authorizing concealed carry by just a few months.

The NRA that he thought too ineffective had been working for some time to get state laws passed allowing the carrying of handguns in public. In the ’90s and the first decade of the 21st century, it successfully pushed concealed carry legislation. Since then, the NRA agenda has included the ability to carry a weapon openly and the elimination of licensing and permit requirements. Twenty-three years after the bombing, citizens around the country travel in public spaces free to carry guns where McVeigh could not. Far from eliminating gun ownership – thanks largely to NRA lobbying – governments at the state and federal levels have made it more permissible to introduce weapons into public spaces.

McVeigh didn’t live to see the expiration of the assault weapons ban, which reopened sales of guns like the AR-15 that Nikolas Cruz used for his shooting spree two months ago at a high school in Parkland, FL. McVeigh owned an AR-15 too. NRA lobbying efforts have thus far blocked another ban on such guns – despite mass shootings in Colorado, Connecticut, Nevada, and any number of other places over the years. Instead of the gun-free society that McVeigh feared, our country is awash in handguns, rifles, and variants of deadly weapons. He didn’t need to martyr himself; the cause has been ascendant.

m[-_-]

Mar 04 2018
 

This is not a review.

In general, I gave up reading history books some years ago. The academic ones are so little worth the effort, and I am no longer compelled to “be up on the secondary literature” as I am no longer a member of the discipline, so it’s not necessary to skim them and assess their interpretations anymore. I’ve been training myself in the art of historical story-telling — something my academic training was at cross-purposes with — so if I have any interest in reading historical non-fiction, it’s of the popular history variety. I’m doing research in how non-academics put together their stories.

My most recent reading was Killers of the Flower Moon — a book that has garnered some praise and is apparently in the running for non-fiction book awards. Author David Grann is a journalist, and I was curious to see how he would approach the history. Also, the subject is one that I have peripheral familiarity with as I am from Oklahoma and have studied some Native American history.

Gliding through the book, I couldn’t escape the feeling that the story was disappointingly shallow. The surface tale was clearly researched, but there wasn’t complexity or depth behind it to provide a backdrop for the story. There was so much history about Oklahoma and the time period that could have been included and would have enriched the story greatly. For example, at one point Grann recounts the Osage’s desperate appeal to Senator Charles Curtis for help and protection. How much more poignant would that plea seem if Grann had developed Curtis as a character, exposing his assimilationist positions?

And, there was more. Grann seems to accept at face value that Governor Walton was legitimately impeached for corruption (thus supporting Grann’s points about impropriety in government in the state), but those knowledgeable about the subject understand the role of Walton’s anti-KKK activities in his removal from office. In a place where the governor is vulnerable to conspiratorial schemes for taking action against racial violence, what hope had the Osage — a racial minority — for justice?

Less ominously, why wasn’t Frank Phillips, founder of Phillips (66) Petroleum and a rich, powerful figure in the state, a resource for the people who granted him honorary membership in their nation? Grann describes the oil barons flocking to Pawhuska to bid on oil leases, but beyond that, he doesn’t examine their roles at all. Phillips was a noted collector of Native American art and artifacts as well as drilling rights, so his absence from the mystery is glaring. For someone who regularly welcomed Native leaders to his ranch and celebrated territorial society, his failure to intercede is telling. A historian approaching the subject would delve into such inquiries and recognize the value of interweaving these and other historical points (these few are just examples) into the story. Therein lies the difference between a journalist and a historian approaching the subject.

I don’t mean to single Grann out here. Indeed, the superficiality in his book is something I find common in history books penned by untrained historians. They know their immediate subjects — at least to a degree — but they aren’t steeped enough in other relevant histories to dig more deeply into their work. There’s more to extract (and it’s obvious to a historian), but they don’t even know what is there to mine. Sadly, now that Grann has popularized the story, no one will go back to treat it more richly. It’s a loss, and so typical of the lightweight histories that dominate the best-seller lists today.

More depressingly, the Osage murders remain a novelty this way — a curious story worthy of passing interest that never gets incorporated into the larger account of race relations, justice, and economics in the history of the State and States. Context would bridge the gap in this tale, but a journalist wouldn’t see the lack — and historians are too busy with their monographs and professional interests to bother. The result is perpetuation of the simple narratives that seem so indicative of US histories. There’s always so much more to the stories than the stories themselves.

m[-_-]

 Posted at 7:29 pm
Nov 28 2017
 

Jun 11 2017
 

Conservatives — and by this I mean those interested in preserving the status quo in our society — have lit upon a new-fashioned fig leaf to cover their racism; their disguise isn’t half as clever as they think it to be, however. Their pretext is to claim that they are defending history from those who want to obfuscate through monumenticide. Posing as preservationists, they oppose removing the monuments, site names, and other honors given to Confederate and racist figures from our past. Conveniently, this sudden dedication to public history blocks efforts to denude southern cities of honorific remnants of our slaver past.

In the New York Times, Gary Shapiro blames the dilemma on “deferred maintenance of history” — whatever that nonsensical jargon means — but what he is trying to get at is that racists and their sympathizers object to removing Confederate emblems because it targets the legacy of white supremacy in our society. Part of disavowing racism is dishonoring it. Conservatives reject that repudiation, which is really just a first step in pursuing true equality in our country. They don’t want change (indeed, they want to make our country “great” again), which means white supremacy remains.

Now, while conservatives imagine their position smacks of historicism, intellectualism, and post-racialism — and perhaps it does to those who are historically ignorant (like its proponents) — for those with any experience in practicing history, the bias is obvious. Sure, on its face, the traditionalists’ position claims to oppose historical denialism — sweeping the ugly parts of our history under the rug. Staters of this day — and in the future — should know that our ancestors celebrated and promoted racists, and certainly, in part, because of their racism. Richmond’s Monument Row lacks the instructive qualities that teach the past while condemning it, however, and it’s in this indistinguishability between honor and historical recognition that the conservative approach fails.

Assuming good faith on the part of some traditionalists, it seems obvious that their defensiveness lies in their naivete. They believe, like so many, that history is canon and that mastering it means learning facts about the past. This is not, however, what history actually is. It is a practice — of collecting evidence and putting it together in interesting stories that tell us things about ourselves. A row of statues itself is not a history. It is just a collection of evidence. An exhibit or public history display does more than present pieces for people to observe. They are selected, arranged, and contextualized (with accompanying commentary or through careful presentation) so as to make a statement and encourage learning. Monuments are not historical exhibits. They are honorary displays, and fail as cautionary lessons.

In his essay, Shapiro suggested adding instructive text and, perhaps, statues of slaves to Monument Row as a way of rehabbing the display so that it reflects contemporary values. Merely cluttering the space is not the answer, however. More importantly, it makes for bad history. Overwhelming the site with too many statues creates a historical junkyard — not a cohesive, instructive narrative. Further, it does nothing to rectify the moral problem: the Confederate figures would still be recognized, while symbolic Anyman slave figures would perpetuate the dehumanization of black Americans. Presented namelessly and generalized — once more denied the specific identities and singular significance of the white figures beside them — they would again be denied the privilege of individuality and the dignity of personhood. It’s likely that this kind of historiographical dilemma is unfamiliar to Shapiro and his like, precisely because they have no experience constructing histories. Unfortunately, such ignorance perpetuates discriminatory treatment.

Ours is not the first society to confront a repugnant past. The conservative element in our country does not see in removing Confederate monuments the populist toppling of statues of Vladimir Lenin or Saddam Hussein, however. The traditionalists still respect Lee and Davis; those men are not ignoble monsters to be rejected like the Communist dictator and Iraqi strongman, to them. In part, this is because conservatives are victims of the apologetic historiography they were raised in, which honored Confederate figures. However, those who want to reject racism must choose to repudiate those figures, and willfully failing to do so is a contemporary act of racial prejudice. Fundamentally, disavowing racism requires the dishonoring of these Confederate figures, specifically through their removal. Monument Row cannot be reconstructed. It must be dismantled. And, any deferral of that — especially in the name of history — is a great misfeasance.

m[-_-]

Feb 29 2016
 
In just over sixty years from first contact with European and white adventurers — less than the average life expectancy in the US today — the native population of Hawai’i declined by more than 75%. Much of this death resulted from the introduction of diseases to which Hawai’ians had no immunity. Sailors and well-meaning missionaries brought with them germs that decimated the established communities there, rapidly and mercilessly.
 
Once established, white settlers introduced political and cultural changes as well. They convinced even the king to embrace western ways, including individual land-ownership. In 1848, King Kamehameha III issued the Mahele — a decree that permitted Hawai’ians to own land, which was previously solely a royal prerogative.  Two years later, the Kuleana Act allowed foreigners to purchase real estate from native sellers. For the children of missionaries, this offered the secular option of pursuing agricultural endeavors rather than the Lord’s work, and those who were not called chose commercial farming instead.
 
Sugar was the premier cash crop, and two of the Big Five — as the biggest sugar producers became known — were started by these sons of evangelists (the others by various enterprising white men). The growers’ dominance of the economy led to equally significant political power, and they served as a de facto oligarchy controlling the Hawai’ian economy and society. Several were instrumental in the 1893 bloodless coup that overthrew the monarchy and led to the Republic of Hawai’i. They were again involved in the annexation of the islands by the US.
 
By 1920, only 24,000 native Hawai’ians remained on the islands, and only 10% of island real estate was still owned by these survivors. Today, just a quarter of the state population claims any native ancestry. The social, political, and economic order there has been completely upended and remade — in great part through the efforts of the powerful sugar companies.
 
Recently, Alexander & Baldwin announced that it is shutting down its last sugar plantation. After 145 years, the corporation — one of the Big Five, founded by missionaries’ sons Samuel Thomas Alexander and Henry Perrine Baldwin — is abandoning its once fertile agricultural pursuit. With the end of this year’s sugar harvest, the 675 employees of the once powerful company will join a workforce that has long since left field work behind — their skills as relevant in the 21st century as the intentions of their employer’s founders.

Alexander’s and Baldwin’s parents hoped to save and civilize the Hawai’ian people, while their sons hoped for the American dream — transplanted to an island paradise. The cost of it all was thousands of native lives and the end of many traditional ways and practices. What a stiff price for a mere 150 years of commercial success. It seems an utter waste that such sacrifice didn’t lead to more permanent structures and nobler accomplishments.
 
A bitter aspect of this “White Man’s Burden” is the cruel brevity that demeans the Hawai’ians’ horrible loss. For a few generations of wealth, the better part of a society was lost. The blow seems too great for the reward. The white men’s success was too dearly bought. Now that we can measure its duration and close the history, the brutality of its temporality becomes woefully apparent.
 
m[-_-]
 Posted at 11:27 pm
Dec 18 2015
 
The initial announcements were inconceivable — multiple planes had been hijacked and crashed deliberately into financial and military meccas on the east coast of the US — and, for a people unused to military attacks on their home country, terrifying. It was happening here, and our defense was caught unawares. The reality stunned, and the plan’s effect was perfect in all regards.
 
In the panic that came afterwards, it was clear that we do not understand ourselves and our own history. This was true on a micro level: New Yorkers earnestly and ignorantly repeated the promises and beliefs that survivors of the Murrah bombing had voiced before them. Outside of Oklahoma, people didn’t really understand what was in store for them over the next six months or year. Their unfamiliarity was rooted in historical ignorance of the aftermath in OKC. Determination seemed all that rebuilding required — unless you knew what had happened after the bombing, in which case, you just grieved.
 
Ignorance fueled responses on a larger level too. On the one hand, it tinged the tragedy with irony that a nation born when terrorists funded and encouraged by enemy nations overthrew a world superpower was now the superpower undone by sponsored idealistic warriors. Staters seemed oblivious to that. At the same time, so many questioned our poor defense and wondered why there was no centralized control over the agencies to whom these responsibilities were assigned. When it was clear that those in positions to protect the country failed to coordinate responses and even intelligence that might have prevented the attacks, the demands for change were potent.
 
So, fear of more terrorism drove people ignorant of their history to make sweeping changes that defied American principles and practices to that point. Until then, the dispersion of power and political decentralization were intentional. Our predecessors feared government by standing army and a military threat to civilian authority; they would not create a Napoleon from our republic. The national guard was divided accordingly, and means were taken to keep any single governmental agency or leader from having the opportunity to seize control through centralized power. Similarly, as federal authority over security matters expanded, its powers were again divided, and the agencies responsible were more often rivals than collaborators. Accordingly, even as the federal government swelled in size and importance, there were checks and balances — but the same conscious choices that prevented potential coups left us vulnerable to guerilla tactics.
 
Those who knew this history largely forgot it in the post-9/11 hysteria. Or they were willing to trade productive political constraints for emotional and psychological cover. But most Staters simply didn’t know better and, therefore, made no opposition as hundreds of years of precedent were suddenly undone. No one wondered if the protective shield promised by the new umbrella agency — the Department of Homeland Security — was also a restrictive cage. Sinclair Lewis predicted (It Can’t Happen Here, 1935) that dictatorship would come from welfare, but it seems more likely now that people would welcome it simply because they were afraid of outsiders.
 
The antidote is, of course, education. Americans fear Islamists particularly because they don’t understand them; they do not fear homegrown killers the same way. Racism is certainly a component of this fear as well, but, again, the way to overcome that is education. The best protection is informed understanding and educated responses.
 
And, of course, better historical education might have given us pause to consider whether it was American to centralize domestic defense in the aftermath — and, if not, whether we really wanted to reimagine what “American” meant. But we were afraid, and when we fear, we don’t think historically. This is how frightful events spur unconscious social and political breaks, born of angst and ignorance.
 
m[-_-]
 Posted at 10:16 am
Nov 24 2015
 
      “A crisis is something people must live with until change
       has occurred and stability is restored.” — Dan Nimmo,
       TV Network News Coverage of Three Mile Island: Reporting
       Disasters as Technological Fables
 
In the pre-dawn hours of March 28, 1979, a pressure valve failure in the Three Mile Island nuclear facility’s Unit 2 reactor precipitated the worst accident of its kind in US history. The reactor had been dedicated just a year earlier (an expansion of the Pennsylvania site’s capacity) and lauded as a superlative achievement in nuclear power. In the waning years of the Cold War, when the economics of the energy industry left the nation badly bruised, the atomic threat we had so feared promised a cleaner, cheaper electricity source — and the plant’s champions intended it as an example of the wonder of harnessing nuclear power. Yet, that spring, the cutting edge facility seemed just a source of an uncontrollable radiation hazard.
 
Naturally, the public was greatly alarmed when word of a core meltdown broke, and press coverage stoked the alarm. Just a couple of weeks before, The China Syndrome, a fictional movie about an accident at a nuclear power plant, had premiered. The tense drama suddenly seemed true to life, and public anxiety about nuclear power mushroomed accordingly.
 
When Pennsylvania Governor Dick Thornburgh called for the evacuation of pregnant women and small children from the area near the plant two days later, concerns escalated even more. A number of local residents emptied their bank accounts and fled, while the media reported that a hydrogen bubble in the reactor threatened a giant explosion. A public raised in the shadow of the atomic bombings of World War II and the threats of the Cold War nearly panicked.
 
As chance would have it, though, then-President Jimmy Carter was a trained nuclear engineer with field experience from a similar incident in Canada. In the weeks after the accident, he personally received twice-daily briefings about conditions at Three Mile Island from Harold Denton of the Nuclear Regulatory Commission, complete with technical updates on the situation. Carter had confidence, based on his expertise and the information received, that there was no substantial threat to the public, and he didn’t bother to address the subject to the media beyond some cursory initial comments. He proceeded with his regular political appearances even as news agencies focused on the “nuclear nightmare.”
 
By the fourth day, Carter’s staff suggested that a calming act would be highly beneficial (implicitly treating the accident as no threat had not assuaged fears), so on April 1st, Carter — with his wife, Rosalynn, beside him — flew to the plant and toured it with reporters in tow. Afterwards, the President spoke for a few minutes to a crowd at a nearby school gym and shook hands with some locals. The message was clear: there was no real danger if the President himself (and the First Lady) would go there to visit.
 
That same day, experts declared the threat over, as the hydrogen bubble inside the reactor shrank and the danger of an explosion was dismissed. The immediate crisis had ended, but there would be long-term fallout from the accidental stoking of public anxiety. New practices and requirements were introduced to appease anxious citizens — from upgraded and expanded protective equipment and monitoring devices at sites to additional training for staff and resident regulators assigned to nuclear facilities, reporting to a 24-hour central operations center. At Three Mile Island, Unit 2 was decommissioned and never operated again.
 
These attempts to reassure the public had limited results. Opposition to expanding nuclear power facilities in the US remained high afterwards — lasting almost two generations and effectively curtailing many new construction projects until recently — though stockpiling nuclear weapons continued routinely. This year, regulators have okayed operation of a new plant (which, tellingly, is a government project through the TVA). If successful, it will join about sixty other operational plants around the country — an indication that though we fear the source, we continue, perhaps reluctantly, to coexist with it anyway.
 
m[-_-]
 Posted at 11:49 am
Oct 30 2015
 
From the moment the mushroom cloud erupted at Hiroshima, leaving rubble and disfigured bodies in its wake, humans came to fear a peril of their own making. Then, as the Cold War set in, fears of an atomic attack escalated. Citizens who could hardly understand the lethal power of the weapon were forced to live with the daily threat of a nuclear event. In 1950, President Harry Truman authorized the Federal Civil Defense Administration, which undertook a campaign to help Americans feel safer living with that risk.
 
The agency put out educational literature and, most famously, the Duck and Cover film that millions of schoolchildren watched as part of their “survival training.” Youngsters drilled to ensure they would be ready for the looming attack that they were warned could come at any moment. In the film, children were depicted seeking shelter while riding a bike, walking down a street, playing on a playground, and sitting in their classroom. The message for the young audience was that they were never free from the threat — even at their most playful moments.
 
Ostensibly, the purpose of this material was to make people — and especially kids — feel safer. The government intended to give the public confidence that they could survive a bomb blast, even if it was a false hope. The information and drill instruction offered action to combat the fear that the new reality of nuclear war engendered.  But, the effort became less of a balm than an accelerant.
 
A generation of schoolchildren were regularly subjected to reminders that a horrible attack could be imminent. That awareness served as a constant anxiety absorbed into the background of everyday life, along with the other marvels of technology that shaped post-war culture. The Bomb was a looming peril belying the security of suburbia, and the regular school preparedness drills hammered that reality into impressionable young minds.
 
The fear of their fear and the need to remedy it created its own hysteria — a new Red Scare and atomic anxiety. A generation raised on bomb warnings and drills naturally responded to the angst about a nuclear attack that never came. That fear helped define them and shape their choices. They were markedly different from the generation that followed the War to End All Wars — no less hedonistic, perhaps, but more expectant that they would be protected.
 
Preparedness drills and civil defense materials probably fed the illusion that they could be kept safe, even as it made them afraid. The government stepped in to reassure them — but not to act to de-escalate the Cold War and reduce the likelihood of a nuclear war. In essence, the political leadership was more concerned about pacifying the public than averting the threat.  Fear of the people’s fear motivated them rather than the reality of their enemies’ power, and it took some time to acknowledge the insanity of that position.
 
Bizarrely, then, the nuclear age spawned government-as-protector even as it was also the cause of the dread. That the state would develop and employ these weapons is not surprising, but its effort to serve as public comforter while doing so certainly is. Thus, even as it fueled anti-communist hysteria, it tried (poorly, and with the opposite effect) to assuage citizens’ fears of the nuclear menace. Which, then, was the political leadership’s real fear — and what did they do to us?
 
m[-_-]
 
 Posted at 10:03 am
Sep 30 2015
 
A mild-looking white man with thin hair, Eugene Debs addressed the crowd in Canton, OH on June 16, 1918 with conviction.  Working men, he said, were the ones fighting wars, but they never had the say in making them or in settling the peace.  “You have your lives to lose,” he told them, “you certainly ought to have the right to declare war if you consider a war necessary.”
 
Pointedly, he then went on to praise activists who had been jailed for opposing war, an undertaking Debs said the rich and powerful had committed the country to in their interests, while the working class did the fighting for them.  Afterwards, Debs was arrested too for running afoul of the Sedition Act of 1918.  The law made it illegal to attempt to obstruct recruiting and enlistment efforts for World War I (equating it to disloyalty, abuse of a military uniform, and flying another country’s flag). Not for the first time, Debs’ case went all the way to the Supreme Court, which held that his speeches did incite opposition to serving in the armed forces.
 
The Supreme Court’s ruling confirmed the legislative and executive branches’ established limit on citizens’ rights to speak out.  Words became seditious threats in and of themselves.  Accordingly, it was permissible to repress speech and the sharing of ideas that was contrary to government aims. It’s ironic that the fear of words and hostility to exposing others to political thought established here came under the administration of Woodrow Wilson, a scholar and proponent of education.  Obviously, the justification was the threat posed to the Republic.
 
In the previous century, the US almost broke apart on two occasions over government policy.  Both issues had significant economic effects on the opposition — the tariff and slavery.  In response, Andrew Jackson threatened to hang any treasonous actor who took up arms against the Union, and a compromise to resolve the tariff and nullification dispute without violence was reached by Congressional leadership instead.  When rebellion rose again later, Abraham Lincoln reluctantly accepted that military force was necessary to end it — and, eventually, slavery.
 
In the 20th century, Wilson faced a different dilemma, which was no less about economic impact. Controlling the working class was essential to the economy and the government’s ability to wage war. However, this time, there was no rebellion.  Debs and other activists were not trying to secede or overthrow the government.  What they wanted was to shift policy and to create a democracy more responsive to workers and their needs.  Theirs was a policy battle — not a physical one.
 
But the federal government — the whole of US society, even — was in a different place then. Police departments that did not exist under Jackson now patrolled city streets. Bureaucratic agencies unknown to Lincoln — designed to control where and how citizens lived, worked, and raised their children — were available to Wilson, giving him the ability to shape cultural ideology in a way that many previous presidents could not. Washington did not and could not control access to medical information, citizenship entitlements, and employer-employee relations the way Wilson could (thanks to the post office, the Treasury Department, the Bureau of Labor, etc.). The world had changed — the country and its government with it.
 
Part of this change grew out of a fear of immigrants and of the differing ideas and values that arrived with industrialization.  As the populace became less homogeneous in kind and more stratified economically, ideological differences multiplied and traditional appeals no longer sufficed. Administrative enforcement, rather than persuasion, became the favored tool for social control (think of the post office policing lottery tickets or birth control information sent through the mail).
 
Justification for these bureaucratic means then became contestable too. Undertaken ostensibly for societal benefit, these acts opened the door to other arguments in favor of the public good (trumping individual rights).  They encouraged popular democracy and social activism. Accordingly, words — speech that could coordinate the ballot and organize effective resistance to policies favoring the established elite — became threats to 20th century leadership.  The voting public had changed, as had the means to control the citizenry (including oaths of allegiance).
 
John Adams signed his Alien and Sedition Acts in 1798, and it cost his party the next elections.  Congressional support dissipated thereafter, and the tide turned toward anti-Federalist positions.  When Thomas Jefferson replaced Adams as President, he pardoned those who’d been jailed under the Adams administration. With the opposition freed, the backlash against the suppression of speech shaped political policy and discourse thereafter.
 
A similar political shift could have threatened Wilson and other proponents of policies opposed by the more numerous working-class voters. That prospect did not deter the political leadership of the time from undertaking political repression anyway. If the approach was a repeat, the tone was significantly different. Stifling vocal opposition became a tactic that, while not new, expanded with darker effect: excluding participants from political conversation as the face of democracy in the US shifted.
 
No longer was the conflict between white men of property with differing philosophies.  Now, it was traditional WASP culture versus those whose resistance to wealth and power had erupted in protests, riots, and armed conflicts in cities across the country in the late nineteenth and early twentieth centuries (for example, the mine wars in West Virginia and Colorado).  It would take far more (and hardly justifiable) police effort to keep these new democrats down, and thus it was far more important to control the political dialogue, forestalling opposition before conflicts required the intervention of federal troops. What’s more, avoiding a military response was especially preferable when engaging those who held the democratic high ground.  No one understood that better than a president who had suffered the PR nightmare of force-feeding suffragists arrested because they wanted to vote.
 
In previous times, political leaders had coopted the support of the poorer elements through the ideology of white supremacy.  It divided workers with common economic interests by social status — and even some economic benefit. The ideology of the socialists, the unions, and their like threatened that coalition of whiteness. As such, it threatened the whole social and cultural order, not just political programs.  Wilson, a champion of segregation as well, pressed tactics to stop these threats to the status quo.
 
For a while, government witch hunts did stifle political opposition, as the country was enveloped in the hysteria of the Red Scare. In time, though, the frenzy ebbed, and the Supreme Court would later reset the boundaries of free speech.  Still, the labels and the hostility to particular ideologies pressed by Wilson and his contemporaries linger even today.  The government may not lock up Socialists like it used to, but it doesn’t have to, thanks to the pervasive anti-left bias it seeded so many years ago.
 
m[-_-]
 Posted by at 3:01 pm
Sep 02 2015
 

In the daylight on September 3, 1885, the survivors crept down from their hiding places in the hills. Their homes burned by a white mob the night before, the surviving Chinese mine workers had no safe retreat. They scattered across the Wyoming countryside, sobbing behind brush and rocks at what they’d seen and praying for help. Their employer — the Union Pacific Railroad, for whom they dug coal to run trains — wired local stations for engineers to stop and pick up survivors along the tracks. Deposited one hundred miles away in Evanston, the scores of rescued Rock Springs miners took refuge with the community of Chinese workers there.

Back home, their houses had been looted by white citizens — including the schoolmarm who taught them English — before they were burned. Surveying the damage afterwards, company representatives found the half-charred bodies of victims trapped in the burned-out company housing and mutilated corpses in the streets. A few they buried; the rest they left for dogs and other animals to pick over. Twenty-eight was the official death count. Another fifteen were wounded, and property damage ran well over a hundred thousand dollars. The survivors lost everything they had worked to build, crammed eight and nine to a house to save on rent — a loss made all the more bitter because they had purchased their monthly supplies at the company store just the day before the riot. Their full provisions were lost to looters and arsonists. They had no food or supplies left to rescue from the ashes.

Afraid for their lives, they did not want to return for the remnants anyway. Instead, they appealed to the railroad for tickets to leave the territory and for the two months’ back wages they were owed to start someplace new. The company declined. It had brought them in as cheap labor to undercut unionized white miners and was determined to retain its workforce. So the survivors lingered in Evanston, where they acquired weapons to protect themselves against further attacks from the armed white mobs forming elsewhere in the area. Federal troops finally arrived to preserve a tense peace, though everyone feared another massacre would erupt. Finally, the railroad relented: the 600 Chinese men were loaded into boxcars to be conveyed safely to San Francisco, far from the hostility of the Wyoming mines. After just a short ride in the dark cars, however, the train stopped and the doors opened onto the ruins of Rock Springs.

The boxcars were the survivors’ immobile homes for the next days. Stranded against their will, the workers resisted their bosses’ demand that they return to work in the mines. In the meantime, the company provided them emergency provisions and clothing, and the army provided them protection. Afraid of suffering further violence and angered at being tricked, the men held out against their employer’s wishes. During the days, they loitered nervously; at night, they reported, they were troubled by “frightful dreams” and slept poorly. In desperation, sixty of them took off into the wilderness to make their own way. After a few days, in order to force the remaining survivors back to work, the company cut off their supplies. Desperate need drove them again into the mines, anxiety over further violence from white residents compounding the strain of their labor. In addition to the usual fears of accidents and work hazards, they dreaded daily another attack from their coworkers.

This is how they rebuilt the “Chinatown” at Rock Springs. With a cloud of fear overshadowing them, they worked the mines and restored their community. Specialty stores and services slowly re-established themselves after new company housing gave miners stable residences and some grounding. Federal troops stayed for thirteen years to prevent more violence, their outpost situated between the town’s segregated racial communities. White miners returned to work too, and no one was ever prosecuted for the murders, looting, and arson that had occurred. The Chinese workers went into the dark mines every day alongside whites they knew had brutally murdered their friends and neighbors. Tense productivity that served the railroad became the town norm, and the Asian immigrants, who could neither leave nor gain legal equality as citizens, were thus made an involuntary sacrifice to the economic boom that lured so many to the Land of Opportunity.

m[-_-]

 Posted by at 2:15 am