“Prefers Trump Republican Party”

Elephants 005 - Royal Hanneford Circus - Westchester NY - 2013-02-16

by Croak,

I just opened up my state’s primary ballot. Scrolling through the grocery-list-length roster of gubernatorial candidates, I see that two of them “prefer Trump Republican Party” (my state uses a “top two” rule regardless of party affiliation), while another listed themselves as “prefers Pre2016 Republican Party”.

I don’t know what this signifies other than that any nut can run for governor in my state and scribble whatever party preference they wish on their application. We already know the Republican party has a fissure between the Never Trumpers and the double-down-on-crazy wings. That fringe candidates are officially declaring allegiances through customized party affiliations is no sign of a deepening split.

After all, Never Trumper resistance to Trump has been like a paper doll standing up to a charging elephant; one needn’t trample what lies down on its own. No, if the Trump prefix indicates anything at all, it’s that his followers have no end of grievances even with their own party. But if anyone can rustle up a more docile herd of pachyderms than our current Republican party, then crate ’em up and ship ’em to th’ circus, because a tamer group of animals does not exist.

Still, Trump followers can’t stand for a single trunk straying from the line. And liberals are the party of grievances and woke intolerances?


by Croak,

The beauty of the English language is its capacity to formulate new compound words.  This is particularly useful when applied to expletives.  For circumstances do call for terms stronger than our usual cast of tabooed words like fuck and shit.

That is how we end up with improvised combinations, from the unoriginal but still effective fuckshit (as in “who’s this fuckshit?”) to the many chemistries with high-valence words like face (shitface, fuckface) and head (shithead, dickhead).  Some compounds break a bond before forming a new molecule, as in fucktard; others are kinetically bound, like asswipe.

Still others are seemingly oblique yet sit in a metastable state of logic due to their originality, like dipshit, asshat, and Devin Nunes.


A douchewad posing as a congressman.

Look at this face.  Many have pointed out the punchability of this face, and personally, I cannot in good conscience wish for a high-velocity fist to make a Tunguska-like event of it, but if such an event were to occur, I hope it’s on YouTube.

To me, this mug begs for one of the many face-based insults one could formulate.  However, the previously mentioned shitface and fuckface won’t quite do since many smaller assbags have been assigned those appellations.  One could try lesser-used variations: twatface, cumface, doucheface, but they also fall short.

Speaking of douche, the face also invites the many -bag variants of insults: scumbag, shitbag, and douchebag itself.  But again, when so many less deserving, from bad bosses to rude drivers, have been called the same, can we really fire those spitballs at this face?

This face opens an opportunity for me to advocate a cause that has been a passion project of mine for the last twenty minutes: wad, or more specifically, the many -wad compounds that could form the basis of a whole new branch of profane chemistry: shitwad, dickwad, fuckwad, asswad, twatwad, frackwad, douchewad, cumwad, snotwad, fartwad, suckwad.  Any one of these frankeninsults could apply to that face.

After all, the word ‘wad’ evokes images of large rolls of dirty cash, three packs of overmasticated gum in the mouth of a callous teen, and the ejaculatory fluids of a grunting internet porn star who, from the inside of a van, waved a fan of cash to a buxom blonde pedestrian who happened to be walking by, wearing a mini-skirt and a belly shirt that revealed her navel piercings, and within fifteen minutes found her face smothered by the man’s shooting wad.  Because really, when you peer into the saggy cheeks of this dimwad’s face, the same face that sued a Twitter cow and is now suing CNN for $435M, what else comes to mind?

Fredster’s Millions

by Croak,

Trump is making strong inquiries into the purchase of Greenland from Denmark, a move that is somehow both surprising and predictable.  For what could be more Trumpian than overpaying for a piece of property that the original owner is struggling to maintain just to emboss his name in gold on it?

I’m going to offer an alternate reality that makes as much sense as our current one:  Donald Trump has been, in fact, living a real-life reboot of the mostly forgotten 1985 film Brewster’s Millions.

The movie starred Richard Pryor as Montgomery Brewster, sole heir to a wealthy but eccentric relative who stipulated in his will that Monty would inherit $300 million only if he could spend $30 million within 30 days.  Of course, there were restrictions: Monty could tell nobody about the bizarre stipulation, could lose only a certain percentage to gambling and charity, and could keep none of the assets purchased with the $30M.

The drama in the movie centers on Monty’s attempts to burn through money while retaining nothing: renting expensive cars and clothes, hiring out the Yankees for an exhibition game, getting into bad investments.  Monty even runs for mayor of New York and is about to win when he decides to pull out rather than face a $60,000 “penalty” in the form of the mayoral salary.

It’s unfortunate the original movie wasn’t a success, though its current Rotten Tomatoes score, which sits below 40 percent, may be harsh.  The story that Brewster’s Millions could tell today is just how hard it is to become not rich once a person is rich.  Think of all the potential “horrible investments” that sound like actual successful (at one time, at least) tech companies: an app that only messages “hey bro”, a machine that performs dozens of medical tests from a single drop of blood, a purely digital currency.

If my alternate reality were true, we would now be at the part of the movie when Monty invests in iceberg towing, a scheme that involves hauling icebergs from near the North Pole to be sold in the Arabian desert, an idea that has been floating around for more than a century and is now being seriously considered as a solution for drought.

Of course, our main character, Donny, is the living embodiment of how the rich can’t get poor, no matter how incompetently they manage.  And, of course, any modern reboot needs to subvert expectations.  Hence, in our version, Donny is already president, and rather than selling the icebergs to Arabs, he will probably give them away to the House of Saud in exchange for the rights to a new Trump Tower in Riyadh.  For if Trump is to succeed in losing enough money to inherit even more of Fred’s secret millions, what better way than to “invest” in another Trump property?

Bellum Omnium Contra Omnes

by Ribbit

Enough already.  Enough with the pretense of the sportsman’s semiautomatic rifle, as if it were the harmless plaything of the weekend warrior, or the needful implement of the noble hunter.  The AR-15, like other rifles of its kind, is neither of these things.  It is an instrument of war.

Yes, the AR-15 can be used for hunting.  So can a bolt-action rifle.  The former was specifically designed for infantry combat.  The latter was not.  Nobody is calling for a ban on bolt-action rifles, just as nobody is calling for a ban on cars, steak knives, or baseball bats.  These latter items were designed with specific uses in mind, none of them peculiar to soldiering — unlike the AR-15, which has no other legitimate purpose, and was never intended for civilian use.  Even Eugene Stoner’s family admits as much.  We already forbid the civilian ownership, sale, and possession of machine guns, which suggests a bygone recognition that the radiance of the Second Amendment extends only to some definite limit.  (Yes, I am aware that technically, you can legally own a machine gun manufactured before May 1986.  But the hurdles, both in cost and investigation, are very high indeed.  And the prohibition is otherwise absolute.)

The historical provenance of the Second Amendment — so ably discussed here — has long since lapsed out of memory.  It is a fossil buried deep within our founding mythology, invisible to modern sensibilities.  It’s high time that fossil was excavated, and exhibited to public view.  Its revelation would do little to persuade the paranoids, I’m sure, but it would provide some much needed context for debates about the meaning and application of this amendment.

Maybe that’s too hopeful.  Maybe rational public debate about controversial issues is no longer possible.  That is certainly true for the extremists, who fetishize military-grade weaponry and will never cede so much as a hairsbreadth on that score.  But happily, democracy does not require that we persuade everyone, and a substantial majority of our fellow citizens already believe that more restrictive gun legislation is both necessary and proper.  The problem isn’t so much a lack of public will, as the absence of its representation within Congress.  That is partly by design, of course — the Founders mistrusted fluctuations of public sentiment — but it is also because people like Mitch McConnell and his ilk have worked so very hard to enshrine minority rule at all levels of government.

Opponents of gun control like to point out that the vast majority of gun owners are law-abiding citizens.  That is certainly true, but it is also irrelevant.  More relevant is the easy availability among civilians of weapons that properly belong only in the hands of soldiers.  We already restrict ownership of hand grenades, and nobody is clamoring for those, just as nobody is clamoring for the free sale of operational tanks or anti-aircraft ordnance.  Why is it, then, that a vocal minority in this country has become so entirely fixated on possessing military configurations of semi-automatic rifles?  Perhaps that question answers itself.  These weapons are unnecessary for self-defense or hunting; indeed, they can have but one purpose.  I, for one, believe that purpose should be limited to the application of military force against our enemies during wartime.  That others feel differently, and justify their radicalism by appeal to paranoid fantasies of paramilitary insurrection, is frankly rather frightening.

I am well aware that the legal prohibition of so-called “assault weapons” would not stop all tragedies like the ones in Dayton and El Paso.  But it would undoubtedly help.  It would remove one of the easier options from a prospective killer’s repertoire, and deter the merely impulsive, deranged, or opportunistic, whose rages tend to be situational and transient.  Opponents of such legislation impose an impossible standard by insisting that it must end all killing everywhere at once; if that is the measure, then we may as well do away with our most ancient and venerable prohibition against murder itself, which is surely one of the least successful laws ever devised.

In any case, I don’t recall anyone raising such objections when the government finally started regulating the sale of ammonium nitrate.  People could see the logic then.  Now, they shut their eyes.  They shut their eyes against images of unutterable suffering and grief, because they know such images form an irrefutable challenge to their peculiar ideology.  Forced to choose between human feeling and inhuman conviction, they cling ever more desperately to the latter.

The calculus is simple.  We stand to gain far more in human life, than would be lost in liberty.  Let the scales of judgment tilt towards pity.

Canonical Cant

Grievous acts of lexical buzzkill, recorded.

by Ribbit

Corporate brand, n.

A numinous totem of the C-Suite mystery cult.  Like the Trinity, it embodies irreconcilable contradictions among inscrutable concepts.

Corporate lawyer, n.

Plausible deniability in human form.

Corporate officer, n.

An extravagantly priced pneumatic apparatus for recirculating stale air.  Frequently installed to ventilate board rooms and business class cabins.

Corporate strategy, n.

1. Reasoning backwards from shareholder prosperity to workforce penury.  Typically the purview of the Chief Financial Officer.
2. The name given to officer-level meetings involving metaphysical discussions of the corporate brand.
3. (archaic) Long-term planning for the purpose of achieving optimal allocation of corporate assets and avoidance of obstacles to success.

See also: Make-work

Press release, n.

Disclosure calculated to conceal.

Soundbite, n.

The smallest possible quantum of mendacity.

Public Relations, n.

The art of dodging and lobbing in one smooth motion.

See also:  Press release, soundbite, shell game

Performance review, n.

Praise followed by parsimony.

See also:  Ritual scarification.

Small talk, n.

An exclusive claim laid to silence owned in common.

Human Resources, n.

Deadwood charged with holding the corporate matchstick.

Fox News, n.

Undead air. It hungers for what its prey can scarcely provide.

See also:  Zombie apocalypse, conservative talk radio.

White paper, n.

The fossilized remains of a previous deadline.

See also:  Doorstop, shelfware, flyswatter.

Elevator pitch, n.

Compressed air released in an enclosed space.

Extroversion, n.

A vivid camouflage rendering the wearer visible to all but himself.  Essential gear for executives, salesmen, and politicians, who must scrupulously avoid falling prey to introspection.

Introversion, n.

A form of leprosy symptomized by inward reflection and an outward aversion to frivolous speech.

Optimism, n.

An account credited from the arrears of reason.

Pessimism, n.

Optimism that has paid its debts, often with considerable interest.

Org Chart, n.

A map to trace the lineage of ill-conceived notions and misbegotten plans.

Agile development, n.

Navigating cyclones by means of the raised and wetted finger.

Reduction in force, n.

The ritual expiation of management sins through the periodic expulsion of a few village goats.

See also:  Layoffs, downsizing, corporate strategy.

Subcontracting, n.

Incompetence by proxy.

Management consulting, n.

Error by proxy.

Management consultant, n.

One who delivers reports to his employer in his employer’s handwriting.

Pivot, v.

Retain investment capital by discovering new ways to spend it.  The principal mode of innovation at startups.

Internet of Things, n.

An international cybersecurity employment initiative.  It calls for the introduction of new vulnerabilities into old appliances, by means of code written expressly for that purpose.

See also:  #NotMyToaster, #NotMyRefrigerator, #NotMyCoffeePot

Data science, n.

A category error masquerading as statistics.

See also:  Big Data, machine learning, computational astrology

Intellectual property, n.

The legal sanction given for the extraction of rent from one criminal enterprise by another.

See also: Nathan Myhrvold

TV Cult: Eat, Drink, and Be Weary for Tomorrow We Sigh

by Croak,

As Game of Thrones nears its climactic battle at Winterfell, the first two episodes of Season 8 have been filled with reunions and emotional setup, which can only mean that fan favorites are sure to die in the coming episodes. It’s a formula true to the buzz-seeking ethos of the show in its later stages.

Early in GoT‘s run, and staying true to its source material, the show had genuinely shocking moments. Children tossed from towers, heads rolled from noble heroes, blood poured from weddings. Nobody was safe from the carnage, but the carnage was well calibrated by the careful planning of George R.R. Martin. The fates of characters we loved and hated were written into a cosmic astrolabe, hard to predict but possessing a beautiful natural order, as if the plot were governed by historical laws the reader wasn’t privy to. If the events seemed shocking, the fault was in our geocentric model of the universe, where the heads of noble heroes are not mounted on spikes to be sneered at by pissy teenagers. Martin didn’t subvert expectations simply for shock value; fantasy realism was the core of his opus, and the consequences fit the actions that produced them.

Around Season 4, D.B. Weiss and David Benioff, GoT‘s showrunners, began their long wobble away from Martin’s books: Ramsay Bolton rose as a major villain, the Martell storylines were minimized (and the leftovers made into a bad cartoon), background characters such as Bronn were moved to the foreground, and Lady Stoneheart was dumped.  The show grew increasingly reliant on spectacle to create Twitter chatter, trying to mimic the natural shocks that Martin wrote so deftly.

Some, like Hodor’s death, had a touch of genius; others, like Hardhome and the Battle of the Bastards, were at least well-crafted even if illogical; but yet others, like the destruction of the Sept of Baelor, Arya’s battle with the Waif, and the deaths of Ramsay Bolton and Littlefinger, were sloppy affairs that pandered to the creation of Big Events.

“A Knight of the Seven Kingdoms” showed how hollow the show has become. “Last night on Earth” tropes are an opportunity to uncover the true nature of characters, for them to reflect upon their journey and to feel the very texture of their existence; when done right, they burn with life. Yet Episode 2 was laden with an hour of back-to-back conversations full of rue, people sitting in darkened rooms brooding over their fates.  Even the show’s namesake scene, one with great symbolic potential, was slathered in an unhealthy serving of treacle.  Compare the episode with the dynamism of the Red Keep scenes in “Blackwater” (Season 2, Episode 9), thick with tension and intrigue, still etching the ragged edges of its characters. Now, the show holds about as much suspense as the NCAA tourney, complete with office pools betting on who will survive and who will sit upon the Iron Throne.

The show’s recent failings stem from a lack of interest in the intricate machinery that made the stars dance in earlier seasons, in understanding the subtle interplay between character and the course of events, action and reaction; instead, it has attempted to mimic earlier outcomes by relying on shock and the creation of Twitter-worthy memes. Episode 3 is likely to produce many more of them with the Night King and the army of the undead come to besiege Winterfell. The invaders may find, however, that Westeros is already occupied by zombies.

A Failure Wrapped in a Folly inside a Farce

by Ribbit

There’s been a spate of recent articles accusing progressives like Elizabeth Warren of rank partisanship for proposing the radical and dangerous idea that maybe, just maybe, we should dismantle that ridiculous anachronism called the Electoral College.  I don’t bat an eye at the hypocrisy on display — it comes with the territory — but I do have to wonder at some of the reasons these so-called “intellectuals” supply in defense of this undemocratic monstrosity.

I hold to the view that an election is about discovering, and enacting, the will of the people; and that overturning this will, which is the true wellspring of any legitimate government, requires some rather strong justifications.  Most of the opposition literature I’ve read fails to surmount even the most basic hurdles in this regard, relying as it does on an absurd application of the notion of “states’ interests.”  This concept remains vital even among avowed libertarians like Kevin D. Williamson, for whom such collective categories ought to be essentially meaningless.  From what I can gather, a proper consideration of states’ interests is supposed to prevent a catastrophe known as the “tyranny of the majority.”  This means conferring four times the voting power on people living in Wyoming as on those living in California, thereby halting our collective lapse into that majoritarian horror show in which power-mad urbanites subdue helpless country folk to their arbitrary will.  There’s no real explanation for how this subjugation is supposed to work — it’s not as if Californians would have any say about where or whether some Wyoming municipality builds its bridges — but the image conjures all the requisite fears in the minds of its intended audience.

I have to say, this is one of the strangest and most perverse arguments I’ve heard in a long while.  It’s a direct descendant of the kinds of rationalizations mustered by fusty antebellum Southerners in defense of their “way of life” below the Mason-Dixon line.  In fact, the only real alternative I can see to the “tyranny of the majority” is its literal opposite, the enshrinement of one or another species of minority rule, which is surely the less defensible option.  I agree that, at least regarding important decisions affecting the entire body politic, narrow majorities are less desirable than large ones, but that should make outright minority rule even more repugnant.  Yet minority rule is precisely what people like Williamson are defending.

The office of the president is a national office.  She represents the nation as a whole, not some parochial subdivision, so I can’t see why we should privilege the least populated states in the process of electing her.  To invoke a faintly libertarian argument, the people have voted with their feet.  They’ve decided they would rather live in New York than Alaska.  It makes no sense to punish them for making that choice; after all, they’re only doing the rational thing, by migrating to where the opportunities are.  Shouldn’t we be encouraging them to do that, instead of rewarding the losers who languish in some economic backwater like Arkansas or Kentucky?  Why bestow such disproportionate political power on people who consistently choose stagnation over dynamism, failure over opportunity?  I’m not a libertarian, so I can’t seriously endorse this argument, but it makes about as much sense to me as anything else they’ve ever said.

Consider that the founding fathers intended the Electoral College as a bulwark against precisely the outcome of the last election, in which a dangerous demagogue ascended to the nation’s highest office.  Consider also that it made that outcome possible in the first place by robbing the majority of their own, infinitely wiser, choice.  Failure, thy name is Article II, Section 1.

I know, I know.  Warren’s proposal for a constitutional amendment abolishing this idiocy signals her totalitarian ambitions, her contempt for tradition and the rule of law, her radical zeal for corrupt mobocracy — or something.   I’m sure she’d be twirling her moustache if she had one.

Film Cult: The Moral Maze of Chinatown

by Ribbit

“Forget it, Jake.  It’s Chinatown.”

So says Walsh to his boss Jake Gittes, the private investigator at the center of Roman Polanski’s eponymous masterpiece.  That final line has provoked much thoughtful commentary in the years since the movie first opened in 1974.  Is Chinatown a “state of mind”?  A symbol of irredeemable corruption?  Or merely a personal reference to an unfortunate incident in Jake’s own past?  It is all of these things, and more.

The story has all the lineaments of classical Greek tragedy:  the dogged hero in pursuit of truth, whose actions bring about the very horror he is striving to prevent.  Perhaps the only real quibble with this claim is that a fifth-century Greek would have known precisely which horrors awaited the protagonist, whereas Chinatown hews exclusively to Gittes’s limited point of view.  There’s no real sense of dramatic irony here; we know nothing Jake doesn’t.  The audience muddles alongside him as he struggles to weave emerging facts into an increasingly sprawling web of conjectural narrative.

Of course, it’s not just the facts themselves, but their proper interpretation that eludes him.  He discovers clues, but lacks the context to give them meaning.  He interrogates various players in the narrative, but can’t understand what they are saying.  A key piece of sartorial evidence shimmers provocatively in front of him right from the beginning, but he is unable to grasp its significance until far too late.  He spends most of the movie trapped in a kind of semantic limbo, and purely for want of the right kind of imagination — his basic decency prevents him from speculating beyond the bounds of conventional venality.

This incapacity makes him rather ill-suited to confront the peculiar quality of evil embodied by Noah Cross.  It’s true he deduces, correctly, that Cross himself orchestrated the scheme involving the Water Department, but he is tardy in understanding the nature of the quarrel between Cross and his former business partner Hollis Mulwray.  He also deduces, also correctly, that Mulwray discovered the scheme, and quite naturally assumes that Cross murdered him for it, or else that his wife Evelyn did it herself out of jealousy for his supposed infidelities.  Of course, neither of these dueling interpretations proves out.  The script is ingeniously constructed to sustain a maximum of ambiguity at every step, so much so that each of Jake’s hurriedly constructed revisions of his provisional narrative seem perfectly plausible and unobjectionable, as do his consequent actions.

But it’s these very actions, well intentioned as they are, that bring about the final tragic events of the film; therein lies its quality of “Greekness,” its pervasive sense of a fated outcome. The irony of this outcome derives not from any foreknowledge we might have of it, but from our approval of the interventions that produced it.  The core theme of the film can be found in an earlier conversation between Jake and his employer Evelyn Mulwray, a desultory, post-coital exchange about Jake’s time as a cop working the Chinatown beat.  “You can’t always tell what’s going on there,” he says, before proceeding to explain that he had once tried to protect one of its residents from an unspecified danger, only to wind up causing her harm.  This is as close to prophecy as we will come in this film, and it is uttered not by some oracle or seer, but by the protagonist himself, the very man for whom the web of intrigue surrounding him is the most inscrutable.

The movie hinges on the epistemological gulf separating facts from interpretations — things as they are, versus what those things mean.   These two concepts form a closed circle.  Facts are chemical agents whose meaning emerges only in combination; they cannot speak for themselves.  And our interpretations depend rather essentially on the order in which we acquire those facts.  Jake was in possession of many facts, even some of the most important ones, but they were useless to him — he lacked the one crucial detail that would have brought the others into focus.  The trouble is, there was no way for him to know with any certainty that his mental picture was incomplete.

This presents Jake, and by extension, all of us, with an impossible conundrum:  how to act morally, when the consequences of our choices fork away from us towards dark and unknowable ends.  His own solution, refined through years of navigating ambiguous situations with duplicitous clients, is adherence to a personal code.  Yes, he is willing to cross certain lines — after all, he sleeps with a client, and well before he is certain of her innocence — but he never abandons his own core sense of right and wrong.  He indulges some petty subterfuge, but he never does anything outright immoral in his quest for the truth.  And he never allows his loyalties to interfere with his ethical obligations.

Perhaps the closest analogue in Greek myth is the story of the Minotaur.  Jake’s personal code is like a thread in the narrative labyrinth, guiding him, hand over hand, towards an outcome he cannot see.  The monster at the center of this labyrinth is a man of obscure intention and implacable will, and there is nothing Jake can do to stop him.  If moral systems are intended to help guide us towards right action, this movie suggests that there are situations where even the most conservative such systems — systems founded on duty rather than consequence — can completely fail us, or worse, lead us to injure the ones we love.

One might object that the horrifying conclusion of the film has more to do with Jake’s precipitousness than with any deficiency in his moral system, and there is some justice in this claim, but consider also that many of his choices are made under the duress of exigency.  Each revelation seems to demand an urgent response, and Jake is not the kind of person to ignore such demands.  His essential character is oriented towards action, industriousness, and justice.  In the Western tradition, these are usually considered virtues, but the pervasive corruption of the Los Angeles milieu perverts these virtues into catalysts of horror and destruction.  This is the tragedy of a man who is so desperate to do good, he becomes the unwitting instrument of an unspeakable evil.

The cinematography provides an ironic counterpoint to the film’s subterranean themes by drenching most of the action in California sunshine.  The ripe glow of an orange grove becomes an occasion for suspicion and misplaced animus.  Water sparkles under golden skies as sodden corpses are dragged ashore.  Plots and schemes proceed in glints and gleams.  This is not a movie of dark interiors or cloudy climes, for the splendor obscures a deeper obscurity:  in Chinatown, what is visible is also inscrutable.  Illusions must be seen in order to beguile, and the substance of mirages is light and air.

Hard Labor

by Ribbit

There’s a scene in the 1998 film Primary Colors where the protagonist, a philandering Democratic presidential candidate named Jack Stanton, addresses a union of troubled factory workers. “Muscle jobs go where muscle labor is cheap,” he tells the skeptical crowd, “and that is not here. So if you wanna compete, you’re gonna have to exercise a different set of muscles. The one between your ears.” These words elicit rapturous approval from Jack’s idealistic campaign manager, Henry Burton, who is eager to assuage his burgeoning doubts about his boss’s character.

Back when I first watched the film, I felt much as Henry did — enthralled by the allure of a dream. I was a twenty-three year old graduate student in hot pursuit of a career in physics.  I believed in the power of education. I believed in the verity of the Enlightenment values I had absorbed as a child — values like intellectual freedom, moral progress, empiricism, tolerance, and individual worth. Jack’s speech is built on these values. He proposes lifelong education and self-betterment as the proper path to economic salvation; indeed, as the most effective solution to the perennial problems of capitalism itself.

At the time, I would have agreed with him. But that was twenty years ago. I suppose my views have changed since then.

I was reared in what might charitably be called a working class neighborhood. When I was eleven or twelve, my stepfather received a discharge from the army — on medical grounds, ostensibly, though there may have been more to it than that. After nearly twenty years of service as a non-commissioned officer, this was a sudden and traumatic rupture for him, and he struggled to find work. He ultimately landed a job as a bus driver, and later as a high school janitor. Lest this latter fact conjure images of that wry and witty broom-pusher from The Breakfast Club, I should also mention that my stepfather was a brutal, ignorant sadist who physically intimidated his coworkers and had paranoid delusions that his superiors were persecuting him. The daily ritual of his homecoming — the roar of his Chevy pulling up the driveway, the click of the parking brake, the heavy footfalls, the screen door screeching open — always filled me with dread. There was a brief, blissful hour between my delivery from school and his arrival, and I spent most of that time steeling myself for whatever foul mood was about to step through that door.

He despised his job. The so-called dignity of blue collar work, that ethereal substance peddled by countless T.V. pundits the world round, always eluded him. Every day brought some fresh proletarian grievance, some monstrous tale of managerial incompetence or abuse. Thankfully, my own path would be rather different — a gifted student, I was destined to escape my working class roots. My stepfather regarded my academic interests with a deep and abiding suspicion. It may be that he felt intimidated by intellectual pursuits generally, but all I remember was that he seemed to consider them little more than pretentious twaddle. The only valuable skills were the sturdy, practical ones — replacing a fan belt in the car, or building a deck in the backyard. Everything else was the province of sissies.

By the time I left home to attend MIT, he had more or less accepted the reality that my academic achievements had opened doors he himself could never have approached. I think he was embarrassed by the extent of the financial aid I received; my family had essentially no assets to speak of, and could ill afford the expense. If not for MIT’s largesse to indigent students such as myself, there is simply no way I could have matriculated there. Though he expressed some surprise at their generous investment in my education, there was, I think, a faint tincture of pride in his astonishment.

I never discovered what precisely he thought about the whole situation, though, because I hardly ever spoke to him again. There were the Christmas visits, of course, but most of that time was spent with friends. I had little desire to dredge up old memories, and as far as I was concerned, there was nothing of substance for us to discuss. My mother divorced him while I was in graduate school. My brothers tell me he became a pathological hoarder after that, a tendency my mother had somehow kept in check throughout twenty-five years of marriage. Left to his own devices, he descended even further into bitterness and paranoia, an old man lost in a six-foot high maze of his own junk.

The other night, while surfing through the premium movie channels, I came upon Primary Colors. There on the screen, undimmed by the passage of twenty years, was Jack Stanton’s shining speech to the factory workers. I found myself imagining my stepfather in the audience. What would he have thought about Stanton’s plan? Would the prospect of lifelong education have stirred a passion for self-improvement from the depths of his soul?

I realized in that moment that Stanton was wrong. Education is not the path to salvation. And it’s not the answer to unfettered capitalism. It’s just another way for politicians like Stanton to blame the victims of an exploitative system.

Granted, there is real value to vocational training, especially if you want to enter the skilled trades. But it’s unrealistic to expect huge swaths of your workforce, many of them older and set in their ways, to pursue an entirely new skill set at the drop of a hat. It’s also unrealistic to expect a huge proportion of the population to pursue a college education just so they can participate in the so-called “knowledge economy.”

Hell, my stepfather barely graduated high school. Some people just aren’t temperamentally suited to higher education. It’s not a defect, and they shouldn’t be punished for it. But the modern American economy wants coders, data scientists, and program managers. What about the blue collar guy, the guy who just wants to make a decent living and do something useful with his hands? He’s not an entrepreneur and he doesn’t want to be. He wants to live in one place, put down roots, and participate in his local community. Tech workers may look down on this attitude — after all, many of us roamed far and wide in search of education and employment. But that’s not entirely fair. There are many different kinds of people, and there are many different ways to contribute to society.

I don’t have any answers to this particular conundrum. I’m just musing.

Reflections on a Turbulent Year

by Ribbit

Another year gone. One of the longer ones in living memory, at least for me. Picking through the trash heap of events like an old crow searching for morsels of meaning, I find myself wondering what the heap signifies, beyond being merely an accumulation of a year’s worth of crap. It’s way too much for me to sort through on my own. Of course, there are other crows picking at other parts of the pile, wondering much the same things. And, like me, finding little to satisfy their hunger.

I suppose if I had to give the heap a name, it would be something like, “The year that Trump’s support should have collapsed, but didn’t.” Unfortunately, this rather unwieldy moniker applies equally well to either of the two preceding years, so it lacks a certain precision. A better name might be, “The year the Democrats took back the House, Trump completely lost his shit, and his base supported him anyway.”

I’ve become an inveterate lurker at 538.com, watching as Trump’s approval numbers hover steadily above 40%, sometimes surging a bit in response to some outrageous act, sometimes sagging in response to another, but seldom dipping below that forty percent floor. It’s like watching a brick float on air, day after day, in brazen defiance of gravity. If only we could harness this apparent violation of Einstein’s field equations, but alas, political gravity follows different rules.

On a personal level, I suppose this is the year I finally came to accept the wisdom of the claim, advanced most recently by the philosopher John Gray, that the concept of moral progress is a phantom. Civilizations never truly banish social ills — those ills merely rearrange themselves and adopt new names. I don’t think I’ve descended to the point of abandoning the fight to eliminate them, but I do admit to a keener edge of cynicism in my approach to societal problems generally.

American society has an enormously high tolerance for predatory and exploitative behavior. We use every available means to excuse it; we do little, if anything, to curb it; and we blame the victim whenever it occurs. Still, one might object that we’ve made serious strides since the days of chattel slavery, and that’s certainly true in some respects. For instance, today we would deem the kinds of physical abuses to which slaves were subjected horrifying and unacceptable. But in most other respects, little has changed. Look at our enormous prison population, or the plight of powerless migrant workers, many of whom are treated quite cruelly by any reasonable standard.

While it’s true we abolished slavery proper, we did so only belatedly, and only after a protracted civil war that claimed more American lives than all of our other conflicts combined. So the concept was already deeply lodged in the American psyche long before we bloodied ourselves to end it formally. It seems difficult to imagine a time when people actually defended slavery as an institution, and that fact alone certainly feels like progress. But all that has happened, really, is that slavery itself has undergone certain structural changes intended to make it more palatable to modern sensibilities. For one thing, we no longer call it slavery. But more substantially, it has evolved into something impersonal and systemic, and the ideologies that support it have grown more sophisticated. Now it is perceived as an unavoidable byproduct of the present capitalist arrangement, an arrangement that brings prosperity and poverty in equal measure, but without which — it is argued — there could be no prosperity at all. As such, there is no one to be blamed, and nothing to be done; for to question the American mode of capitalism is to question the very foundation of our way of life.

This is why I do not believe we will make substantive progress on most of the pressing issues of our time, issues like climate change, poverty, obesity, alternative energy, or the continuing lack of affordable healthcare coverage for many millions of Americans. So long as I live in a country where at least a third of the voting population continues to support a man like Trump, so long will I despair of true progress, moral or otherwise.

I hate to sound a sour note on New Year’s Eve, but honestly, is there any other way to feel?