Fredster’s Millions

by Croak

Trump is making strong inquiries into the purchase of Greenland from Denmark, a move that is somehow both surprising and predictable.  For what could be more Trumpian than overpaying for a piece of property that the original owner is struggling to maintain just to emboss his name in gold on it?

I’m going to offer an alternative reality that makes as much sense as our current one:  Donald Trump has been, in fact, living a real-life reboot of the mostly forgotten 1985 film Brewster’s Millions.

The movie starred Richard Pryor as Montgomery Brewster, sole heir to a wealthy but eccentric relative who stipulated in his will that Monty would inherit $300 million only if he could spend $30 million within 30 days.  Of course, there were restrictions: Monty could tell nobody about the bizarre stipulation, could lose only a certain percentage to gambling and charity, and could keep none of the assets purchased with the $30M.

The drama in the movie centers on Monty’s attempts to burn through money while retaining nothing: renting expensive cars and clothes, hiring the Yankees for an exhibition game, getting into bad investments.  Monty even runs for mayor of New York and is about to win when he decides to pull out rather than face a $60,000 “penalty” for taking on the mayoral salary.

It’s unfortunate the original movie wasn’t a success, though its current Rotten Tomatoes score, which sits below 40 percent, may be harsh.  The story that Brewster’s Millions could tell today is just how hard it is for a rich person to become not rich.  Think of all the potential “horrible investments” that sound like actual successful (at one time, at least) tech companies: an app that only messages “hey bro”, a machine that performs dozens of medical tests from a single drop of blood, a purely digital currency.

If my alternate reality were true, we would now be at the part of the movie when Monty invests in iceberg towing, a scheme that involves hauling icebergs from near the North Pole to be sold in the Arabian desert, an idea that has been floating around for more than a century and is now being seriously considered as a solution to drought.

Of course, our main character, Donny, is the living embodiment of how the rich can’t get poor no matter how incompetently they manage.  And, of course, any modern reboot needs to subvert expectations.  Hence, in our version, Donny is already president, and rather than selling the icebergs to Arabs, he will probably give them away to the House of Saud in exchange for the rights to a new Trump Tower in Riyadh.  For if Trump is to succeed in losing enough money to inherit even more of Fred’s secret millions, what better way than to “invest” in another Trump property?

Bellum Omnium Contra Omnes

by Ribbit

Enough already.  Enough with the pretense of the sportsman’s semiautomatic rifle, as if it were the harmless plaything of the weekend warrior, or the needful implement of the noble hunter.  The AR-15, like other rifles of its kind, is neither of these things.  It is an instrument of war.

Yes, the AR-15 can be used for hunting.  So can a bolt-action rifle.  The former was specifically designed for infantry combat.  The latter was not.  Nobody is calling for a ban on bolt-action rifles, just as nobody is calling for a ban on cars, steak knives, or baseball bats.  These latter items were designed with specific uses in mind, none of them peculiar to soldiering — unlike the AR-15, which has no other legitimate purpose, and was never intended for civilian use.  Even Eugene Stoner’s family admits as much.  We already forbid the civilian ownership, sale, and possession of machine guns, which suggests a bygone recognition that the radiance of the Second Amendment extends only to some definite limit.  (Yes, I am aware that technically, you can legally own a machine gun manufactured before May 1986.  But the hurdles, both in cost and investigation, are very high indeed.  And the prohibition is otherwise absolute.)

The historical provenance of the Second Amendment — so ably discussed here — has long since lapsed out of memory.  It is a fossil buried deep within our founding mythology, invisible to modern sensibilities.  It’s high time that fossil was excavated, and exhibited to public view.  Its revelation would do little to persuade the paranoids, I’m sure, but it would provide some much needed context for debates about the meaning and application of this amendment.

Maybe that’s too hopeful.  Maybe rational public debate about controversial issues is no longer possible.  That is certainly true for the extremists, who fetishize military-grade weaponry and will never cede so much as a hairsbreadth on that score.  But happily, democracy does not require that we persuade everyone, and a substantial majority of our fellow citizens already believe that more restrictive gun legislation is both necessary and proper.  The problem isn’t so much a lack of public will, as the absence of its representation within Congress.  That is partly by design, of course — the Founders mistrusted fluctuations of public sentiment — but it is also because people like Mitch McConnell and his ilk have worked so very hard to enshrine minority rule at all levels of government.

Opponents of gun control like to point out that the vast majority of gun owners are law-abiding citizens.  That is certainly true, but it is also irrelevant.  More relevant is the easy availability among civilians of weapons that properly belong only in the hands of soldiers.  We already restrict ownership of hand grenades, and nobody is clamoring for those, just as nobody is clamoring for the free sale of operational tanks or anti-aircraft ordnance.  Why is it, then, that a vocal minority in this country has become so entirely fixated on possessing military configurations of semi-automatic rifles?  Perhaps that question answers itself.  These weapons are unnecessary for self-defense or hunting; indeed, they can have but one purpose.  I, for one, believe that purpose should be limited to the application of military force against our enemies during wartime.  That others feel differently, and justify their radicalism by appeal to paranoid fantasies of paramilitary insurrection, is frankly rather frightening.

I am well aware that the legal prohibition of so-called “assault weapons” would not stop all tragedies like the ones in Dayton and El Paso.  But it would undoubtedly help.  It would remove one of the easier options from a prospective killer’s repertoire, and deter the merely impulsive, deranged, or opportunistic, whose rages tend to be situational and transient.  Opponents of such legislation impose an impossible standard by insisting that it must end all killing everywhere at once; if that is the measure, then we may as well do away with our most ancient and venerable prohibition against murder itself, which is surely one of the least successful laws ever devised.

In any case, I don’t recall anyone raising such objections when the government finally started regulating the sale of ammonium nitrate.  People could see the logic then.  Now, they shut their eyes.  They shut their eyes against images of unutterable suffering and grief, because they know such images form an irrefutable challenge to their peculiar ideology.  Forced to choose between human feeling and inhuman conviction, they cling ever more desperately to the latter.

The calculus is simple.  We stand to gain far more in human life than we would lose in liberty.  Let the scales of judgment tilt towards pity.

Canonical Cant

Grievous acts of lexical buzzkill, recorded.

by Ribbit

Corporate brand, n.

A numinous totem of the C-Suite mystery cult.  Like the Trinity, it embodies irreconcilable contradictions among inscrutable concepts.

Corporate lawyer, n.

Plausible deniability in human form.

Corporate officer, n.

An extravagantly priced pneumatic apparatus for recirculating stale air.  Frequently installed to ventilate board rooms and business class cabins.

Corporate strategy, n.

1. Reasoning backwards from shareholder prosperity to workforce penury.  Typically the purview of the Chief Financial Officer.
2. The name given to officer-level meetings involving metaphysical discussions of the corporate brand.
3.  (archaic) Long-term planning for the purpose of achieving optimal allocation of corporate assets and avoidance of obstacles to success.

See also: Make-work

Press release, n.

Disclosure calculated to conceal.

Soundbite, n.

The smallest possible quantum of mendacity.

Public Relations, n.

The art of dodging and lobbing in one smooth motion.

See also:  Press release, soundbite, shell game

Performance review, n.

Praise followed by parsimony.

See also:  Ritual scarification.

Small talk, n.

An exclusive claim laid to silence owned in common.

Human Resources, n.

Deadwood charged with holding the corporate matchstick.

Fox News, n.

Undead air. It hungers for what its prey can scarcely provide.

See also:  Zombie apocalypse, conservative talk radio.

White paper, n.

The fossilized remains of a previous deadline.

See also:  Doorstop, shelfware, flyswatter.

Elevator pitch, n.

Compressed air released in an enclosed space.

Extroversion, n.

A vivid camouflage rendering the wearer visible to all but himself.  Essential gear for executives, salesmen, and politicians, who must scrupulously avoid falling prey to introspection.

Introversion, n.

A form of leprosy symptomized by inward reflection and an outward aversion to frivolous speech.

Optimism, n.

An account credited from the arrears of reason.

Pessimism, n.

Optimism that has paid its debts, often with considerable interest.

Org Chart, n.

A map to trace the lineage of ill-conceived notions and misbegotten plans.

Agile development, n.

Navigating cyclones by means of the raised and wetted finger.

Reduction in force, n.

The ritual expiation of management sins through the periodic expulsion of a few village goats.

See also:  Layoffs, downsizing, corporate strategy.

Subcontracting, n.

Incompetence by proxy.

Management consulting, n.

Error by proxy.

Management consultant, n.

One who delivers reports to his employer in his employer’s handwriting.

Pivot, v.

To retain investment capital by discovering new ways to spend it.  The principal mode of innovation at startups.

Internet of Things, n.

An international cybersecurity employment initiative.  It calls for the introduction of new vulnerabilities into old appliances, by means of code written expressly for that purpose.

See also:  #NotMyToaster, #NotMyRefrigerator, #NotMyCoffeePot

Data science, n.

A category error masquerading as statistics.

See also:  Big Data, machine learning, computational astrology

Intellectual property, n.

The legal sanction given for the extraction of rent from one criminal enterprise by another.

See also: Nathan Myhrvold

TV Cult: Eat, Drink, and Be Weary for Tomorrow We Sigh

by Croak

As Game of Thrones nears its climactic battle at Winterfell, the first two episodes of Season 8 have been filled with reunions and emotional setup, which can only mean that fan favorites are sure to die in the coming episodes. It’s a formula true to the buzz-seeking ethos of the show in its later stages.

Early in GoT’s run, while it stayed true to its source material, the show had genuinely shocking moments. Children tossed from towers, heads rolled from noble heroes, blood poured from weddings. Nobody was safe from the carnage, but the carnage was well calibrated by the careful planning of George R.R. Martin. The fates of characters we loved and hated were written into a cosmic astrolabe, hard to predict but possessing a beautiful natural order, as if the plot were governed by historical laws the reader couldn’t see. If the events seemed shocking, the fault was in our geocentric model of the universe, in which the heads of noble heroes are not mounted on spikes to be sneered at by pissy teenagers. Martin didn’t subvert expectations simply for shock value; fantasy realism was the core of his opus, and the consequences fit the characters’ actions.

From Season 4 onward, D.B. Weiss and David Benioff, GoT’s showrunners, began their long wobble away from Martin’s books: Ramsay Bolton rose as a major villain, the Martell storylines were minimized (and the leftovers made into a bad cartoon), background characters such as Bronn were moved to the foreground, Lady Stoneheart was dumped.  The show grew increasingly reliant on spectacle to create Twitter chatter, trying to mimic the natural shocks that Martin wrote so deftly.

Some, like Hodor’s death, had a touch of genius; others, like Hardhome and the Battle of the Bastards, were at least well crafted even if illogical; but yet others, like the destruction of the Sept of Baelor, Arya’s battle with the Waif, and the deaths of Ramsay Bolton and Littlefinger, were sloppy affairs that pandered to the creation of Big Events.

“A Knight of the Seven Kingdoms” showed how hollow the show has become. “Last night on Earth” tropes are an opportunity to uncover the true nature of characters, for them to reflect upon their journey and to feel the very texture of their existence; when done right, they burn with life. Yet Episode 2 was laden with an hour of back-to-back conversations full of rue, people sitting in darkened rooms brooding over their fates.  Even the episode’s namesake scene, a scene with great symbolic potential, was slathered in an unhealthy serving of treacle.  Compare the episode with the dynamism of the Red Keep scenes in “Blackwater” (Season 2, Episode 9), thick with tension and intrigue, still setting the ragged edges of its characters. Now, the show holds about as much suspense as the NCAA tourney, complete with office pools betting on who will survive and who will sit upon the Iron Throne.

The show’s recent failings stem from a lack of interest in the intricate machinery that made the stars dance in earlier seasons, in understanding the subtle interplay between character and the course of events, action and reaction; instead, it has attempted to mimic earlier outcomes by relying on shock and the creation of Twitter-worthy memes. Episode 3 is likely to produce many more of them, now that the Night King and his army of the undead have come to besiege Winterfell. The invaders may find, however, that Westeros is already occupied by zombies.

A Failure Wrapped in a Folly inside a Farce

by Ribbit

There’s been a spate of recent articles accusing progressives like Elizabeth Warren of rank partisanship for proposing the radical and dangerous idea that maybe, just maybe, we should dismantle that ridiculous anachronism called the Electoral College.  I don’t bat an eye at the hypocrisy on display — it comes with the territory — but I do have to wonder at some of the reasons these so-called “intellectuals” supply in defense of this undemocratic monstrosity.

I hold to the view that an election is about discovering, and enacting, the will of the people; and that overturning this will, which is the true wellspring of any legitimate government, requires some rather strong justifications.  Most of the opposition literature I’ve read fails to surmount even the most basic hurdles in this regard, relying as it does on an absurd application of the notion of “states’ interests.”  This concept seems vital even to avowed libertarians like Kevin D. Williamson, for whom such collective categories ought to be essentially meaningless.  From what I can gather, a proper consideration of states’ interests is supposed to prevent a catastrophe known as the “tyranny of the majority.”  This means conferring on people living in Wyoming four times the voting power of those living in California, thereby halting our collective lapse into that majoritarian horror show in which power-mad urbanites subdue helpless country folk to their arbitrary will.  There’s no real explanation for how this subjugation is supposed to work — it’s not as if Californians would have any say about where or whether some Wyoming municipality builds its bridges — but the image conjures all the requisite fears in the minds of its intended audience.

I have to say, this is one of the strangest and most perverse arguments I’ve heard in a long while.  It’s a direct descendant of the kinds of rationalizations mustered by fusty antebellum Southerners in defense of their “way of life” below the Mason-Dixon line.  In fact, the only real alternative I can see to the “tyranny of the majority” is its literal opposite, the enshrinement of one or another species of minority rule, which is surely the less defensible option.  I agree that, at least regarding important decisions affecting the entire body politic, narrow majorities are less desirable than large ones, but that should make outright minority rule even more repugnant.  Yet minority rule is precisely what people like Williamson are defending.

The office of the president is a national office.  She represents the nation as a whole, not some parochial subdivision, so I can’t see why we should privilege the least populated states in the process of electing her.  To invoke a faintly libertarian argument, the people have voted with their feet.  They’ve decided they would rather live in New York than Alaska.  It makes no sense to punish them for making that choice; after all, they’re only doing the rational thing, by migrating to where the opportunities are.  Shouldn’t we be encouraging them to do that, instead of rewarding the losers who languish in some economic backwater like Arkansas or Kentucky?  Why bestow such disproportionate political power on people who consistently choose stagnation over dynamism, failure over opportunity?  I’m not a libertarian, so I can’t seriously endorse this argument, but it makes about as much sense to me as anything else they’ve ever said.

Consider that the founding fathers intended the Electoral College as a bulwark against precisely the outcome of the last election, in which a dangerous demagogue ascended to the nation’s highest office.  Consider also that it made that outcome possible in the first place by robbing the majority of their own, infinitely wiser, choice.  Failure, thy name is Article II, Section I.

I know, I know.  Warren’s proposal for a constitutional amendment abolishing this idiocy signals her totalitarian ambitions, her contempt for tradition and the rule of law, her radical zeal for corrupt mobocracy — or something.   I’m sure she’d be twirling her moustache if she had one.

Film Cult: The Moral Maze of Chinatown

by Ribbit

“Forget it, Jake.  It’s Chinatown.”

So says Walsh to his boss Jake Gittes, the private investigator at the center of Roman Polanski’s eponymous masterpiece.  That final line has provoked much thoughtful commentary in the years since the movie first opened in 1974.  Is Chinatown a “state of mind”?  A symbol of irredeemable corruption?  Or merely a personal reference to an unfortunate incident in Jake’s own past?  It is all of these things, and more.

The story has all the lineaments of classical Greek tragedy:  the dogged hero in pursuit of truth, whose actions bring about the very horror he is striving to prevent.  Perhaps the only real quibble with this claim is that a fifth century Greek would have known precisely which horrors awaited the protagonist, whereas Chinatown hews exclusively to Gittes’s limited point of view.  There’s no real sense of dramatic irony here; we know nothing Jake doesn’t.  The audience muddles alongside him as he struggles to weave emerging facts into an increasingly sprawling web of conjectural narrative.

Of course, it’s not just the facts themselves, but their proper interpretation that eludes him.  He discovers clues, but lacks the context to give them meaning.  He interrogates various players in the narrative, but can’t understand what they are saying.  A key piece of sartorial evidence shimmers provocatively in front of him right from the beginning, but he is unable to grasp its significance until far too late.  He spends most of the movie trapped in a kind of semantic limbo, and purely for want of the right kind of imagination — his basic decency prevents him from speculating beyond the bounds of conventional venality.

This incapacity makes him rather ill-suited to confront the peculiar quality of evil embodied by Noah Cross.  It’s true he deduces, correctly, that Cross himself orchestrated the scheme involving the Water Department, but he is tardy in understanding the nature of the quarrel between Cross and his former business partner Hollis Mulwray.  He also deduces, again correctly, that Mulwray discovered the scheme, and quite naturally assumes that Cross murdered him for it, or else that his wife Evelyn did it herself out of jealousy for his supposed infidelities.  Of course, neither of these dueling interpretations proves out.  The script is ingeniously constructed to sustain a maximum of ambiguity at every step, so much so that each of Jake’s hurriedly constructed revisions of his provisional narrative seems perfectly plausible and unobjectionable, as do his consequent actions.

But it’s these very actions, well intentioned as they are, that bring about the final tragic events of the film; therein lies its quality of “Greekness,” its pervasive sense of a fated outcome. The irony of this outcome derives not from any foreknowledge we might have of it, but from our approval of the interventions that produced it.  The core theme of the film can be found in an earlier conversation between Jake and his employer Evelyn Mulwray, a desultory, post-coital exchange about Jake’s time as a cop working the Chinatown beat.  “You can’t always tell what’s going on there,” he says, before proceeding to explain that he had once tried to protect one of its residents from an unspecified danger, only to wind up causing her harm.  This is as close to prophecy as we will come in this film, and it is uttered not by some oracle or seer, but by the protagonist himself, the very man for whom the web of intrigue surrounding him is the most inscrutable.

The movie hinges on the epistemological gulf separating facts from interpretations — things as they are, versus what those things mean.   These two concepts form a closed circle.  Facts are chemical agents whose meaning emerges only in combination; they cannot speak for themselves.  And our interpretations depend rather essentially on the order in which we acquire those facts.  Jake was in possession of many facts, even some of the most important ones, but they were useless to him — he lacked the one crucial detail that would have brought the others into focus.  The trouble is, there was no way for him to know with any certainty that his mental picture was incomplete.

This presents Jake, and by extension, all of us, with an impossible conundrum:  how to act morally, when the consequences of our choices fork away from us towards dark and unknowable ends.  His own solution, refined through years of navigating ambiguous situations with duplicitous clients, is adherence to a personal code.  Yes, he is willing to cross certain lines — after all, he sleeps with a client, and well before he is certain of her innocence — but he never abandons his own core sense of right and wrong.  He indulges some petty subterfuge, but he never does anything outright immoral in his quest for the truth.  And he never allows his loyalties to interfere with his ethical obligations.

Perhaps the closest analogue in Greek myth is the story of the Minotaur.  Jake’s personal code is like a thread in the narrative labyrinth, guiding him, hand over hand, towards an outcome he cannot see.  The monster at the center of this labyrinth is a man of obscure intention and implacable will, and there is nothing Jake can do to stop him.  If moral systems are intended to help guide us towards right action, this movie suggests that there are situations where even the most conservative such systems — systems founded on duty rather than consequence — can completely fail us, or worse, lead us to injure the ones we love.

One might object that the horrifying conclusion of the film has more to do with Jake’s precipitousness than with any deficiency in his moral system, and there is some justice in this claim, but consider also that many of his choices are made under the duress of exigency.  Each revelation seems to demand an urgent response, and Jake is not the kind of person to ignore such demands.  His essential character is oriented towards action, industriousness, and justice.  In the Western tradition, these are usually considered virtues, but the pervasive corruption of the Los Angeles milieu perverts these virtues into catalysts of horror and destruction.  This is the tragedy of a man who is so desperate to do good, he becomes the unwitting instrument of an unspeakable evil.

The cinematography provides an ironic counterpoint to the film’s subterranean themes by drenching most of the action in California sunshine.  The ripe glow of an orange grove becomes an occasion for suspicion and misplaced animus.  Water sparkles under golden skies as sodden corpses are dragged ashore.  Plots and schemes proceed in glints and gleams.  This is not a movie of dark interiors or cloudy climes, for the splendor obscures a deeper obscurity:  in Chinatown, what is visible is also inscrutable.  Illusions must be seen in order to beguile, and the substance of mirages is light and air.

Hard Labor

by Ribbit

There’s a scene in the 1998 film Primary Colors where the protagonist, a philandering Democratic presidential candidate named Jack Stanton, addresses a union of troubled factory workers. “Muscle jobs go where muscle labor is cheap,” he tells the skeptical crowd, “and that is not here. So if you wanna compete, you’re gonna have to exercise a different set of muscles. The one between your ears.” These words elicit rapturous approval from Jack’s idealistic campaign manager, Henry Burton, who is eager to assuage his burgeoning doubts about his boss’s character.

Back when I first watched the film, I felt much as Henry did — enthralled by the allure of a dream. I was a twenty-three-year-old graduate student in hot pursuit of a career in physics.  I believed in the power of education. I believed in the verity of the Enlightenment values I had absorbed as a child — values like intellectual freedom, moral progress, empiricism, tolerance, and individual worth. Jack’s speech is built on these values. He proposes lifelong education and self-betterment as the proper path to economic salvation; indeed, as the most effective solution to the perennial problems of capitalism itself.

At the time, I would have agreed with him. But that was twenty years ago. I suppose my views have changed since then.

I was reared in what might charitably be called a working class neighborhood. When I was eleven or twelve, my stepfather received a discharge from the army — on medical grounds, ostensibly, though there may have been more to it than that. After nearly twenty years of service as a non-commissioned officer, this was a sudden and traumatic rupture for him, and he struggled to find work. He ultimately landed a job as a bus driver, and later as a high school janitor. Lest this latter fact conjure images of that wry and witty broom-pusher from The Breakfast Club, I should also mention that my stepfather was a brutal, ignorant sadist who physically intimidated his coworkers and had paranoid delusions that his superiors were persecuting him. The daily ritual of his homecoming — the roar of his Chevy pulling up the driveway, the click of the parking brake, the heavy footfalls, the screen door screeching open — always filled me with dread. There was a brief, blissful hour between my delivery from school and his arrival, and I spent most of that time steeling myself for whatever foul mood was about to step through that door.

He despised his job. The so-called dignity of blue collar work, that ethereal substance peddled by countless T.V. pundits the world round, always eluded him. Every day brought some fresh proletarian grievance, some monstrous tale of managerial incompetence or abuse. Thankfully, my own path would be rather different — a gifted student, I was destined to escape my working class roots. My stepfather regarded my academic interests with a deep and abiding suspicion. It may be that he felt intimidated by intellectual pursuits generally, but all I remember was that he seemed to consider them little more than pretentious twaddle. The only valuable skills were the sturdy, practical ones — replacing a fan belt in the car, or building a deck in the backyard. Everything else was the province of sissies.

By the time I left home to attend MIT, he had more or less accepted the reality that my academic achievements had opened doors he himself would never have approached. I think he was embarrassed by the extent of the financial aid I received; my family had essentially no assets to speak of, and could ill afford the expense. If not for MIT’s largesse to indigent students such as myself, there is simply no way I could have matriculated there. Though he expressed some surprise at their generous investment in my education, there was, I think, a faint tincture of pride in his astonishment.

I never discovered what precisely he thought about the whole situation, though, because I hardly ever spoke to him again. There were the Christmas visits, of course, but most of that time was spent with friends. I had little desire to dredge up old memories, and as far as I was concerned, there was nothing of substance for us to discuss. My mother divorced him while I was in graduate school. My brothers tell me he became a pathological hoarder after that, a tendency my mother had somehow kept in check throughout twenty-five years of marriage. Left to his own devices, he descended even further into bitterness and paranoia, an old man lost in a six-foot high maze of his own junk.

The other night, while surfing through the premium movie channels, I came upon Primary Colors. There on the screen, undimmed by the passage of twenty years, was Jack Stanton’s shining speech to the factory workers.  I found myself imagining my stepfather in the audience. What would he have thought about Stanton’s plan? Would the prospect of lifelong education have stirred a passion for self-improvement from the depths of his soul?

I realized in that moment that Stanton was wrong. Education is not the path to salvation. And it’s not the answer to unfettered capitalism. It’s just another way for politicians like Stanton to blame the victims of an exploitative system.

Granted, there is real value to vocational training, especially if you want to enter the skilled trades. But it’s unrealistic to expect huge swaths of your workforce, many of them older and set in their ways, to pursue an entirely new skill set at the drop of a hat. It’s also unrealistic to expect a huge proportion of the population to pursue a college education just so they can participate in the so-called “knowledge economy.”

Hell, my stepfather barely graduated high school. Some people just aren’t temperamentally suited to higher education. It’s not a defect, and they shouldn’t be punished for it. But the modern American economy wants coders, data scientists and program managers. What about the blue collar guy, the guy who just wants to make a decent living and do something useful with his hands? He’s not an entrepreneur and he doesn’t want to be. He wants to live in one place, put down roots, and participate in his local community. Tech workers may look down on this attitude — after all, many of us roamed far and wide in search of education and employment. But that’s not entirely fair. There are many different kinds of people, and there are many different ways to contribute to society.

I don’t have any answers to this particular conundrum. I’m just musing.

Reflections on a Turbulent Year

by Ribbit

Another year gone. One of the longer ones in living memory, at least for me. Picking through the trash heap of events like an old crow searching for morsels of meaning, I find myself wondering what the heap signifies, beyond being merely an accumulation of a year’s worth of crap. It’s way too much for me to sort through on my own. Of course, there are other crows picking at other parts of the pile, wondering much the same things.  And, like me, finding little to satisfy their hunger.

I suppose if I had to give the heap a name, it would be something like, “The year that Trump’s support should have collapsed, but didn’t.” Unfortunately, this rather unwieldy moniker applies equally well to either of the two preceding years, so it lacks a certain precision. A better name might be, “The year the Democrats took back the House, Trump completely lost his shit, and his base supported him anyway.”

I’ve become an inveterate lurker at 538.com, watching as Trump’s approval numbers hover steadily above 40%, sometimes surging a bit in response to some outrageous act, sometimes sagging in response to another, but seldom dipping below that forty percent floor. It’s like watching a brick float on air, day after day, in brazen defiance of gravity. If only we could harness this apparent violation of Einstein’s field equations, but alas, political gravity follows different rules.

On a personal level, I suppose this is the year I finally came to accept the wisdom of the claim, advanced most recently by the philosopher John Gray, that the concept of moral progress is a phantom. Civilizations never truly banish social ills — those ills merely rearrange themselves and adopt new names. I don’t think I’ve descended to the point of abandoning the fight to eliminate them, but I do admit to a keener edge of cynicism in my approach to societal problems generally.

American society has an enormously high tolerance for predatory and exploitative behavior. We use every available means to excuse it; we do little, if anything, to curb it; and we blame the victim whenever it occurs. Still, one might object that we’ve made serious strides since the days of chattel slavery, and that’s certainly true in some respects. For instance, today we would deem horrifying and unacceptable the kinds of physical abuses to which slaves were subjected. But in most other respects, little has changed. Look at our enormous prison population, or the plight of powerless migrant workers, many of whom are treated quite cruelly by any reasonable standard.

While it’s true we abolished slavery proper, we did so only belatedly, and only after a protracted civil war that claimed more American lives than all of our other conflicts combined. So the concept was already deeply lodged in the American psyche long before we bloodied ourselves to end it formally. It seems difficult to imagine a time when people actually defended slavery as an institution, and that fact alone certainly feels like progress. But all that has happened, really, is that slavery itself has undergone certain structural changes intended to make it more palatable to modern sensibilities. For one thing, we no longer call it slavery. But more substantially, it has evolved into something impersonal and systemic, and the ideologies that support it have grown more sophisticated. Now it is perceived as an unavoidable byproduct of the present capitalist arrangement, an arrangement that brings prosperity and poverty in equal measure, but without which — it is argued — there could be no prosperity at all. As such, there is no one to be blamed, and nothing to be done; for to question the American mode of capitalism is to question the very foundation of our way of life.

This is why I do not believe we will make substantive progress on most of the pressing issues of our time, issues like climate change, poverty, obesity, alternative energy, or the continuing lack of affordable healthcare coverage for many millions of Americans. So long as I live in a country where at least a third of the voting population continues to support a man like Trump, so long will I despair of true progress, moral or otherwise.

I hate to sound a sour note on New Year’s Eve, but honestly, is there any other way to feel?

Excelsior!

by Ribbit

I would be remiss if I allowed the passing of Stan Lee and William Goldman to go unremarked. Each of these men, in their different ways, had a profound effect on my life, and indeed, on the culture generally. They moved through relatively obscure orbits in the cultural firmament, but their influence could be felt everywhere, tugging the brighter stars away from their traditional spheres and towards more complex, unsentimental modes of storytelling.

To begin with Mr. Lee: I was never much of a comic book aficionado, but I do admit to having a more than casual familiarity with the Marvel titles of my era — Spider-Man, the X-Men, the Hulk, Iron Man, and Doctor Strange. The Fantastic Four was one of my particular favorites, if for no other reason than its peculiar conceit that an unreconstructed nerd from the Sputnik era could lead a premier team of superheroes. Popular fiction had rarely featured scientists who weren’t moustache-twirling villains, much less heroic leaders, so the idea that they could be more than mere helpmeets or foils to the real protagonist was truly refreshing.

The other titles were also innovative for their time. Spider-Man juggled adolescent problems while contending with some of the more idiosyncratic villains of the Marvel universe (and delivering creditable quips along the way). The X-Men were gifted but misunderstood teenagers yearning for acceptance. The Hulk was an atomic age Jekyll and Hyde story reflecting popular ambivalence about the progress of science. Iron Man explored the perils of the military-industrial complex through the eyes of an alcoholic Cold Warrior. And Doctor Strange wedded New Age psychedelia to a high fantasy conceit. Comic books had never seen narratives quite like these, or taken the personal and psychological lives of their characters so seriously. Stan Lee changed all that.

When all is said and done, Lee had about as much influence on popular culture as George Lucas or Stephen King. He may never have achieved their level of fame, but his characters sure did. In fact, they now constitute the bulk of the 21st century film industry, a situation that would have been unthinkable only a decade ago. His work is a testament to what can be achieved creatively even within a corporate context — at least under the right circumstances. (I suppose it is also a testament to how those same corporate structures can flog a good concept to death.)

William Goldman, by contrast, wrote many of his scripts on spec. He lived in New York and was never much of a player in Hollywood. A trenchant and pithy writer, he quickly became skeptical of the corporate studio system, whose fashions and neuroses he regarded with mordant cynicism. In his memoirs he observed that despite heroic efforts on the part of executives to discover reasons for the success or failure of particular films, no reliable formula has ever been found — a circumstance pithily expressed in what became his second-most famous epigram, “Nobody knows anything.” Nobody can predict which movies will work, and which will bomb — it’s all a kind of high stakes crapshoot. This hasn’t stopped anxious executives from slavishly imitating past successes, of course, with the result that moviegoers have come to regard themselves more as consumers than as patrons of the art.

Goldman’s most memorable films include Butch Cassidy and the Sundance Kid, The Princess Bride, and All the President’s Men. Think how different these three films are from one another: a Western featuring two charismatic stars who flee rather than fight; a sly fantasy filled with imaginative whimsy and ironic asides; and a slow political thriller about reporters uncovering a White House conspiracy. He had a way of discovering the hidden possibilities of almost any story, even the stuff he found very difficult to write.   By his own account, All the President’s Men was a horrible slog, a morass of names and dates; still, out of that mess of complexity he managed to pluck “Follow the money,” a phrase that has transcended its original context to become an enduring emblem of American political corruption.

Farewell, gentlemen. You will both be missed.

The Scorpion and the Frog

by Ribbit

Over the last few years, I have labored mightily to discover some governing principle, or barring that, some microscopic particle of scruple, lying beneath what might charitably be called the “conservative worldview.”  But I have failed to discern anything beyond what is plainly evident to any person of good sense; namely, a toxic compound of hatred, contempt, and lust for power.

The object of this hatred is merely an ideological expediency, and varies according to fashion.  Communists, abolitionists, pornographers, homosexuals, immigrants, progressives — the precise name matters not.  What matters is their ceremonial function within the conservative ritual of purgation:  to embody the necessary impurity whose expulsion brings about the mystical renewal of a Golden Age.  Mitch McConnell is the current high priest of this Trumpian atavism, feeding the burnt remains of the body politic to his constituents while continuing to stoke the sacrificial fire.  And the masses devour these sacraments greedily, savoring each vicarious morsel of power with all the passion of the fetishist.

It is this fact that so perplexes and exercises the liberal mind.  Liberals cannot conceive of power as anything but an instrumentality.  They labor under the delusion that Americans are united in their respect for the institutions and organs of republican government.  They imagine, for example, that it is possible for the Supreme Court to be “tarnished” by Kavanaugh’s elevation to that august body, or that it somehow compromises the Court’s legitimacy that he advanced by means of such obvious partisan obduracy.  But this fundamentally misunderstands the Republican mentality.  To them, power is its own principle, its own end, its own justification.  Tarnishing the Supreme Court is a practical impossibility according to this system.  If your constituents do not judge you on the basis of concepts like civility, probity, or nobility of purpose, it is impossible, ipso facto, to be found wanting in respect of them.

The Kavanaugh hearings afforded Republicans a signal opportunity to exercise their power in rather stark and brutal terms.  They did this by first browbeating, and then steamrolling, their opponents.  The fact that Kavanaugh had been credibly accused of sexual assault only added to his luster.  What may have appeared merely sexist, boorish, and overbearing to liberal eyes, was for the conservative observer a titillating display of domination.  I strongly urge liberals not to misjudge the allure of this primal force, which is, I suspect, the governing animus behind the Trump cult of personality.  It cannot be bargained with, cajoled, placated, or pacified.  The sooner Democrats accept this, the sooner they can attend to the only business that really matters — winning elections.

But Democrats, recent midterm successes notwithstanding, simply do not have a plan.  They’re much too busy wringing their hands and rending their garments over the latest Trumpian outrage, or else attacking their own allies for a perceived lack of ideological purity.  This is a waste of precious energy, for it focuses far too much attention on the exigencies of the moment.  Whatever else may be said about Mitch McConnell, he is an able strategist.  Like others of his tribe, he long ago foresaw the demographic demise of the Republican coalition, and has done everything in his power to set up a store of judges against the coming winter.  In this he has succeeded, I am sure, beyond his own wildest imaginings.

It’s high time we broke the pointless cycle of liberal shock and outrage.  It’s time we took it for granted that conservatives will throw off every decent restraint to their ascendancy.  Hypocrisy, spite, and boundless prevarication are endemic to their strategy, not negotiable anomalies of an otherwise reasonable governing philosophy.  They are bent on an antidemocratic vision of enduring political dominance, a species of minority rule reinforced by nakedly partisan judicial and legislative policies they call “originalism” and “states’ rights.”  And, as recent events have shown, they are perfectly comfortable with the violence that springs from their rhetoric.

The only lasting solution is for Democrats to start winning more elections.  How this might be effected is still a matter of bitter debate, but I have a few modest observations.  Firstly, we must acknowledge that the conventional wisdom, which holds that Democrats abandoned their blue collar base, is largely correct.  Working class people have virtually no representation in the upper echelons of the party, and rightly perceive that their interests attract little attention among the wonks and technocrats.  This can only be remedied by sponsoring the election of people who actually belong to the working class — say, by setting aside five or ten percent of all open contests and recruiting the best available blue collar candidates to run for those offices.  The safest course would be to start with local elections, and cultivate the most competent and successful candidates for successively higher offices.

Adoption of such a policy would demonstrate a willingness to share political power with the working class.  It would also trigger a gradual realignment of the electorate away from cultish tribalism and towards the formation of rational coalitions oriented around shared interests.

Secondly, the Democrats need to break the conservative stranglehold on the news cycle.  Mass media is like the Eye of Mordor, darting its anxious lighthouse beam to and fro.  Trump has learned how to direct that gaze whither he will — as during the run-up to the election, when he kept it firmly fixed on the migrant caravan from Central America, and away from anything of substance.  It may seem like magic, but it’s really nothing more than misdirection, the kind of sleazy sleight-of-hand practiced by shoddy charlatans since the dawn of civilization.  What makes his own particular brand of misdirection so potent is his utter lack of restraint; it isn’t just that he lies, but that he does so with such frequency and transparency.  This has the paradoxical effect of absorbing the attention of reporters and commentators, trapping them in the intoxicating urgency of Now.

Of course, the presidency and the media did not evolve independently; ever since the introduction of television, presidents have been symbols and semaphores on the world stage, their every gesture laden with hidden portents.  This has only intensified in the internet age.  Trump inherited an information network attuned to even the merest tremors of presidential syntax, so it is no surprise that he has overwhelmed that system by the sheer magnitude of his fabulism.  Journalists have been slow to recalibrate their instruments to the Trump scale, perhaps to avoid being numbed into a state of cynical stupefaction.  But their unflagging attention to his (patently absurd) speculations and his (unhinged) personal insults has only served to drown the signal in the noise.

The Democrats need to respond with discipline:  coordinated messaging, forceful rebuttals, and focused leadership.  They must draw the Eye away from Trump’s targets and towards the failures of his Administration.  They must proselytize their own policies with tireless intensity.  And most importantly, they must articulate a clear vision of the kind of country they want to create, and promote that vision by collecting and crafting narratives from the interest groups they purport to represent.  Find the right spokesmen — telegenic, quick-witted, and charismatic — and instruct them in the Great Democratic Narrative.  Enjoin them to hew to that narrative with every sinew and synapse at their disposal, whatever storms may come.  Over time, this narrative will begin to insinuate itself into public consciousness, and the restless Eye will settle more and more on the unaccustomed images of real suffering and injustice that have languished hitherto for want of scrutiny.

Thirdly, the fractious party of the donkey must discover hidden harmonies within their coalition.  I’m sure many qualified aspirants will emerge for the 2020 nomination, but the party leadership must ensure that the Democratic primary does not become an internecine bloodbath.  Assemble together in a smoke-filled room and whittle the candidates down to the strongest few.  Strike whatever deals you must with the disappointed wannabes, but avoid at all costs the kind of carnival atmosphere that afflicted the last several Republican primaries.

Finally, Democrats need to adopt a harder negotiation posture.  One of the many frustrations of the Obama era was the president’s tendency to assume good faith on the part of his political opponents.  Again and again he reached out his hand in fellowship, only to have it slapped away by Mitch McConnell or Paul Ryan.  In perhaps the most famous instance of this practice, he abandoned single payer before ever sitting down with the opposition to hammer out a deal on healthcare.  This left many of his constituents scratching their heads at what appeared to be a policy of preemptive conciliation.  I think I speak for all such constituents when I say:  by all means, compromise; but at least start from a position of strength.

Republicans will not see placatory gestures as acts of generosity or good faith — they will see them as signs of weakness.  They will exploit whatever advantages you proffer them.  That is their nature.  Scorpions sting; crows feed on carrion; wolves howl at the moon.  Republicans prey on the weak.  Democrats would do well to remember that.