It's All Over (2021)

Don’t forget to listen to my podcast, “What Is X?” And check back for new episodes soon: with Danielle Carr on Mental Health, Sean Carroll on Matter, Julian Lucas on Memory, and more.

Swamped by deadlines and scheduling conflicts I would need a whole retinue of privy councillors to disentangle, I am constrained this week to re-run an old classic: the 2018-19 essay, “It’s All Over”, which first appeared on my personal pre-Substack website, jehsmith.com, on the morning of New Year’s Eve, December 31, 2018, and was a few days later re-posted by The Point Magazine. Many readers will already be familiar with this essay, while for others it will, I hope, serve as an illuminating bit of back-story. Even if I were not overwhelmed with other responsibilities, it would likely still be worth running the piece again here, as in an important sense it is the without-which-not of much of what I am doing still today. It is the one piece of my writing that ever approached proper virality, and it is the germ of the book that would eventually become The Internet Is Not What You Think It Is. It is for this book in turn that I initially created this very Substack, “The Hinternet” as it’s still officially called, as a sort of running para-textual supplement. A few days ago a reader rediscovered the piece from three years ago, as occasionally happens, and it began to make the rounds on Twitter again. I re-read it for the first time in about two years, and unlike most of what I write (which usually only triggers the Knausgaardian plaint: “Not good enough!”), I found that I was basically still happy with it. It continued to stand up.

I wrote the essay in a Motel 6 off Highway 99 near Bakersfield, California, traveling from San Diego to Sacramento, with a bag full of Taco Bell and a bottle of wine from Safeway, as was still my habit back then on visits to my home country (quite unfathomably to me today). I was angry, and empty, and it shows. I still am, but the quality of these feelings has transformed. I think I did diagnose our condition correctly; by the latter part of the last decade, things really were “all over”. But something grows out of everything that dies and rots. This ‘stack, for example, and my next book, and the books in the works after that, and the expectation of bright new things it’s still too early to make out. It’s all over, yes, but even so…

Incidentally, before we begin in earnest, my book is available for pre-order here.

Is there any way to intervene usefully or meaningfully in public debate, in what the extremely online Twitter users are with gleeful irony calling the “discourse” of the present moment?

It has come to seem to me recently that this present moment must be to language something like what the Industrial Revolution was to textiles. A writer who works on the old system of production can spend days crafting a sentence, putting what feels like a worthy idea into language, only to find, once finished, that the internet has already produced countless sentences that are more or less just like it, even if these lack the same artisanal origin story that we imagine gives writing its soul. There is, it seems to me, no more place for writers and thinkers in our future than, since the nineteenth century, there has been for weavers.

This predicament is not confined to politics, and in fact engulfs all domains of human social existence. But it perhaps crystallizes most refractively in the case of politics, so we may as well start there.

There are memes circulating that are known as “bingo cards,” in which each square is filled with a typical statement or trait of a person who belongs to a given constituency, a mouth-breathing mom’s-basement-dwelling Reddit-using Men’s Rights Activist, for example, or, say, an unctuous white male ally of POC feminism. The idea is that within this grid there is an exhaustive and as it were a priori tabulation, deduced like Kant’s categories of the understanding, of all the possible moves a member of one of these groups might make, and whenever the poor sap tries to state his considered view, his opponent need only pull out the table and point to the corresponding box, thus revealing to him that it is not actually a considered view at all, but only an algorithmically predictable bit of output from the particular program he is running. The sap is sapped of his subjectivity, of his belief that he, properly speaking, has views at all.

Who has not found themselves thrust into the uncomfortable position just described, of being told that what they thought were their considered beliefs are in fact something else entirely? I know I have been on many occasions: to be honest, this happens more or less every time I open my newsfeed and look at what my peers are discoursing about. For example, I admire Adolph Reed, Jr., a great deal; I believe he is largely correct about the political and economic function of “diversity” as an institutional desideratum in American society in recent decades; and I believe, or used to believe, that I had come to view Reed’s work in this way as a result of having read it and reflected on it, and of having found it good and sound.

But then, not so long ago, I happened to come across this from an American academic I know through social media: “For a certain kind of white male leftist,” my acquaintance wrote, “Reed makes a very convenient ally.” What would be a fitting response to such exposure as this? Should I stop agreeing with Reed? Easier said than done. It is never easy to change one’s beliefs by an exercise of will alone. But it would be will alone, and not intellect, that would do the work of belief change in this case: the will, namely, to trade in the algorithm I’m running for one that, I’ve recently learned when checking in on the discourse, is preferred among my peers.

Another example: I have read that Tinder users agree that one should “swipe left” (i.e. reject) on any prospective mate or hookup who proclaims a fondness for, among other writers, Kurt Vonnegut, Ernest Hemingway, or William S. Burroughs. I couldn’t care less about the first two of these, but Burroughs is very important to me. He played a vital role in shaping how I see the world (Cities of the Red Night, in particular), and I would want any person with whom I spend much time communicating to know this. I believe I have good reasons for valuing him, and would be happy to talk about these reasons.

I experience my love of Burroughs as singular and irreducible, but I am given to know, when I check in on the discourse, that I only feel this way because I am running a bad algorithm. And the result is that a part of me—the weak and immature part—no longer wants the overarching “You may also like…” function that now governs and mediates our experience of culture and ideas to serve up “Adolph Reed” or “William S. Burroughs” among its suggestions, any more than I want Spotify to suggest, on the basis of my playlist history, that I might next enjoy a number by Smash Mouth. If the function pulls up something bad, it must be because what preceded it is bad. I must therefore have bad taste, stupid politics; I must only like what I like because I’m a dupe.

But something’s wrong here. Burroughs does not in fact entail the others, and the strong and mature part of the self—that is to say the part that resists the forces that would beat all human subjectivity down into an algorithm—knows this. But human subjects are vanishingly small beneath the tsunami of likes, views, clicks and other metrics that is currently transforming selves into financialized vectors of data. This financialization is complete, one might suppose, when the algorithms make the leap from machines originally meant only to assist human subjects, into the way these human subjects constitute themselves and think about themselves, their tastes and values, and their relations with others.

I admit I am having trouble at present differentiating between the perennial fogeyism that could always be expected of people who make it to my age (I’m 46), and the identification of a true revolutionary shift in human history. But when I check in on the discourse, and I witness people only slightly younger than myself earnestly discussing the merits of action-hero movies that as far as I can tell were generated by AI, or at least by so much market data as to be practically machine-spawned, I honestly think I must be going insane.

Spider-Man: Into the Spider-Verse looks essentially the same to me as these videos that have been appearing on YouTube using copyright-unrestricted lullabies and computer graphics designed to hold the attention of infants. “Johnny Johnny Yes Papa,” for example, now has countless variations online, some of which have received over a billion hits, some of which appear to be parodies, and some of which appear to have been produced without any human input, properly speaking, at all. It is one thing to target infants with material that presumes no well-constituted human subject as its viewer; it is quite another when thirty-somethings with Ph.D.s are content to debate the merits of the Marvel vs. the DC Comics universe or whatever. If I were an algorithm, and I encountered an adult human happily watching Spider-Man, I would greet that human with a “You may also like…” offer to next watch “Johnny Johnny Yes Papa” on a ten-hour loop. That is how worthless and stunting I think this particular genre of cultural production is.

Professional sport has long been ahead of the curve in depriving those involved in it of their complete human subjecthood, and it should not be surprising that FIFA and the NFL and similar operations are producing for viewers the one thing even more stupid and dehumanizing than Hollywood’s recent bet-hedging entertainments. Is there any human spirit more reduced than that of an athlete in a post-game interview? The rules of the game positively prohibit him from doing anything more than reaffirming that he should like to win and should not like to lose, that he has done his best or that he could have tried harder; meanwhile, the managers and financiers and denizens of betting halls are reading him up and down, not as a subject with thoughts and desires at all, but as a package of averages, a bundle of stats. This process of deprivation was famously (and to much applause) accelerated in recent decades when new methods of mathematical modeling were applied in managerial strategies for team selection and game play. In even more recent years the tech companies’ transformation of individuals into data sets has effectively moneyballed the entirety of human social reality.

I see this financially driven destruction of human subjecthood as the culmination, and the turning inward and back upon ourselves, of a centuries-long process of slow mastery of the objects of our creation as they move through the natural environment. The first vessels to cross oceans simply set out as singular physical entities, as wood in water. But by the age of global colonialism and trade, ships were not just physical constructions. They were now insured by complicated actuarial determinations and economic commitments among men in the ships’ places of origin, and these operations, though they left no physical mark on the individual ship that set out to sea, nonetheless altered the way ships in general moved through the sea, the care the captain took to avoid wrecks, to log unfamiliar occurrences, to follow procedure in the case of accidents.

It seems this transformation, from physical object to vector of data, is a general and oft-repeated process in the history of technology, where new inventions begin in an early experimental phase in which they are treated and behave as singular individual things, but then evolve into vectors in a diffuse and regimented system as the technology advances and becomes standardized. In the early history of aviation, airplanes were just airplanes, and each time a plane landed or crashed was a singular event. Today, I am told by airline-industry insiders, if you are a billionaire interested in starting your own airline, it is far easier to procure leases for actual physical airplanes than it is to obtain approval for a new flight route. Making the individual thing fly is not a problem; inserting it into the system of flight, getting its data relayed to the ATC towers and to flightaware.com, is. When I first began to drive, cars, too, were individual things; now, when on occasion I rent a car, and the company’s GPS follows me wherever I go, and the contract binds me not to drive outside of the agreed-upon zone, and assures me that if the car breaks down the company’s roadside service will come and replace it, I am struck by how ontologically secondary the car itself is to the system of driving.

The transformation of planes and cars from individual things into vectors of data in a vastly larger system has obvious advantages, safety foremost among them. An airplane is now protected by countless layers of abstraction, by its own sort of invisible bubble wrap, a technology descended from the first insurance policies placed on ships in the golden age of commerce.

This wrapping makes it possible for rational people (I am not one of them) to worry not about singular cataclysms, but rather about systematic problems that generally result in mere inconveniences, such as multi-plane backups on the runway. It is not surprising, in a historical moment in which such structural breakdowns are easily perceived as injustices, as occasions to ask to speak with a proverbial manager, that in more straightforwardly political matters people should spend more time worrying about structural violence than about violence: more time worrying about microaggressions or the emotional strain of having to listen to someone whose opinion does not entirely conform to their own, than about violence properly speaking, the blows that come down on individual heads like waves striking individual ships or individual birds getting stuck in individual jet engines on take-off.

Someone who thinks about their place in the world in terms of the structural violence inflicted on them as they move through it is thinking of themselves, among other things, in structural terms, which is to say, again among other things, not as subjects. This gutting of our human subjecthood is currently being stoked and exacerbated, and integrated into a causal loop with, the financial incentives of the tech companies. People are now speaking in a way that results directly from the recent moneyballing of all of human existence. They are speaking, that is, algorithmically rather than subjectively, and at this point it is not only the extremely online who are showing the symptoms of this transformation. They are only the vanguard, but, as with vocal fry and other linguistic phenomena, their tics and habits spread soon enough to the inept and the elderly, to the oblivious normies who continue to proclaim that they “don’t like reading on screens,” or they “prefer an old-fashioned book or newspaper,” as if that were going to stop history from happening.

I have a book coming out soon, called Irrationality, which attempts to articulate some of these ideas (though some only came to me after submission of the final manuscript). I am struck by how much, at this point, what we still call “books” are no longer physical objects so much as they are multi-platform campaigns in which the physical object is only a sort of promotional tie-in. I have found myself coming away from discussions with my good PR people feeling vaguely guilty that I do not have enough followers on Twitter (five thousand is the cut-off, I think) to be considered an “influencer,” or even just a “micro-influencer,” and feeling dismayed to learn that part of what is involved in launching a book like this into the world is strategizing over how to catch the attention of a true influencer, for a retweet or some other metrically meaningful shout-out. You would be a fool to think that it is the argument of the book, the carefully crafted sentences themselves, that are doing the influencing.

And yet for me to try to insert myself into the metrics-driven system would be a performative contradiction, since the book itself is an extended philippic against this system. And so what do I do? I play along, as best I can, until I start to feel ashamed of myself. I contradict myself.

Am I just a disgruntled preterite, who couldn’t play the new game, couldn’t gain the following that would have made me believe in this new system’s fundamental justice? I know that ten years ago I was very optimistic about the potential of new media to help advance creative expression and the life of ideas, and I acknowledge that I have done much cultivation of my own voice online. Perhaps it is not the medium that I should blame for my present disappointment, but rather the limitations of my own voice. I admit I consider this possibility frequently. Perhaps the book isn’t even that good. Perhaps there is a bingo card out there already anticipating everything I say in it. Perhaps silence is the only fitting response to the present moment, just as it would have been fitting to put down one’s needle and thread when the first industrial looms were installed, and to do something—anything—else to maintain one’s dignity as an artisan.

I often think of an essay I read a while ago by a prize-winning photojournalist who had tracked down Pol Pot deep in Cambodia, had taken pictures of him, spoken with him, conveyed this historical figure’s own guilty and complicated and monstrous human subjectivity to readers. The essay was about the recent difficulty this journalist had been having paying his bills. He noted that his teenage niece, I believe it was, had racked up many millions more views on Instagram, of a selfie of her doing a duck-face, than his own pictures of Pol Pot would ever get. She was an influencer, poised to receive corporate sponsorship for her selfies, not because any human agent ever deemed that they were good or worthy, just as no human agent ever deemed “Johnny Johnny Yes Papa” good or worthy, but only because their metrics signaled potential for financialization.

My own book may be crap, but I am certain, when such an imbalance in profitability as the one I have just described emerges, between photojournalism and selfies, that it is all over. This is not a critical judgment. I am not saying that the photos of Pol Pot are good and the selfies are bad. I am saying that the one reveals a subject and the other reveals an algorithm, and that when everything in our society is driven and sustained in existence by the latter, it is all over.

What to do, then? Some of us are just so constituted as to not have quietism as an option. There are ways of going off the grid, evading the metrics, if only partially. One can retreat into craft projects of various sorts, make one’s own paper and write on it with fountain pens, perhaps get a typewriter at an antiques store and write visual poetry with it by rotating the paper and pounding the keys with varying degrees of force. But it is hard not to see this sort of endeavor as only the more twee end of the normie’s prideful declaration that he still has a paper subscription to the Times. It changes nothing.

As we enter our new technological serfdom, and along with liberal democracy we lose the individual human subject that has been built up slowly over the centuries as a locus of real value, we will be repeatedly made to know, by the iron rule of the metrics, that our creative choices and inclinations change nothing. Creative work will likely take on, for many, a mystical character, where it is carried out not from any belief in its power to influence the world as it is at present, as it may remain for millennia to come, but as a simple act of faith, as something that must be done, to misquote Tertullian, because it is absurd.

Human beings are absurd, or, which is nearly the same thing, irrational, in a way that algorithms are not, and it was this basic difference between these two sorts of entity that initially made us think we could harness the latter for the improvement of the lives of the former. Following a science-fiction plot too classic to be believed, our creation is now coming back to devour our souls.

One week after the above essay appeared on my personal website, I wrote a follow-up post entitled “Some Replies to Critics”. I may as well share that too:

I can’t possibly respond to all the comments I’ve received in response to the cri de cœur I wrote and posted on December 31 under the title “It’s All Over”, which was then re-posted by The Point Magazine a few days later. I really was not expecting so much interest in it. It was just another 3000-word essay, cranked out in under two hours in a Motel 6 in Bakersfield, and I assumed that it, like so much else I post here at jehsmith.com, was destined, if I may adapt a line from David Hume, to launch stillborn from the TypePad.

The essay touched a nerve, and most of the response to it was positive. One common mistaken interpretation, to which I want to respond, was that it amounts to an expression of “conservatism”. We are at a strange point indeed in our culture when a scoffing and dismissive attitude towards Hollywood entertainments such as action-hero movies, generated by market forces alone, may be seen as conservative. I continue to believe in a culture independent of these forces, and I bemoan the obsession of so many in our present age with monitoring the garbage output of the entertainment industry for signs of this industry’s affirmations of progressive values. I do not care about this industry. I think true progressivism consists in rejecting it, not in proclaiming the latest iteration of Wonder Woman “good” because it managed to stay on-message relative to some particular conception of feminism, or that some movie with Emma Stone in it is “bad” because the lead role should have gone to a person of color. Who gives a shit? Who has time for this kind of stuff? Woke celebrity-gossip-mongering is still celebrity-gossip-mongering, and no one is going to convince me that it counts as political in any meaningful sense. Let’s make our own culture instead, with bold new visions of what art might be, rather than pushing Hollywood business moguls to do it for us.

I admitted in the essay to a certain fogeyism, so let me pull out a fogey trope and tell you how things were in the old days. In my late teens I used to drive thirty miles each way to go to the nearest art-house video-rental store, in order to take back home VHS tapes of the works of Ousmane Sembène, rightly called the Chekhov of Senegal. I loved his cinematic language, and I felt that through it I was gaining access to a certain true depiction of Africa. This and similar experiences leave me nonplussed and, yes, a bit angry, when, years later, I find myself reading excited young people declaring on Twitter that, with the release of Black Panther in 2018, we are “finally getting to see Africa depicted in film”. But that is not Africa; that is some fantasy bullshit. Africa has already been depicted in film going back several decades, by great African directors such as Sembène, and you are just not working hard enough if you expect movies to be delivered up to you as mass-release big-screen entertainments. I recently gave in and selected Black Panther while on a very long flight. I turned it off after ten minutes or so. It was just too stupid. I predict moreover that it will not age well, that future generations of people committed to racial justice and equality will be baffled as to how anyone in 2018, ostensibly committed to the same, could have seen this vain moneymaker as a contribution to the cause.

Among bold new visions from the last century of what art might be, I definitely count, say, John Coltrane’s confiscation and deconstruction of “My Favorite Things,” and this brings me to comment on the frequent mention of Theodor Adorno I saw in the discussion of my essay (“like Adorno confronted by Twitter,” is what one commenter wrote). Of course I find any comparison to Adorno a compliment, but I note that while I disagree with him about jazz, I do not believe that Adorno’s views on jazz make him a “conservative”. I think there is something deeply wrong with a critical culture that can only read deep misgivings about a given trend in popular culture in this way. I like jazz, anyhow, and I like it because it emerged as, and at its best remains, the free expression of creative subjectivity. It is not the algorithmically generated output of market forces, even if, of course, as they always do, these forces later co-opted it and made it as bad as they could. I like mumble rap, asthma rap, and all the spontaneous creativity that these market forces have not yet figured out how to ruin, and I find ridiculous the billionaire rappers my age who, spoiled by these forces and evidently forgetful of their own pasts, are currently denouncing these new styles as “not real hip-hop”. I do not think there is anything Adornian about this sensibility of mine, even if I find much to agree with in Adorno’s analysis of the culture industry.

Still less is there anything Scrutonian. I will agree that Roger Scruton is, unlike Adorno, a true conservative, and I will say that I did not take the comparison one commenter made of me to him as a compliment. I believe the comparison had something to do with the fact that I seem, to those who are not reading carefully, to be defending “high-brow” tastes in art and culture. But this is not so. I like music made by hillbillies, swamp-dwellers, convicted felons, and peasants. I think opera is boring (though I do still go to see it with my mom, because she has a good time). I like wine, but my only criterion for considering it good is that it be cheap. I despise the affectations of foodies (I have however known at least a couple of foodies who seem sincere and unaffected in their devotion to this quest, so if they are reading, they should know I’m not talking about them), and every other nervous expression of class refinement that so many bourgeois academic leftists in America feel the need to cultivate. When I come and give talks in the US, and my hosts fuss over which of the local stylish restaurants to go to, I yearn to just ask them to drive me through Taco Bell and get me back to my room as soon as possible so I can get some work done. At least Scruton is open about the fact that his own preoccupation with such refined tastes is bourgeois and exclusionary. The academic progressives always have to weigh down their performance of social distinction with the language of sustainability and reassurances of the rich free-range life the animal on the plate had before it was slaughtered.

So enough with the “conservative” canard. This label does not apply automatically to every person who finds him or herself forced into the role of dissenter relative to the current deadening group-think and trivial distractions that pass for progressivism in the United States, and in particular in Anglophone social media.

I am also willing to admit that to some extent the essay boils down to the formula: “Here is some stuff I like, and here is some other stuff I don’t like,” and indeed that some of what I say about the stuff in the latter category might also apply, from a different angle, to the stuff in the former category. And in fact I don’t dislike some of the stuff in the “bad” category as much as I pretend to. I find “Johnny Johnny Yes Papa” uncannily fascinating, for example, not in spite of but because of its algorithmic origins. As for comic-book movies, by contrast, I will just say that I did not state my contempt nearly as strongly as I could have. I hate everything Spider-Man stands for, and I’ll take a reboot of a 19th-century lullaby about a naughty child eating sugar over a fantasy story about a vigilante crime-fighter any day, no matter how much self-styled “progressives” try to convince me it is my tastes that are conservative, not theirs.