The Substack Discourse and the Self-Referentiality of Everything

0. Where Is This Going?

I would like to join the debate about Substack, its promises and limits. But it’s going to take me a while to get there. Please be patient, and do not abandon me in the antechambers of the argument.

1. The Plastic Tower

You might recall the strange case of Matthew J. Mayhew, professor of educational administration at The Ohio State University. In late September he published an opinion piece in Inside Higher Ed enumerating the many supposed virtues of college football. A week later he issued, in the same venue, an abject apology for his first piece, in which, he now confessed, he had failed to recognise the various ways his support for collegiate athletics perpetuated white supremacy and failed to center the voices of people of colour. “I am just beginning to understand,” he wrote, “how I have harmed communities of color with my words. I am learning that my words —my uninformed, careless words— often express an ideology wrought in whiteness and privilege.” 

One could not help but try to imagine the struggle session to which Mayhew was subjected that week, from which he emerged as if reborn. It seems hard to deny that he is sincere in his follow-up piece (the common view that he is writing as if there were a gun held to his head misses the mark), but also that he is totally and radically converted from one way of seeing the world to another, a conversion that can typically only occur where there is significant social and institutional pressure. (For what it’s worth, I have long believed that college athletics programs are racist, and for that among many other reasons I have long argued for their abolition.)

Read both pieces for yourself and try to reconstruct what might be going on. What makes this particular road-to-Damascus moment so intriguing to me is what I was able to learn about Mayhew’s career prior to the conversion for which he was destined to become widely known. Although he and I are both technically academics, Mayhew is someone with whom I would have absolutely nothing to talk about if, by a twist of fate very unlikely to happen at this point, I were seated next to him at some rubber-chicken-and-ice-water teaching-awards dinner, let us say, in Columbus. I consider myself a pretty wide-ranging conversation partner. You tell me you work on cosmic background radiation or Antarctic ice-core palaeoclimatology or Jane Austen and I will be into it. I will recognise in you a share in a common project that unites us under the umbrella of the university as it was understood from the eighteenth century until around 2008.

Mayhew’s career, which begins well before that critical year but is also a harbinger of it, is one that has been built entirely on tracking and echoing the transformations of the university itself. He obtains research funding for projects with names like “Assessment of Collegiate Residential Environments and Outcomes”, and publishes in volumes with titles like The Faculty Factor: A Vision for Developing Faculty Engagement with Living Learning Communities. He has an h-index, according to Google, of 34, which indicates that he is doing whatever it is he is supposed to do according to the rules —which increasingly is to say the algorithms— that shape the profession. And this is where I think his spectacular public recantation is significant: hewing so close in his career to the vicissitudes of the institution that both pays him and constitutes the object of his study, sooner or later Mayhew could not fail to embody and express, through his own personal conversion, the conversion of higher education to whatever you want to call this peculiar new sensibility that has transformed large sectors of American society in the Trump era.

The United States has never been good at producing public intellectuals, but new trends in the present century bring our country’s public discourse even further from anything one might dare to call the life of ideas. As in every other domain of public life, a peculiar political polarisation has occurred: on the right (and among the defenders of classical liberalism, “reason”, the “Enlightenment”, etc.), the guiding lights are coming from psychology departments, or from that strange hybrid zone between psychology and business. Steven Pinker, Jonathan Haidt, and others are thus put on a public stage and expected to hold forth on all that is human, but their model of the human is one that for the most part extends back no further than the late nineteenth century, and that largely takes us as bundles of instincts nudged this way and that by stimuli. They are not humanists, in the significant sense of this term that extends back to the Renaissance, and yet they are adjuncting as humanists for a culture that does not know to expect any better.

Meanwhile, on the progressive left, the academic neighbourhood that is churning out public figures is even more tenuously rooted in humanistic tradition. Roxane Gay, Robin DiAngelo, Freddie deBoer (who is great when he’s talking about anything other than his academic speciality), and many others first entered into public life on the basis of their advanced credentials in the field of education, or of scholarly work focused on what happens in the classroom. I suppose if we are reading Rousseau or Dewey on the subject (just as if we are reading William James on psychology), we are maintaining our connection to humanism. But this is not typically what goes on in graduate schools of education. There you are more likely to find books with titles like How College Affects Students: 21st Century Evidence that Higher Education Works, to cite the title of one of Mayhew’s co-authored works.

As far as I’m concerned, universities are where you go to learn how to read Akkadian cuneiform tablets, the scansion of Ovid, and stuff like that. Of course, someone has to think about how to actually run the universities, and the laudable principle of self-government would seem to require that at least some academics devote a portion of their energies to compiling data on how well higher education works, though ironically this principle is being eroded at the same time as we are witnessing the proliferation of new epicycles of academic self-reflexivity.

Mine is to some extent an echo of a line Stanley Fish was pushing for a while (Fish’s postmodernism now appears positively humanistic in comparison with what came next): a university is a place for discovering universes in grains of sand, drawing these universes out for others to see, enriching society by connecting to and preserving bonds with things that lie beyond our society (Mexica temple architecture, quasars, Great Zimbabwe, whales). The turn to identitarian me-search and self-referential preoccupation with the university as an object of study —not the history of the university, but the university in its current administrative functions and social dimensions— is in my view a betrayal of the legacy of humanism, and I have resolved to spend the rest of my career, come what may, trying to preserve what I can of its surviving threads, like some sombre Isidore of Seville in the very last moments of late antiquity.

I believe that everyone, for the sake of their own thriving as human beings, should be required to study at university only things that have nothing to do with their own life up until that point. Curricula should not be made to be “relatable”; students should be encouraged rather to discover and cultivate relations to ideas, values, and traditions they had not previously known to exist. This is the ideal of the university that was still more or less intact when I was an undergraduate in California in the early 1990s. It is certainly the ideal that reigned at the University of Leningrad when I went there as an exchange student in the waning hours of the Soviet Union. The USSR developed world-class traditions in archaeology, linguistics, philology, etc., in much the same way it produced cosmonauts and world-class Olympic athletes even amidst constant economic hardship. Give me a choice between the late-communist university and the late-neoliberal university, and there’s no question which one I prefer: I prefer the one that hasn’t forgotten what the humanities are.

2. Words and Objects

The turn to self-reflexive study, I take it, is part of the increasingly desperate justificatory project of an economically unviable sector, exacerbated by the financial crisis of 2008, the expanding tuition bubble, and the rise of social media and what may justly be called the emergence of the social-media-academic-industrial complex.

If I have started with higher education, this is only because the lessons of its pathologies are easily transferable to other domains, notably to publishing, journalism, and entertainment. These are the fields that have been most significantly transformed by social-media-industrial complexes of their own. The International Grains Council has a Twitter account, but its expert updates on grains, rice, oilseeds, and dry bulk freight markets are not significantly shaped by social-media user engagement. Things are otherwise for books, movies, undergraduate teaching, and academic research funding. It is, namely, wherever an industry developed around what G. W. Leibniz would call “the commerce of light”, as distinct from the commerce of natural resources and foodstuffs, that things have gone particularly haywire over the past years.

To help us understand what is going on, it might be worthwhile to focus on the strange example of rhetoric around the topic of book burning. Recently a handful of trans-rights advocates have declared that they would like to see Abigail Shrier’s Irreversible Damage: The Transgender Craze Seducing Our Daughters cast into the flames; notably, a lawyer for the ACLU joined the call, revealing the increasing tension between the old liberal left and the emerging illiberal one. Defenders of classical liberalism piped up, as if on cue, with the usual lofty rhetoric about freedom of speech, and the go-to quote from Heinrich Heine to the effect that wherever they are burning books, they will be burning human beings soon enough. But if the liberals’ worries seemed to ring hollow, this probably had more to do with the historical contingency of books themselves than with our society’s forgetfulness of the timeless value of free expression.

Book-burning is plainly not the same thing today as it was when William Tyndale’s English translation of the Bible was thrown onto the pyre, and the church retained some realistic expectation that no eyes should ever land upon the word of God written in a national vulgate. Yet many today continue to talk about books as if they were rare and precious repositories of otherwise hidden truths, as if some unscrupulous Siberian hadn’t already created a pdf of Shrier’s work, available to any internet user able to master that all-important twenty-first-century Russian word: скачать, “to download”. I understand that it’s the symbolism of the deed that the liberals are worried about, but this symbolism only means what it does because it recalls another era of history, in which book-burning was more than symbolic, but in fact had the power to interrupt the flow of information.

The commerce of “light” in our era has grown capacious enough to include not just books, but also most disposable consumer goods, which are being produced far in excess of what the planet can handle, and which barely seem intended for consumers at all at this point. What I am about to say is impressionistic, of course, but I was struck during the upheavals this past summer at what a casual affair looting had become. While reactionaries as usual took the raids on Target as signs of “anarchy”, of a violation of the sacrosanct institution of private property by people who “don’t want to work”, nonetheless owners, concerned citizens, and police alike for the most part seem to have silently agreed to just wait it out, while the raiders, for their part, seemed to be moving at a relaxed pace, not giddy or “amped” as one might expect, but mostly just unconcerned about the consequences of their once lethal mission. (For what it’s worth, I am neither “for” nor “against” looting; what I am for is an amelioration of the social and economic conditions that cause looting.)

If today Target seems mostly to just leave its doors open in moments of civil unrest, this perhaps has something to do with the fact that over the past few decades the real economy has shifted from mid-sized physical objects, of the sort that Target is known for selling, to data. The greatest harm of looting, in this new order, is that it interrupts the normal flow of barcode-scanning and data-tracking that companies like Target value… but then again I would not be all too surprised to learn that even the looting behaviour was seized upon as an opportunity for a new sort of data-extraction: What aisles do people run to when the doors are flung open and the cash-registers abandoned? And as long as Target shoppers are still fulfilling their function as data-nodes, what great difference does it make if the goods they take out the door are temporarily reclassified as “free samples”?

And in truth, just like the talking Kylo Ren figures in the close-out aisles, for the most part words today just aren’t worth much. There are simply too many of them floating around out there. Religious authorities are as fanatical as ever, but they gave up the fight long ago over how and through whom their doctrines circulate. Books, secular or religious, are now, ontologically speaking, more event than object, even if you might still opt for a commemorative physical copy of a book that matters to you.

Ex-Nike employees, when they leave that position to start a business of their own, do not typically begin the new chapter of their careers by stitching together tennis shoes, but rather by looking for investors who are keen on “taking deep-learning AI to the next level” and so on. The shoes are an afterthought. Books are an afterthought too: hasty mass-produced tokens that follow upon a prior and more fundamental process of tracking and mapping consumer “preferences”. That these preferences are themselves a result of the absence of significant cultural points de repère —reference points— outside of this self-contained and self-reflexive system of preference-mapping goes without saying. The people who get rich from this system, in fact, would be very happy for it not to be said.

3. Tenuis famae fedora / The flimsy fedora of fame

Academic curricula, books, movies, action figures: all afterthoughts. Nor is the news industry safe from this broad historical shift; in fact this is the domain of public life that is likely exhibiting the most extreme symptoms. On the one hand people whose livelihood depends on the survival of the industry still try to pass off their publications as if the role these played in the circulation of information were unchanged since the era of the early modern broadside. On the other hand the contents of these newspapers have less and less to do with transmitting the thoroughly reported news of the world from far-flung foreign bureaus, and ever more to do with opportunities for self-expression of a select class of very young, very urban, very demographically and ideologically exceptional media professionals.

As is the case for academics, virtually all media professionals are labouring away in increasingly precarious conditions, uncertain of the future of their line of work. One consequence of these dire circumstances, again just as in academia, is the increasingly desperate display of self-justificatory and over-the-top professionalism. A friend of mine who is a stunningly talented essayist submits a review to the Los Angeles Review of Books, and it is returned to him some weeks later with lovely sentences deleted, paragraphs rearranged, thoughts mangled. To subject oneself to this for the Los Angeles Review of Books? Why? One can’t help but picture the editor worried about preserving his employment, finding reasons to cut up a perfectly fine essay simply in order to reconfirm and reestablish each day anew that this editor is what he says he is. As with Sartre’s waiter, who is working overly hard to be the very thing he is, one can’t help but think that this is all an inauthentic performance. But unlike the Parisian café and its assorted players, the inauthenticity of the edit is exacerbated by the fact that the whole system is collapsing, or at least transforming beyond recognition.

With all this in mind, it is not at all surprising to see media personalities venturing off on their own to a venue such as Substack, at least those whose work is characterised by a strong individual voice. In the most high-profile cases —Andrew Sullivan, Glenn Greenwald, Matt Yglesias— the defectors are not essayists writing reviews of art-history books for free websites like LARB, but rather commentators on our current political and cultural moment, moving away from publications that increasingly see the function of media as a venue for the self-expression and mutual reconfirmation of the views and interests of demographically and ideologically unusual urban young people. It is, let us at least agree, not necessarily a megalomaniacal unwillingness to be edited that would drive a person with a well-honed voice away from such an operation.

Of course, of course, peer-review in all its various forms is a desirable thing. The checks and balances provided by the co-workers and guild members we respect hold us up to higher standards than we are often able to hold ourselves up to alone. But when entrenched peer-review systems begin to break down, one is under no eternal duty to stay faithful to them, no more than one must fight to protect “books” as if these were an ahistorically stable thing. What one must fight to protect is the freedom of expression that books have historically embodied. And similarly, one must fight to protect the standards that peer-review would ordinarily protect in functioning institutions — if necessary, one must do it alone. Just after Glenn Greenwald had declared that he was leaving The Intercept, and the media-cum-social-media precariat worked itself into a frenzy about how irresponsible it is to write without an editor, again as if on cue New York Magazine featured a piece from Cazzie David entitled “Too Full to F***”. “You can’t always make room for a dick,” the lede explained, “if you’ve eaten dessert” (at first I understood the author’s dilemma as arising from the prospect of a second dessert). This is, to say the least, a reflection that would surely have been able to get its principal point across without the oversight of an editor or the imprimatur of a major media company.

Mine is not one of the high-profile Substack accounts, and I am either indifferent to or in disagreement with the majority of the views of the three I’ve mentioned by name. Yet I am deeply sympathetic to their reasons for defecting. I signed up with Substack in late August, within an hour of having learned of its existence. I did not yet know what it was, other than that it was a potential source of income. You can read all about that hopeful moment here.

What has struck me over the past several weeks of raging debate about the significance of the rise of Substack is the way in which this new venue has been absorbed, like absolutely everything that gets talked about in the social-media era, into the vortex of self-reference. One of the effects of this uptake process is that one cannot simply write on Substack; one can only write-on-Substack, and to do so comes with a whole host of connotations that the generic activity of writing lacks. Matt Bruenig belittled this new outlet by calling it “WordPress with a payment processor”, which isn’t really so bad. A much harsher though formally similar line I came across held that it was “OnlyFans for dudes with fedoras”.

This hurt, of course, as it’s really not how I think of myself. But just like so much of the production and slotting of human social types that happens in the social-media age, one once again experiences a jolting rift between our inner sense of ourselves and the public, digital self that seems to have a life of its own and that is largely beyond our control. I know in my heart that the relationship between the sex presumed of me since birth and who I am today is a very long, very complicated story; yet on the internet there is no room for that — I am “cis”. I know I don’t feel as if the harsh description of a typical Substack user should apply to me; but on the internet feelings are no match for algorithms — here, my aura wears a fedora.

4. Keeping Two Books

In the beginning I thought Substack was a haven from the inane self-reflexive trap that dominates all public discourse in the social-media age. I was wrong about that. When the high-profile authors began to defect to Substack, I was happy, as I believed that, here as elsewhere, a rising tide lifts all boats. A recent profile of the new platform in the Columbia Journalism Review shows why I was wrong about this too.

In “The Substackerati”, Clio Chang asks whether the “newsletter company create[d] a more equitable media system—or replicate[d] the flaws of the old one.” As I should have expected, the founders of Substack got their project going only thanks to a significant injection of venture capital. This sort of injection does not come without strings, and in our age it is inevitable that these strings attach a newly funded project such as this to the same economy that drives all social media.

The founders thus got into the practice of assigning a “Baschez score” (named for a former employee) to prominent Twitter users in view of their engagement metrics. A score of four fire emojis would be sufficient to earn a user an invitation to move over to Substack, perhaps with some added financial incentives, even including the sort of advance you might once have expected to get for landing a book contract. Almost no one could hope to attain such a score, the founders admit, “without having already built up a reputation within established institutions.” In this respect, what is happening is best seen not as defection but as poaching, and the motion is from one terrible business model to another, newer one.

“Time and again,” Chang writes, “journalists have seen venture capitalists barge in on their newsrooms with claims that they’ll solve the industry’s problems, only to end up losing their jobs or being forced to churn out clickbait.” Substack’s leaderboards, “which were originally conceived to show writers what kind of ‘quality work’ was being done on Substack, were organized by audience and revenue metrics.” What Chang might have added is that on Substack as in academia, as in book publishing, as at the New York Times, the surest way to get those metrics is to make your stuff about the thing you’re doing, to turn your words away from the world and towards your own project of self-justification.

Just as I would like for universities to be places where people go to study Akkadian clay tablets, rather than, say, online assessment methods or the efficacy of multiple-choice tests, I would like to write on Substack about such things as whale falls, exoplanetary biology, Turkic comparative linguistics. Inevitably, though, what always gets the most engagement is stuff like what I am writing right now, “media criticism”, words that might pretend to “have their finger on the pulse” of our screen-addicted age, words that are entirely self-referential, just like more or less everything that has the hope of gaining any traction online. Substack is not a safe space, a haven from all the forces that deform and pervert language elsewhere on social media, and in all those fields that have entered into an industrial complex with social media.

The best solution I can see is to continue with the approach I have already begun willy-nilly to deploy — an “esoteric doctrine” every other week, for subscribers only, and an “exoteric” one, written in the vulgate of the internet and freely accessible to all, for the weeks between these. The exoteric one in any case matches fairly closely with the topic of the book that I am currently finishing (The Internet Is Not What You Think It Is, to appear from Princeton University Press in early 2021), and I’m told it’s good to “get out there” and promote one’s work in this way: to keep the plate spinning as it were, to make sure the book-object attains the status of book-event, and hovers there for at least a while.

5. The Message

Marshall McLuhan’s disciple, the great Walter J. Ong, added significant historical depth to his teacher’s line about the medium being the message by noting that, in fact, reverence for “the Word” has been a cornerstone of theology and metaphysics ever since we began to write and, in writing, to reflect on the strange powers and mutations of which language is capable.

Now logos is a difficult notion to interpret, but we are at least safe in saying that “In the beginning was the Word” translates its most famous occurrence far better than, say, “In the beginning was the Discourse.” In its current usage, this latter term is a faux-lofty ironising of something that would more properly be described as “chatter”, or, to invoke Heidegger’s lovely German term that sounds exactly like what it is, Geplapper.

The Word, which we may understand to designate language in any of its exalted uses, reveals the world to us. Discourse conceals the world from us behind words. When all available media for writing funnel us, willing or not, into “the Discourse”, you may be fairly certain that primary responsibility for the use and flow of words has been relinquished to machines, which, stupid as they are, mistake our language for data.

Do you want to read the “esoteric doctrine”? Then subscribe!
