America has always asked Black people to give everything we’ve got and then give what we don’t have. And if we did not give it, it was taken wilfully – plundered, as Ta-Nehisi Coates wrote.
Biles’s courageous decision echoes the actions of other Black public figures, from Naomi Osaka to Leon Bridges, who are refusing to sacrifice their sanity, their peace, for another gold medal or another platinum record. They are helping us all build a new muscle, helping us put one simple word at the top of our vocabulary: no.
Some might wrongly view this refusal as a symptom of millennial dysfunction and entitlement. The truth is that many of us came of age against the backdrop of 9/11 and the pyrrhic “war on terror”. We entered the workforce in the midst of the Great Recession. We cast our first votes for a Black president, only to then witness a reign of terror against Black people, young and old, at the hands of the state. (Not to mention the traumatic four years under our last president, who dispatched troops to brutalize peaceful protests for Black lives.) We are tired. We are sad. As the brilliant musician and producer Terrace Martin, perhaps best known for his work with Kendrick Lamar, told me recently: “I don’t know anybody sleeping well.”
I’m reminded of that great scene in the film Network, when the unstable newscaster convinces viewers all over the country to rush to their windows and scream out into the street: “I’m as mad as hell, and I’m not gonna take this any more!” Might Biles’s act of deep and brave self-love spark the largest wave of refusal in the history of this country? I believe it’s possible.
We are, I believe, witnessing the beginning of a great refusal, when a generation of Black Americans decide to, in the words of Maxine Waters, reclaim our time. Simone Biles, famous for what she does in the air, has shown the way by standing her ground.
The criticisms of this position are predictable, and Piers Morgan has, predictably, already come out to ridicule Biles’ exit, highlighting her apparent relinquishment of responsibility whilst at the same time epitomising the hysterical levels of pressure put on these individual athletes, apropos of nothing:
Then Biles said something really extraordinary and illuminating: ‘I feel like I’m also not having as much fun. This Olympic Games, I wanted it to be for myself, but I came in and I felt like I was still doing it for other people. It hurts my heart that doing what I love has been kind of taken away from me to please other people.’
You’re not just at these Games for yourself, Simone.
You are part of Team USA, representing the United States of America, and hundreds of millions of American people watching back home, not to mention all the sponsors who’ve paid huge sums to support you.
And when you quit, you were performing as part of a gymnastics team, not yourself.
It’s also not supposed to just be about having fun.
The Olympics are the pinnacle of sport – the ultimate test of any athlete. They’re supposed to be very hard and very tough, physically, mentally, and any other way you care to name.
Biles blamed social media for her new-found nerves and self-doubt.
Morgan, as ever, is extraordinary and illuminating all on his own. Of course, no one should care what he has to say, but he surely epitomises precisely what is being refused.
Beyond the entertainment news drama, what is interesting about Gerald’s essay and Morgan’s diatribe is, of course, that, together, they embolden this sense of a “Great Refusal” first put forth by Herbert Marcuse in his discussions of May ’68. Though Marcuse’s argument is in favour of revolution, and the Olympics may seem more like the deferral of such sentiments than a rupture to the status quo, Biles’ exit from the competition does bear “the mark of social repression” — to quote Marcuse — “even in the most sublime manifestations of traditional culture”.
Though its political impact may seem diffuse, it contains echoes of a wider rejection of the capitalist work ethic and the social machine that implements it, affirming the Olympic Games not as the spectre of international quasi-capitalist competition but as camaraderie and teamwork across “national frontiers and spheres of interest”. To quote Marcuse once more, it is an action shadowed by
the specter of a revolution which subordinates the development of productive forces and higher standards of living to the requirements of creating solidarity for the human species, for abolishing poverty and misery beyond all national frontiers and spheres of interest, for the attainment of peace.
Matheus shared an old blogpost of mine on Twitter earlier today, in which I argued that the right can’t seem to recognise memes anymore. This sentiment is relevant in Peru right now, he notes, because “Rafael López Aliaga, face of the Peruvian extreme right,” has used “the example of ‘imagine you have two cows …’ to explain ‘communism'”. This is presumably a dig at the newly elected socialist Pedro Castillo. “Such an ‘explanation'”, Matheus explains, “had become a meme during the last campaign in Peru to ridicule the intellectual flatness of the right.” Basically, Aliaga takes a meme ridiculing right-wing repetitiousness, and repeats it, humiliating himself.
We’ve certainly seen similar things happen over in the UK, but it is a particularly interesting example and made me think of a few things that I’d like to add to the old argument, as a way to explain why exactly the contemporary right has lost its meme literacy. (And how the left occasionally falls into some of the same pitfalls.)
This is important to consider because the far-right was not always this inept. What defined Trump’s memetic election in the US in 2016 was, of course, a new “alt” right coming to see itself as a movement of culturally influential political outsiders. And in being outside of the liberal orthodoxy, if not communicative capitalism’s networks of information exchange, its message was able to spread because it made convincing attacks on the system at large. It was an invading virus and, appropriately, went viral.
The viral analogy is important precisely because it captures the paradox at play. When the right started to meme, it infected the Republican Party in the US with a virus of its own creation. It got inside and began to replicate. This replication is a kind of creative destruction. Viruses attack the very systems that are allowing them to spread. And, as with a real virus, once a meme starts to replicate inside a system, it can start to cause serious damage through this replication process, perhaps to reputation or hegemony. But it is arguably the immune response, the fever, that makes us feel the most rotten.
What was so effective about the alt-right’s memes wasn’t just the types of messages they were sending, which were often aesthetically innocuous and banal, but the perceived damage that the meanings of these symbols were doing to the world at large. The meme itself isn’t that telling; what really makes us sit up and take notice is the immune response. Beyond the realm of biology, an immune response isn’t always a good thing. It shows what sets you off, reveals your weaknesses — at the level of politics, maybe even your irrelevance within the modern world. This is rule 101 of internet trolling: whatever makes you mad is immediately used as a point of leverage. That an idea, even a joke, can wholly disrupt and humiliate neoliberal management systems in this way is a powerful thing.
But what happens when you win? The alt-right’s meme campaign was successful. Their memes were no longer challenges to reality but became reality — and when that happened, all of the alt-right’s big names were gradually flushed out of the system. No longer “threats”, they faded into irrelevancy, digging their own graves, no longer on the outside pissing in but shitting themselves in public. Once the system built up an immunity to their provocations, working with their suddenly unavoidable presence, their repetitious personalities became old hat quick and were neutralised.
Though producing some juicy schadenfreude as the most nauseating pundits found their careers disappearing in media quicksand, we nonetheless transitioned into a world where we’re supposedly meant to accept the presence of a new right-wing, just like the coronavirus. “It’s like the flu, it won’t go away, we have to learn to live with it”, all the while ignoring the ways that we complacently allow or otherwise encourage the virus to replicate. It’s a bit like the “pingdemic” going on here in the UK, where the national immune response comes under attack, rather than the virus itself. In fact, most right-wing pundits in the UK have struggled with the pandemic precisely because they cannot see the virus, and so do not “see” the need to attack or react to it on an individual level. They fight for “common sense”, but nothing is more ideological than the apparent absence (or illegibility) of ideology. In much the same way, that the right cannot “see” memes is demonstrative of how complicit they are in the system under assault. They cannot see them because they are no longer invaders, and that is a problem when the virus is still ripping your cells apart.
It reminds me of a Zizek meme someone sent me the other day. Indeed, that blindness to your own politics is “pure ideology” is an acutely Zizekian point to make. But Zizek really is the perfect reference here, for both memes and the coronavirus response. To stick with Aliaga, that he is using “imagine you have two cows” to denounce communism as an ideology, when the whole point of that joke set-up is that it can be used promiscuously to illustrate the ridiculousness of any ideology, perfectly demonstrates Zizek’s point in The Sublime Object of Ideology, when he writes:
the social effectivity of the exchange process is a kind of reality which is possible only on condition that the individuals partaking in it are not aware of its proper logic; that is, a kind of reality whose very ontological consistency implies a certain non-knowledge of its participants — if we come to ‘know too much’, to pierce the true functioning of social reality, this reality would dissolve itself.
This is, in a way, capitalist realism. The “two cows” meme, in its original usage, represents certain gaps in an ideological consistency. It is a virus that invades from without, encouraging us to throw off our non-knowledge; encouraging us to understand the subjective position that would allow us to laugh along. But when the far-right appropriates this kind of joke, glossing over its dissenting message, repackaging it — or trying to — as a message that works in its favour, it shows how reliant on a certain non-knowledge the right really is. The meme, in their hands, only functions as they intend it to if the person reading it has no knowledge of its other uses — that is, of its other ideological applications.
We saw the same thing happen to Trump. He began as a provocation, as a meme, as a virus that wasn’t going to change the system at large but just force itself into circulation. But once liberal society had built up an immune response, it was inevitable that he would be flushed out and humiliated, because he didn’t expand knowledge but relied on the ignorance of others to get by. That being said, if Trump taught the world anything, it was that he was never really a threat to the system as a whole, just a certain elite that had failed to recognise him as one of their own.
And he was one of their own. That was what was so funny about him when he first emerged. He represented a blind spot in a system that could not understand its own offspring. The alt-right, in particular, took advantage of this, taking Trump’s “joke” candidacy and inverting it, making not a mockery out of him but the wider system that could allow him to exist. And yet, once they had succeeded, the joke wasn’t funny anymore — not simply because Trump is bad but because jokes like him are a kind of jouissance.
This is what made Trump the “accelerationist” candidate for some people — not because he was simply the worst candidate, although he certainly was, but because he was the surplus of the system itself. Trump is neoliberal surplus. And people don’t really like surplus — it’s uncouth and all too often taboo. Of course, we English don’t really have a word for it that isn’t wrapped up in economic language, but the French call it jouissance. The jouissance of sex is what makes sex so taboo, for example — for a God-fearing world that wants to insist upon sex’s utilitarian child-producing function, the pleasure of sex is surplus to requirements. So too are our other bodily functions — shitting and pissing might bring about their own sort of pleasurable bodily relief, but they are taboo activities because they are the evacuation of a filthy excess produced by the necessary (and often itself pleasurable) activity of eating and drinking. (It’s no surprise that our most excessive forms of pornography combine the two.)
This was how Trump inserted himself into a neoliberal system. To the Republican Party, he was like a big loud orgasm — the pleasurable release that their reactionary circle jerk had long been waiting for. For the Democrats, he was a piece of shit… But in each instance, he was the product of the other’s private behaviours and public policies. No one wanted to talk about him as one of their own, but he was nonetheless a product of their world. For the rest of us, it made sense. That a nation like America, having long played an integral and supremacist role in global capitalism, could produce an egotistical businessman as president was the country taking a deep self-satisfied drag on its own excrement. (The same with other ridiculous leaders like him around the world.) And that was the joke — at least at first. But the problem was that it was a joke too easily integrated back into the system at large. Trump started as (and remained) an “accelerationist” candidate for some — notably Zizek — but, in the end, he only helped demonstrate how the system at large functions. Because it is surplus that capitalism exploits most effectively. We are hooked on excess. It is this surplus that also keeps us stuck in place. Our desires are overfed, which at once produces a permanent desire for more, for revolution, but also keeps its realisation at bay.
Roland Barthes writes about this in his 1973 book The Pleasure of the Text, exploring how literature contributes to this same dynamic of consumptive excess. Simply put, the intensive experience of reading about the details of another’s life, of creating a life through fiction and immersing oneself in it, is an aesthetic jouissance. It is an excess that satisfies us, entertains us, preoccupies us, but it also rouses us, even forces us to break with our own subjective positions. This is what jokes do, and what cinema and theatre notably do for Bertolt Brecht. It is also what memes do, at their finest. The wojaks of this world are dangerous precisely because they are consciousness-raising tools. Wojak is a readerly subject who expands (maybe even annihilates) the subjective position of the individual and introduces him (and it is often “him”) to others like himself.
Memetic repetition builds momentum in this way, through its surplus, but it is worth noting that the memetic is dependent on the new in a way that is other to communicative capitalism as a whole, precisely because memes are surplus to the system as a whole. They are what is naturally produced when audiences have chewed up and digested the culturally familiar. They are truly beholden to a (post)modernist “(re)make it new” imperative. This is different to how capitalism functions because capitalism relies on stasis. As such, when a meme becomes a catchphrase, oversaturated in its usage, maybe utilised by a corporation or a political leader, it has lost its power. It is no longer new but a familiar novelty. It no longer challenges but is part of a collective language that can only ever repeat. That is why new memes are essential to the meme market.
The new, Barthes notes, is one way that jouissance appears to us. The new, more than the familiar, is rousing. Not always, of course – “nine times out of ten, the new is only the stereotype of novelty” – but jouissance “may come only with the absolutely new, for only the new disturbs (weakens) consciousness”. What separates the new from mere novelty is perhaps a kind of excessive value in itself. Novelty is the new already captured. For Barthes, the new “is not a fashion, it is a value, the basis of all criticism”. The new, no matter how it appears to us, shocks us out of our complacency. It is always surplus to requirements by exceeding the bounds of the familiar. It is in this sense, Barthes argues, that “all official institutions of language are repeating machines: school, sports, advertising, popular songs, news, all continually repeat the same structure, the same meaning, often the same words: the stereotype is a political fact, the major figure of ideology.” This is what happens when Aliaga repeats a meme once used against him. He does not puncture meaning but reveals his own ideological blindness. He adopts the familiar as if it were new, and humiliates himself. The absolutely new, instead, confronts the familiar, and it is blissful and thrilling when it does so. But the new is always doomed to become familiar eventually. It is caught in this strange time warp, so familiar to us in the present. In response to this strange paradox, Barthes throws a peculiar and provocative accelerationist gesture: “There is only one way left to escape the alienation of present-day society: to retreat ahead of it”.
Barthes seems to insist we go back to the future. As in the film of the same name, when Marty McFly finds himself performing at his parents’ school dance, anachronistically shredding on his guitar, like Chuck Berry with a few Hendrix moves thrown in. He is caught up in his own mode of expression, exercising genre tropes he is abundantly familiar with, but unaware that the audience before him are gobsmacked. “I guess you guys aren’t ready for that yet”, he says when he realises that he has utterly exceeded the musical expectations of the occasion. “But your kids are gonna love it!” That is what is required of the revolutionary new, which is more than mere novelty. It must retreat ahead of itself – nothing less than engaging in a form of time-travel. We don’t know it when we see it on most occasions. But it is that which we always declare “ahead of its time” in hindsight.
Of course, Barthes is not so wanton in his approach as Marty McFly. He also considers the other side of the new – not just shocking difference but mutative repetition; “one can make a claim for precisely the opposite”, he says – “repetition itself creates bliss.” One need only think of dance music or certain kinds of surrealist comedy and the dozens of examples of blissful repetition found therein. But this repetition is still done to excess. It still makes a mockery of pre-established value-structures. Barthes speaks of “obsessive rhythms, incantatory music, litanies, rites, and Buddhist nembutsu, etc.: to repeat excessively is to enter into loss, into the zero of the signified”. Capitalist repetition is of another order. “The bastard form of mass culture is humiliated repetition: content, ideological schema, the blurring of contradictions – these are repeated, but the superficial forms are varied: always new books, new programs, new films, news items” — new memes — “but always the same meaning.”
But Barthes also suggests that this humiliated repetition can be humiliated in itself. Isabel Fall’s appropriation of the “attack helicopter” meme, for instance, humiliated an over-used right-wing joke and made it radically new. There are strategies available to us that use capitalism’s humiliating categories against itself. But more often than not, right-wing meme appropriation is incapable of such a manoeuvre. They preach radical replication but only succeed in humiliating themselves, only making novel what was once new. (Even a try-hard left-wing meme culture falls into this trap, with the Zero Books crew in particular only capable of turning canonical theory into market novelty, whilst the new itself unfolds elsewhere.) They attempt to look like Marty McFly, but the rest of the world sees the tensions underneath. Just as Marty flexes his virtuoso chops, patronisingly telling the black backing band that they should try to keep up, we should note that they do so without a problem. The zero of the signified lurks in the background, playing rhythm, like black creators on TikTok who recently engaged in a cultural boycott, resisting the cultural appropriation of dances that they create, which are later picked up and monetised by white influencers.
This aspect of meme culture is more worthy of our attention than reheated 2016 culture wars. Indeed, memes require a certain negativity to function, a certain outsideness, and it is telling where that negativity most often emerges from. From words like “woke” and “Karen” to TikTok dances to the legacy of black Vine, the absolutely new predominantly emerges from below and, in our present cultural moment, from blackness. It does not come from those mechanisms of capture: “school, sports, advertising, popular songs, news”, to which we might add a whole number of other digital categories… But not all.
In a 2011 interview with Polish magazine Kronos, Ray Brassier shares a damning appraisal of the blogosphere. Asked about his “love affair” with the speculative realist movement – that is, his vague association and then disavowal of its aims and goals – he does not pull any punches.
I don’t believe the internet is an appropriate medium for serious philosophical debate; nor do I believe it is acceptable to try to concoct a philosophical movement online by using blogs to exploit the misguided enthusiasm of impressionable graduate students. I agree with Deleuze’s remark that ultimately the most basic task of philosophy is to impede stupidity, so I see little philosophical merit in a “movement” whose most signal achievement thus far is to have generated an online orgy of stupidity.
Looking over the state of accelerationism today — or “speculative realism”, “object-oriented ontology”, “patchwork”, or whatever other (relatively) recent post-Continental position has been boldly explored by certain groups of people online — it is hard not to feel like Brassier has been largely vindicated. From the collapse of accelerationism from a speculative realist politics into a formless mess, to the transformation of U/ACC into a Twitter badge for identifying an allegiance to a philosophy wholly misunderstood, to the prostitution of Mark Fisher into a log in the waves — a meme in the algorithms — of communicative capitalism, in all of its incarnations the blogosphere has seemed terminal as it struggles to resist the reterritorializing processes inherent within its own target audience. Considering the ultimate reterritorialization of accelerationism into a far-right mode, perhaps this infuriating process has finally proved fatal.
And yet, what remains clear is that, for all its work-in-progress flailing and meandering, when we actually look at what was said, resisting the present habit of retconning all arguments backwards and applying them to the twentieth century, we find that the blogosphere is far more consistent than it is often given credit for. In attempting to break not just with the twentieth century but with post-Kantian philosophy as a whole, it has persistently attempted to address questions of how it might forestall its own (re)capture by the system at large and its prejudices. And even though it has failed — not just once, but arguably twice — the political stakes of its questioning have only gotten more pressing. Because this is not just a crisis within accelerationism or the blogosphere more broadly but a crisis within philosophy and politics as a whole. Accelerationism, in its violent irony, has only served to demonstrate how disastrous a politics or philosophy of immanence can be if it does not remain vigilant to the ways the very processes it hopes to critique act upon its own fortunes.
This is to say that, if accelerationism’s greatest achievement is the generation of “an online orgy of stupidity”, this is largely down to its own accelerative nature. This philosophy’s initial questions and the secondary problems attached to them have fallen wholly by the wayside thanks to a process of abstraction that the accelerationists themselves were clearly not in control of. The blogosphere’s machinations are now so disparate and fast-paced, and its adjacent publishing industry so over-simplified and slow, it is difficult to curate a middle ground where this “orgy of stupidity” – nothing less than capitalism’s own hegemony of mediocrity – can be kept at bay. This is the problem of accelerationism, found both inside and out.
Brassier, then, is obviously not wrong, but it seems a little mean-spirited to focus our contempt on the internet alone. When these philosophical movements were first birthed, in the late 2000s, there was a vision of a bright new future emerging in all sorts of areas. In his book Post-Continental Philosophy: An Outline, for instance, John Mullarkey writes about how there was a new excitement in the air as disciplines, riding the waves of capitalism’s schizophrenic globalisation, came newly into contact with each other. “It is reassuring to see philosophy thinking with Leibniz and embryology and political resistance movements, or Plato and set theory and militant insurgency”, he writes, in reference to the work of Deleuze and Badiou respectively, newly embraced and utilised in the Anglosphere and deemed wholly appropriate to the new age.
Though they are two philosophers with divergent views of the world, Mullarkey argues that what Deleuze and Badiou at least share is a rejection of transcendence in favour of a newly immanent kind of philosophy:
Philosophy has seemingly come back down to earth from the inconsequential heavens of transcendence. Immanence means relevance, even when that relevance comes through the abstractions of mathematics (Badiou) or epistemology (Laruelle). As David Papineau put it, ‘nearly everybody nowadays wants to be a “naturalist”’. And everybody wants their ontology to be a political ontology too. But things are never so simple, alas.
Alas, they are not. That same year, Zero Books — having just released Capitalist Realism and a host of other titles that announced a new era of critical engagement outside of traditional and, more importantly, academic publishing — came out with Paul Ennis’s Post-Continental Voices, a collection of interviews with up-and-coming thinkers like Graham Harman, Ian Bogost and Stuart Elden. He talks to them about their journeys into philosophy, publishing and academia, and what they see as the future for all three, just over the horizon.
Graham Harman is characteristically self-centred, arguing that “the Continental philosophy of 2050 will be visibly descended from one or more of [the] branches” sprouting from contemporary conversations around Speculative Realism, including his own Object-Oriented Ontology. But the other interviewees in the volume cast their net a little wider, and are a little more speculative than merely self-promoting.
They speak more broadly of a future interdisciplinarity. Levi Bryant is palpably excited that philosophy is “once again … discovering its others.” Adrian Ivakhiv argues that “young scholars with inter- or anti-disciplinary leanings” should rejoice that their oft-rejected tendencies will soon be recognised as “the way of the future”. “The future is in the hands of those who know how to work on shifting and unstable grounds”, he suggests — clearly unaware of how torturously true his statement is for an emerging precariat. Ian Bogost is a little more nuanced, but argues much the same point: “I don’t simply mean to rename the well-trod path of interdisciplinarity, but to suggest that philosophy is re-entering the world in a different way from its predecessors.” And yet, from the vantage point of 2021, it is Jeffrey Malpas whose tentative vision of the future feels the most accurate:
Much as I would like to see a more open, engaged, and vibrant form of philosophy developing that is not bound by ideology, such a hope seems overly optimistic, and it is certainly not helped by current developments in higher education in the UK or Australia.
Malpas is, of course, referring to the general upheaval of 2010, not only in higher education but in all forms of life for young people. Indeed, it was a pivotal year for me and my generation. I had just turned 18 at the end of 2009, but being able to (legally) drink was less exciting than the prospect of voting in my first general election. I wasn’t politically conscious, by any means, but with most legal age limits easily circumvented, it was down to learning to drive and participating in democracy to provide a coming-of-age thrill – and I couldn’t afford the former.
I voted for the Liberal Democrats. There were well-thought-out reasons for doing so — we actually discussed the election at length in school — but, if I’m honest, the main reasons I remember being in favour of Nick Clegg’s party were the Lib Dems’ policy to legalise weed and block the government’s plan to treble university tuition fees. It was also an opportunity not to vote for the Labour Party. The 2000s were defined, even in my young mind, by the travesty that was the Iraq War. Getting rid of Blairism felt like a real responsibility for a new generation newly eligible to exercise their right to vote. I remember feeling like we’d been presented with an opportunity. This was a first, small step in shifting to a new world order.
In the end, the result was a hung parliament: David Cameron, leader of the Conservative Party, was sworn in as prime minister, entering into a coalition government with Clegg and his MPs. It was not the result that anyone my age wanted, but at least Clegg was there to reel Cameron in – or so we thought. By the end of the year, it was clear that we were all fucked, and we wouldn’t even be able to legally smoke in order to take the edge off.
In September 2010, I moved to Newport in Wales to start an undergraduate degree in photography and spent the next few months travelling back and forth to London at every opportunity in order to protest. (I’ve spoken about that period a few times before, most recently here.) I wasn’t directly affected by the trebling of fees – I was the last cohort to pay £3000 a year tuition – but I knew that, if I was any younger, I’d feel priced out of education in the years ahead. As I grew older, this near-miss just felt like a delay. I wanted to do postgraduate study but it increasingly felt like the preserve of a privileged few. Undertaking a Master’s degree six years later, that concern was proved correct. The utter evacuation of class as a discourse from everywhere but a few bold corners of postgraduate education was striking. (When Mark Fisher died in 2017, partway through my studies, it was all the more harrowing given his background: he was one of the only lecturers with a working-class background to speak of, and one of the few whom working-class students felt comfortable talking to about their concerns and experiences. It was striking how far the quality of the conversation on campus about class plummeted once he was no longer around to steer it.)
With the past decade of navigating higher education (and life in general) in mind, the future that the post-Continental voices had predicted feels like a naive pipedream. By 2016, interdisciplinary courses were not constructed by design but by accident. My Master’s degree was, for example, exceedingly interdisciplinary. Though an “arts” degree by name, it was theoretically focussed and, as such, welcomed philosophers, geographers, cultural studies people, designers, art historians, musicians, and more into its midst. This worked in my favour, personally speaking, as it allowed a twenty-something philosophically-curious late-bloomer to sidestep a flailing career in photography and arts administration and engage properly with philosophy without having to prove myself by retreading a pre-ordained academic pathway that meant spending £23,000 on an undergraduate do-over.
In truth, interdisciplinarity (and, indeed, anti-disciplinarity) was attractive for quite practical reasons too. The future was so uncertain, it was good to be able to adapt. (Something I did quite explicitly when the pandemic hit, pivoting on a dime from a job in arts administration into working as a proofreader and copyeditor — something I could not have done without taking that course.) But it is still readily apparent that the academic acceptance of interdisciplinarity was the direct result of course mergers and budget cuts, rather than progressive future-oriented thinking.
This is to say that, whilst a shift to the interdisciplinary appropriately transformed certain courses from specialties into umbrellas, it also reduced those subsumed underneath from courses to modules. Though a potentially positive change in principle, students were still limited by academic bureaucracy, able to choose just two modules to take for a grade. Auditing classes was a possibility, but schedules were hardly drawn up with this in mind. Interdisciplinarity was not encouraged for the sake of student fulfilment, intellectual exploration or even careerist adaptability, but as a cost-cutting measure, pure and simple. The apparent abundance of choice was undermined by more general administrative restrictions: there was, quite literally, more to choose from than anyone was permitted to take. That something resembling an interdisciplinary mindset survived these adaptations was down to the creative thinking of the precariously-employed lecturers who worked there, but even these attempts to make the best of a bad situation were repeatedly trodden on by clueless senior management.
I could go on, but the point is that, whilst the internet may be an “orgy of stupidity”, it is nonetheless true that academia’s turn towards neoliberal interdisciplinarity has transformed it into a clusterfuck of mediocrity too. We are more time-poor, and poor in just about every other way too. But the questions once asked and entertained on the old blogosphere are still available to us. What became an orgy of stupidity did not start that way. The strange thing is that post-Continental philosophy feels like a collective hallucination now, if it is remembered at all. We forgot all about its heresies, unorthodoxies and challenges to the status quo. We also forgot its exceedingly political mindset, which sought to truly update philosophy for a new era, rebuilding it from the ground up and creating new modulations, not simply applying the past haphazardly to the present, as if using the canonical thought of old to understand the postmodern present did anything other than capture and neutralise it.
We need only return to Zero Books for an example of this, its own downward trajectory a perfect illustration of the poverty forced upon us. The recent dramas around its turn towards aesthetics and politics are anaemic compared to what the old blogosphere had in mind. Consider Steven Shaviro, who has written some excellent books on aesthetics for Zero and elsewhere. In an old blogpost, he writes about how we must be prepared to “think outside the limits of thought that have been defined and legislated by neoliberal capital”. However, in a moment of reflexivity, he cautioned against giving the impression that his own interests were the best way to do this: “I’m not trying to claim that aesthetics resolves this in any way”, he says, “only that a radical rethinking of aesthetics is necessary, in order to re-find the values that Adorno and Marcuse found in the aesthetic, given that their direct hopes have been rendered obsolete by the expansion of the forces they described and deplored to degrees of exacerbation that they never imagined.” (The Urbanomic Redactions edition, Speculative Aesthetics, shows just how far-out this conversation was going.) Someone should tell that to the current Zero Books crowd, but it is just as true for the rest of us around these parts who continue to focus on the arguments of the past, even the very recent past, without any sort of acknowledgement that the dreams we parrot in the present all failed miserably.
The first blogosphere embraced this as an opportunity to start again. But we seldom talk about them in this way. They have essentially been forgotten. We know the names of those involved, of course, and love to throw them around, but who can tell us anything about what they argued over or fought for?
Take Mark Fisher. The other day, on Twitter, I was sharing sections from an email conversation Mark had with Graham Harman, which he shared on his k-punk blog. Though it is essentially Mark answering questions Harman had about the recently released Capitalist Realism, it totally captures the philosophical discussions going on around that book’s release and the release of a dozen other books in the years that followed. He presents Capitalist Realism as an attempt to wake a generation from its slumber, and specifically to make them aware of the battle lines he and others were fighting over. (A brief conversation with Vincent helped demonstrate how different his terms were to those now attributed to him by others.)
A few years earlier, for instance, in 2005, totally caught up in the Anglosphere’s Badiou-fever, following the English translation of Being & Event that same year, Fisher argued that:
The most productive area of conceptual discordance is that between Badiou and Deleuze-Guattari. Perhaps we’re in a position to use each to decode the blind spots of the other. Deleuze-Guattari have never been properly assimilated into Continentalism (the sad vitalist zombie that stalks the halls of the academy in their name is testament to that) because they too are philosophers of commitment, in which philosophy knows its place: as a theory of action, not a substitute for it.
But what of Badiou? I think Mark was still figuring him out. Badiou had roused him from his complacent Nineties cyber-Deleuzianism and into a newly political mode. But then, in 2010 and in response to Harman, we find the Badiou backlash in full effect. After the financial crash, it is revealed that neither Deleuze-Guattari nor Badiou were fully appropriate to that moment. Still, he casts both in relief to find the present in remainder:
No doubt, the Cultural Revolution of the 60s to which Badiou pledges allegiance had to happen — but we can’t keep acting as if the problem is a centralizing State or a Stalinist Party structure. At the same time, no simple return to a centralizing State or a strong party is possible either — which is why so many of Zizek’s political provocations amount to what Alex Williams calls “comedy Stalinism”. In many ways, I would argue that the “politics of the event” articulated (albeit very differently, of course) by Deleuze and Badiou is an elaborate apologetics for an actual political failure. The injunctions to keep faith with the event, the claim that Chronos doesn’t matter, only the aeonic event: both are a kind of theology of consolation, akin to Paul’s shifting in position when it became clear that Christ was not going to return immediately. Obligatory affirmationism conceals a surreptitious melancholy.
For me, Badiou’s value lies in his rousing encouragement for anti-capitalist struggle, his contempt for “democratic materialism” (the postmodern ontology of bodies and languages), what Peter Hallward characterises as the rejection of worldliness, and his periodisation of what we are living through as a moment of Restoration. But the central problem with Badiou’s philosophy as I understand it is its retrospective quality. Everything has already happened. It is literally preaching to the converted. The irony, of course, is … that it is hard to imagine anyone actually being converted by Badiou. But it is possible to imagine Zizek converting people; indeed, he had that effect on me, rousing me from my neoliberal slumber.
No doubt Badiou describes a certain kind of militant phenomenology… but what use are these descriptions? All anyone can say is, “yes, that’s what it’s like to attain a militant subjectivity”. But it seems to me that the important questions are how to engender that kind of subjectivity. What practical steps can be taken? Again, that’s what I appreciate in Nick Srnicek’s approach, the way that he instrumentalizes actor-network theory for leftist purposes. These questions are key: what are the actors in any particular network? How can these actors be affected? How can dominant networks be decomposed and new networks installed?
Where have these questions gone? Subsumed within the orgy of stupidity? Maybe. But even those smart enough to know better seem to have forgotten the philosophical character of the present. It’s always the same old faces, reanimated to spread their wisdom. Are they right? Are they applicable to today? I don’t think such questions ever get asked. It seems to me that we only reach back, at our most heretical, to Nick Land. He is the last “speculative realist” we seem capable of remembering and engaging with. Where did the post-Land blogosphere go, proliferating speculative philosophies far beyond his reactionary (post)modernism? Given the shifting grounds on which we live, it seems to me like we’ve reached back to the last stable period of thought that we can canonically — that is virtually, if not actually — remember. It speaks volumes that it is the Nineties.
Where were you in ’92? I’d just been born, right at the end of history. I find it more productive, these days, to ask myself where I was in 2008-2010, with a newborn political consciousness emerging under the shadow of an accelerationist conversation online, triggering a meandering path that, over the next decade, would come full circle back to the blogosphere. From the vantage point of 2021, I look back on 2008 as my Year Zero. That was the last time I remember the world as a whole swirling with an all-encompassing potential, only to dissipate into the disappointment of the decade ahead. It was a year of failures and banal phase-shifts, of “Yes We Can”, but there was a fury underneath that wasn’t sated by Obama’s pleasantries.
This is not to say we need to go back to that period and live there forever. We need only ask what happened to the questions asked of that time, and how their forms might have mutated since. That should be the only question worth a damn for anyone in the present orgy of stupidity, which does nothing but shield us from the pain of the present by keeping our thoughts firmly in the twentieth century.
I was very touched to get a pingback here. I feel a lot less open online these days, and I think a lot of that probably has something to do with the pandemic. A blogpost I wrote a year before lockdown, on the “value” of openness, is acknowledged as partial inspiration here, but Error gives me a taste of my own medicine as a result. I’d largely lost this drive towards the horizon over the last year, and though I can still blog without much internal resistance, it’s about the only outlet I have that doesn’t feel blocked and curtailed by the past eighteen months. So it means a lot to hear that blogger’s drive towards the horizon discussed by someone else, in a way that reminds me of its importance as well. Thank you for that.
My COVID situation has been, admittedly, not bad. I’ve been working from home, in a mostly empty house, in a city where I don’t have much of a network yet. I’ve been able to limit, likely more than most, my risk of exposure to the virus. Yet I feel stupider than I did over a year ago, even though I’ve passed the academic checkpoints I’ve needed to in my degree. The predominant feeling of COVID isolation has been horizonlessness. I noticed this early. A few months into the pandemic I commented on a post in the grad philosophy Facebook group, half-jokingly, that I finally understood why Spinoza defined sadness as a decrease in capacity to act.
The notion of a horizon is one I’ve encountered predominantly in phenomenology, mostly in Maurice Merleau-Ponty. It’s a word many phenomenologists use without defining or characterizing, maybe because they think its meaning is obvious. And it’s true the definition isn’t hard to grasp. Like the literal horizon, a phenomenological horizon is something within perception that suggests that there is more to see. For Merleau-Ponty, when we see the edge of an object we don’t just see the limit to which that object extends in our perception (like the black lines of a cartoon character) but, in addition, we see that there is more to see, that the object has a side turned away from us. Merleau-Ponty’s more controversial claim is that this is not a cognitive or rational inference that we make: i.e., in my past experience, objects that have edges tend to have back-sides that I could potentially see, X has an edge, thus X has a back-side that I could potentially see. Instead, Merleau-Ponty claims that horizons are a feature of the structure of perception itself. I see immediately that there is more to see.
The concept of a horizon occasionally gets extended to describe all cases where we intuitively sense there is something more. This is more like what I mean when I say the isolation from COVID lockdowns produces a feeling of horizonlessness, or, more accurately, a great shrinking of horizons. It’s not that I literally see fewer horizons (though this is perhaps also true, since I’m leaving my house less), but that there are now fewer promises of something more.
I’ve been equivocating here. I’ve talked about Spinoza’s view of sadness as a reduction in capacity to act and Merleau-Ponty’s notion of the presence or absence of horizons as if they were the same thing. A feature of Merleau-Ponty’s account of horizons is that they’re immediate to perception; they’re not conceptually or rationally mediated. Just as I don’t deduce or induce that the object in front of me is red (its redness is, as is sometimes said, given to me), I don’t make a conceptual inference that the object in front of me has horizons. But is that still true when we start talking about the bodily world of capacities to act?
Capitalist Realism is ultimately focused on … the ways that public institutions that haven’t and likely won’t be privatized have been forced … to participate in simulated markets, where a rigorous regime of testing on a set of metrics replaces the invisible hand of the market. It’s a governmental gambit driven at once by a desire to reduce funding across the board and to convince voters that they are taking the efficacy of public institutions very seriously. Since it couldn’t / can’t actually expose some public institutions to market forces through opening competition or privatization, New Labour established (and continues to establish) pseudo-markets, fake market-like games, for public institutions to compete in in order to obtain funding.
This is interesting, because it hones right in on how capitalism functions ideologically, and must translate things that do not fully coincide with it into terms it can understand. The NHS is an obvious example of the sort of institution in mind. Though it has been threatened with privatisation for decades, the NHS is essentially forced to participate in a simulated marketplace where other metrics replace the profit motive. “NHS waiting times” is surely the most overused metric of my lifetime. More recently, I’ve noticed how friends who work for the NHS have it beaten into them to refer to patients, et al. as “service users” — a clear euphemism for customers, with the suggestion being that “service user” satisfaction becomes another metric adjacent to capitalist nomenclatures, as if we are entering a future where hospital funding will be allocated based on TripAdvisor reviews.
I’ve recently been quite intrigued by how this sort of ideological processing is so widespread, and yet perfectly predicted by Lyotard in his book The Postmodern Condition. I particularly enjoy his discussion of the work of Talcott Parsons and sociological systems theory. For Lyotard, the computerisation of society has led to the flattening of ideology to whatever can be computed by neoliberal management systems. It’s fascinating, in hindsight, that this was written even prior to the fall of the Soviet Union. It didn’t matter if your socioeconomic system was capitalist or socialist or communist, in Parsons’ technocratic language either a “process or set of conditions either ‘contributes’ to the maintenance (or development) of the system or it is ‘dysfunctional’ in that it detracts from the integration, effectiveness, etc., of the system.” Either it fits or it doesn’t fit. And if it doesn’t fit, it should be ejected. This leads to a “perfectly sealed circle of facts and interpretations” that you either adapt to or are destroyed by.
“Traditional” theory is always in danger of being incorporated into the programming of the social whole as a simple tool for the optimization of its performance; this is because its desire for a unitary and totalizing truth lends itself to the unitary and totalizing practice of the system’s managers. “Critical theory”, based on a principle of dualism and wary of syntheses and reconciliations, should be in a position to avoid this fate. What guides Marxism, then, is a different model of society, and a different conception of the function of the knowledge that can be produced by society and acquired from it. This model was born of the struggles accompanying the process of capitalism’s encroachment upon traditional civil societies.
If we might translate Lyotard’s prose a little, the dualism of critical theory amounts to a theory of class antagonism. The system’s managers, previously known as the bourgeoisie or owners of the means of production, make all the rules. They legislate what takes place on the factory floor and in what manner. They induce discipline and necessitate servitude. Critical theory is, by necessity, a theory from below. It expresses and gives form to the material conditions of those who do not make the rules or otherwise generate knowledge but are nonetheless subjected to the knowledge and rules of others. Critical theory offers an alternative perspective, an alternative model of society, that is other to the received wisdom of the bourgeoisie.
The computerisation of society complicates things. Rules and decision-making are abstracted. Bureaucrats hide behind the limited purview of computer code. The reality is, unfortunately, not far off a Little Britain sketch — and, of course, it is a sketch that, at one point, takes place in a hospital:
Ideology itself is abstracted. Ideology is something that other people have, like an illness that invades from outside and frustrates social functioning. Reality is “common sense”; the closed system of paranoid reason. It’s not something that I think, it is a given, a truth provided to me by an impartial system that may be managed but ultimately manages me.
Remember Douglas Adams’ joke about the meaning of life? Humanity spends centuries constructing the most powerful supercomputer in the universe, all so it can ask it, once and for all, what the meaning of life is. The computer takes a few more centuries to answer the question and, when it does, the answer it gives is “42”. The joke is telling. Is it funny because the computer gives an ultimately abstract answer to an ultimately abstract question? Or is it funny because the answer is accurate but unintelligible outside our limited ideological purview? Or, on the contrary, is it the computer that is so limited by binary code that it can only express its answer in numerical terms?
There are numerous ways of thinking about Adams’ joke, but the sad truth is that we have constructed a globalised management system to provide meaning to our lives, and the answer it keeps spitting out is “live laugh love.”
I was thinking about all of this today in light of Helen Joyce’s recent book Trans: When Ideology Meets Reality. I haven’t read it but I have enjoyed watching Twitter masochists unearth the extent to which it is rooted in bad science and poor research (in explicit contrast to the sycophantic endorsements on its cover).
I don’t want to read it, not only because I’m not a TERF but because the title of the book tells me everything I could possibly want to know, and this has been confirmed by the response from some of its defenders. Many have criticised it as a typically TERF project, written completely in isolation, that does not speak to or feature comment from anyone who is trans themselves. The response is that it is not a book about trans lives but about policy. Twitter user @DamselDystopia summed up the lunacy of this best:
“It’s not about trans lives it’s about policy!”
Uh huh, policy on what, exactly? Monetary supply? Motorway bypasses? Or by any chance policy on trans people’s healthcare and use of public space?
But even without this argument being a pathetic exercise in obfuscation, it only further undermines the title of the book itself. If “reality” is here defined as policy and legislation, and not the lived experiences of certain individuals, it begs the question: how the hell do TERFs define “ideology”?
TERF ideology is gender realism and nothing more. Faced with the public acceptance of trans lives — the de-privatisation of queer experience — they create a culture war, nothing less than a simulated market within which they can explore their ideas. In simulated markets, the house always wins — they set the agenda, reduce funding for those in need, and implement certain metrics of their choosing in order to shore up the value-structure they believe is under attack.
When people say “there is no liberation without trans liberation”, this is why. The ideological sleights of hand that trans people must negotiate — and they’re increasingly less sleight, let’s be honest — go by a playbook that is integral to the heart of capitalism itself.
“Gender critical” discourse is not critical in the slightest. It cannot be. It is the very opposite of a critical theory, relying on anti-Semitic and “globalist” conspiracy theories to create the impression that it is not punching down. But its proponents are no different from bourgeois oppressors — they manage gender, instead of the factory floor.
Very sad to hear about the death of Dawn Foster yesterday. I did not know Dawn but genuinely loved her on Twitter. She was one of the most consistently entertaining and insightful people on the platform. It was fitting, then, that she was trending well into the evening. It was a bittersweet moment, and very reminiscent of when we lost both Mark Fisher and David Graeber. It is always wonderful to see how much impact a person’s work can have on people’s sense of the world and, particularly in Dawn’s case, their class consciousness.
I was actually just revisiting Dawn’s book Lean Out the other day. After Repeater Books published its 100th book, The Melancholia of Class by Cynthia Cruz, I was reminded that Dawn’s book was its first. I ended up pointing to it in a footnote for a book chapter I’m working on, wondering about the contentious history of the book’s sentiment. Everyone from Jacques Lacan and Luce Irigaray to Sadie Plant has made some argument about the radicality of a feminist leaning-out. It’s a miracle, in some ways, that someone like Dawn was able to make her case so poignantly and, indeed, make it go mainstream when you consider how abstruse many of those other names are… (There is a great interview with her on Novara Media about the book too, for the curious.)
I have nothing more to say that others haven’t already said on Twitter, but I did want to clip a few things that I came across over there, for posterity and for the unfamiliar if nothing else.
…Which led me down a bit of a rabbit hole, where I came across this clip from Good Morning Britain, in which Dawn calls Piers Morgan “morally reprehensible” without so much as blinking. (Unfortunately, the audio is broken from about 4 minutes onwards.)
That fearlessness seems to be the most prominent thing Dawn will be remembered for — and that really is something to be remembered for. The best example, Twitter seemed to unanimously agree, was an article she wrote for the Guardian in 2019, around the UK general election, arguing that “If Tom Watson had guts, he would quit Labour. Instead he is weakening the party”. (Apparently, Dawn was let go as a columnist not long after publishing it.)
Bask in the glory of her utter demolition of the UK’s hegemonic centrism:
In 1911 in Prague, the writer Jaroslav Hašek formed the satirical group The Party of Moderate Progress Within the Bounds of the Law, mocking the overtly accommodating tendencies of the Czech Social Democrats. Centrists might balk at Hašek’s manifesto promises, including mandatory alcoholism and the institutionalisation of feeble-minded MPs, but the tendency he mocked remains: proposing political reform, but ever so slowly; refusing to grapple with the speed with which the world is changing, or the fact the economy has for four decades been working for few but the wealthiest.
Centrist thinking is focused on two false premises. The first is that the 2012 London Olympic ceremony represented an idyllic high-point of culture and unity in the UK, rather than occurring amid the brutal onslaught of austerity, with food bank use growing and the bedroom tax ruining lives. The second is that the UK became divided by Brexit and the 2016 vote, rather than it being a symptom of long-term problems: the decline of industry and the public sector begun by Margaret Thatcher and continued by Tony Blair and David Cameron; vast inequality of opportunity, wealth and health; and the number of people being routinely ignored in a system with a huge democratic and electoral deficit.
The 2017 manifesto helped Labour to increase its vote share because it addressed so many of the problems faced by people and communities across the country. Labour won more seats, in spite of people like Tom Watson and his ideological bedfellows. Many centrist Labour MPs desperately wanted the party to lose heavily so they could depose Jeremy Corbyn. They still do. A Labour government with Corbyn in charge is less preferable to them than an indefinite Tory government.
If Tom Watson – the MP who said he “lost sleep” after Phil Woolas was found guilty of lying in racially inflammatory leaflets and stripped of his seat – had guts, he would quit the party and try to prove that his ideas have electoral traction. Yet, as he has probably discovered, it is hard to come up with bold and original ideas that benefit the electorate and prove popular with voters: it is far easier to stay in a party, wrecking it week by week, hoping to terminally undermine the leader and then inherit the ruins.
But the end result of Watson et al’s constant attacks will not be electoral success under another Labour leader, but a Tory victory. And the people who need a Labour government to change their lives and communities are unlikely to forgive people like him.
The children of [acceleration], you can run into them all over the place, even if they are not aware who they are, and each country produces them in its own way. Their situation is not great. These are not young executives. They are strangely indifferent, and for that reason they are in the right frame of mind. They have stopped being demanding or narcissistic, but they know perfectly well that there is nothing today that corresponds to their subjectivity, to their potential energy. They even know that all current reforms are rather directed against them. They are determined to hang on to something possible. […]
There can only be a creative solution. These are the creative redeployments that would contribute to a resolution of the current crisis and that would take over where a generalised [acceleration], an amplified bifurcation or fluctuation, left off.
The Tomorrow War is an intriguing film. [Major spoilers below.] It is something of an amalgamation of World War Z and Edge of Tomorrow, but it is also a fun and dynamic alien invasion movie in its own right. It has also crystallised something for me that I’ve felt for years but never quite known how to articulate…
When I was growing up, my grandpa loved old war movies. Mostly “prisoner of war” stuff, like Colditz and The Great Escape. I liked them too. They were often great family viewing. (At least in Britain, but let’s not go there…)
As I grew older, I remember being quite surprised about his love of such films. He fought in World War II, after all, and served as a navigator for the RAF. But he never talked about it. I don’t think he was under any illusion that war was something to enjoy or remember fondly. And yet, there was something about seeing this kind of romantic vision of wartime that was cathartic or calming for him, I think. It allowed him to relive what must have been one of the most affecting times of his life, but with a certain amount of distance and through a certain kind of soft-focus filter. He was just one man in a nation of men who, after the war, needed to tell themselves a certain kind of story.
In more recent years, it is interesting to see how that kind of film has developed with a new kind of veteran in mind. Ever since American Sniper came out in 2014, after a generation of veterans had started settling back into civilian life post-9/11 and the war on terror, there have been periodic film releases that insert a Chris Pratt or a Bradley Cooper into the mix — basically any young white contemporary American everyman — in order to tell a story (whether explicitly or implicitly) about duty and responsibility and, perhaps most importantly, the emotional toll of coming home. These films aren’t dramatizing what happened over there — this isn’t Jarhead or Black Hawk Down or a film from that generation of war movie — but what happens when it’s (supposed to be) all over. They are essentially PTSD films, told largely through flashbacks.
I have no problem with that kind of narrative. I find American militarism pretty nauseating, truth be told, but I do have a soft spot for films that explore its complexities. (I’ve seen American Sniper more times than I’d care to admit, actually — the result of a hangover from a childhood obsession with Clint Eastwood and his particular brand of reactionary anti-hero, I think.) As films, they can actually be quite charming, even if they are clearly made with a certain kind of ideological standpoint in mind. But what is telling, in consuming this sort of movie, is watching how that standpoint changes over time.
The Tomorrow War is fascinating in this regard, mainly because, through its time-travel drama, it facilitates a major subplot that explores the impact of intergenerational PTSD quite specifically.
Chris Pratt is a veteran of the Iraq War trying to kickstart a new life and make something of himself after the military. But it’s not going very well for him and he’s getting very sad and angry about it. When we meet him, he’s just walked into a house party he’s supposedly hosting, but he doesn’t interact with anyone there. He’s like a ghost, almost, with no time for anyone but his wife and daughter and, most significantly, some people on the end of a phoneline who might be offering him his dream job. But he doesn’t get it. And he takes the rejection surprisingly badly. The world fades out around him, as if this setback in his career is nonetheless taking him back somewhere much darker. There’s a dark sadness within him that is rising.
Alongside Pratt’s clearly undiagnosed PTSD, we learn about how he’s also deeply resentful of his father, played by JK Simmons. When he arrives home, he’s given an unopened Christmas card from the man, which he throws in the bin. (Though the narrative suggests Pratt goes dark over his failed job interview, this minor detail looms ever larger as the story progresses, as if the Christmas card is the real trigger for him.) Simmons, we later learn, came back from ‘Nam a broken man and wasn’t really present when Pratt needed him most. They’re estranged and not really on speaking terms, largely because Pratt refuses to engage with him.
Then the aliens arrive. Pratt goes into the future to fight a war, and whilst he’s there he meets his daughter, fully grown and now a colonel fighting off the invaders — and she resents him. She keeps him at a distance and later tells him some home truths (albeit related to a life he hasn’t lived yet). She tells a story about how, when she got older, he and her mother separated and he was a bit of a mess. In the end, just seven years later, she watched him die following a car crash, after they’d been estranged for years. It is a case of “like father like son”, as it turns out. Whatever was eating Pratt when we first met him devoured him whole a few years later. This disturbs Pratt greatly.
But something also clicks for him. Suddenly, you see this intergenerational picture being painted. Post-‘Nam dad is followed by post-Iraq son, and tomorrow war daughter isn’t really having any of it. Later, when Pratt is unceremoniously sent back to the past, having watched his future daughter die, he sets out on a redemption mission to destroy the aliens — frozen in ice on the Russian tundra, as it turns out — in order to make sure the war never happens and his daughter never has to die. But in the process, he ropes in his dad, and together they’re two shaken veterans — one of them maybe an alcoholic — doing what they unfortunately do best and trying to save the world.
The psychological picture painted here is fascinating. None of this really takes precedence. It is all back story; little details that paint a big picture, which is nonetheless a familial backdrop to a big spectacular alien invasion movie. But these little details change the film in quite a profound way, I think.
Despite how it might sound, this isn’t quite the gung-ho American militarism we’ve come to expect. It doesn’t have much ideological pomp about American exceptionalism, filling its role as the world’s police force. Pratt is sent on a suicide mission into a war that America (but also the world) is definitely losing. It’s Vietnam, yeah, but it’s also the Middle East. But then, the aliens are not the Viet Cong or the Taliban. This isn’t a fantasy do-over, winning the war that was previously lost. This is a film about a band of troubled veterans who truly want to redeem themselves, haunted by the things they’ve done or the world they might have created through their actions. This is a band of veterans turning the tables. A Vietnam war vet and an Iraq war vet fighting off an invading species. This isn’t Predator, with a tank-like Schwarzenegger fighting off the single alien guerrilla, getting his own back on the enemy and securing the cathartic victory otherwise denied him (although the aliens in The Tomorrow War do look like the monstrous lovechildren of a xenomorph and a Predator). This is a film about vets redeeming themselves by fighting off a horde of (notably white) invaders, rather than being one of them. It’s a film about war vets getting the sharp and sour taste of their own medicine, and wanting to use the time they have left to fight off an invading force. It’s a film about war vets stopping a war from ever taking place.
The Tomorrow War feels like a film for anti-war war vets, in this regard, dressed up as an overblown alien invasion movie. This feels like burnt-out American militarism creating a narrative where it gets to save the world from itself.
Understood in the most general terms as representations of ourselves, self-portraits are among the oldest art there is. The Cueva de las Manos (Cave of Hands) in Argentina, for instance, contains dozens of stencilled handprints that are almost 10,000 years old, and artists have been using themselves as models or tools for their paintings ever since.
But the owners of the hands in that Argentinian cave would hardly recognise the selves depicted in the portraits of the modern era. Indeed, there is no “self” as such represented on that wall. What we see is a group, a collective, a community. The “self” of a self-portrait, on the contrary, is something quite specific.
Philosophically speaking, the “self” is an abstract and heuristic concept for our experience of ourselves as individuals. Despite what we might now assume, we have not always thought of ourselves in this way. That we have forgotten our old senses of self is telling, however. The “self” is such a powerful and intoxicating concept that it overrides and manipulates all forms of self-understanding that came before it. But it is only by understanding this shift that we can appreciate the revolutionary stature of the self-portrait when it first emerged before us.
The Delphic motto “know thyself”, for example, is one of the oldest and most enduring sentiments expressed within Western thought. Discussed repeatedly by Plato, in no fewer than six of his dialogues, it served as a cultural touchstone long before even he put it on the page. But back then, to “know thyself” typically meant to know one’s place in the general order of things. It was a move away from individuation, as a source of ignorance and as a product of fear and isolation.
We can see this position adopted in other examples of ancient Western culture as well. Consider Sophocles’ most famous Theban play, Oedipus Rex, which was first performed in 429 BC, during the same decade in which Plato is estimated to have been born. Despite its age, the story of Oedipus has retained a considerable if anachronistic influence over modern conceptions of the self, ever since it was utilised by Sigmund Freud as one of the founding allegories of psychoanalysis. For Freud, the Oedipus Complex was his term for that strange and fraught process of self-definition, when we come to appreciate, often through tantrums of inchoate sexual jealousy, that we are ourselves distinct beings and are in competition with others for our mothers’ attentions.
However, although we interpret it very differently today, Oedipus’s quest is hardly a story of self-discovery and individuation. Initially, there is little question, in Oedipus’s mind at least, of who he is as an individual; at the beginning of the play, he could not be surer of this. His true self is only revealed when he fully understands his relations to those around him. The secret to be uncovered is not his own identity but that of his mother and father. It is not Oedipus’s true self but his true place in the social order that has been obscured from him.
Later philosophical conceptions of the self differed from this considerably, even though they often retained an interest in this ancient source material, as Freud’s particular reading of Oedipus Rex already suggests. Even Plato’s discussions of “knowing thyself”, often mentioned in the context of governance and statecraft, were later used to legitimate the authority of liberal governments in Renaissance Europe, inverting the tale of Sophocles’ doomed king to suggest that the self is not so easily reconstructed from our social relations.
Though it is, of course, influenced by those around us, the self is essentially our understanding of those characteristics that are innate to us alone. It is what is left of us when we strip back everything else that is otherwise shared. Intriguingly, this understanding of the self is only slightly younger than the self-portrait as an artform; both came into common parlance and practice towards the end of the Middle Ages.
For philosophers and political theorists at that time, the difference between the individual self and a collective subject was a novel but important distinction to make. It was René Descartes, writing in the 1630s, who first insisted upon such a distinction for philosophy. In his influential Discourse on Method, an autobiographical treatise on the very nature of thought and reason, Descartes hoped to provide a new methodology for separating truth from falsehood. To do this, he stripped back everything that, he believed, could not be trusted. Approaching reality with a radical doubt, he began to pretend “that everything that had ever entered my mind was no more true than the illusions of my dreams.” This included information gathered by the senses and just about everything else that came into the mind from the outside world. When all of this was discounted, Descartes was left with one thing – that is, the “thing” that thinks. “I noticed that, during the time I wanted thus to think that everything was false, it was necessary that I, who thought thus, be something.” “I think, therefore I am” was his resulting declaration, and with that he established the self “as the first principle of the philosophy I was seeking.”
This foundation was soon extended into other areas of thought as well. The politics of liberalism were also formalised at this time, for example, and were similarly built on a new conception of individual liberty and rights – the self as a first principle for politics also. A few decades after the publication of Discourse on Method, in his Essay Concerning Human Understanding, John Locke echoes Descartes’ philosophical position, writing that the “Self is that conscious thinking thing … which is sensible, or conscious of pleasure and pain, capable of happiness or misery, and so is concerned for itself, as far as that consciousness extends.” This Cartesian foundation nonetheless responds to certain political ideals. It turns out that, for Locke, this consciousness can extend quite far indeed, depending on your social status. In fact, by Locke’s measure, not every living thing was “conscious” of itself in the same way. As a result, though much of his work pays lip service to universal freedoms to be enjoyed by all individuals, this was not always true in practice, especially by today’s standards.
Locke argues that the word person – his supposedly “forensic term” for the self – “belongs only to intelligent agents capable of a law, and happiness and misery.” To be a person, then, echoing Descartes, is to possess a form of consciousness that can reason with itself; that can reflexively ascertain itself as conscious. But, in Locke’s hands, this was not the same sentiment as “I think, therefore I am.” Locke instead positioned the self as a reflexive being that thinks in accordance with reason. Rather than the reflexive self being a foundation upon which reason can take place, the cart is put before the horse. The self doesn’t just reason – it is fundamentally reasonable.
Some of Locke’s resulting conclusions are relatively innocuous. Under his criteria, an animal is not a person, for example, because animals do not have laws or experience emotions in the same way that humans do. (Something we are only more recently starting to challenge.) But neither, in Locke’s view, do supposedly uncivilised persons, whose rights do not warrant the same respect as persons from more “reasonable” societies. This suggestion was very influential, and particularly disastrous given Locke’s political influence over the colonisation of North America – an influence that can be seen explicitly in historical studies of the Transatlantic Slave Trade, during which the emotions of slaves transported to the New World, clearly expressing trauma and grief, were ignored, denounced, or simply not perceived.
The political impact of Locke’s “self” did not stop there, however. With a little help from Thomas Hobbes and his 1651 work Leviathan, “the self” also became a term for a kind of individual sovereignty, analogous to that of “the nation-state”. Self-knowledge was less defined by what we could be most sure of, as in Descartes’ formulation, and more by what we can claim possession of – whether that be the mind or the land underneath our feet. In this context, Descartes’ “I think, therefore I am” was soon extended into the realm of governance and property rights, making “I own, therefore I am” a more accurate founding doctrine for the politics of classical liberalism, settler-colonialism and, a few centuries later, neoliberal capitalism as well.
Whether in theory or in practice, it was already clear to many that “the self” was not the best foundation for a new era of thought and commerce. As such, Cartesianism, liberalism, and their legacies continue to be challenged by philosophers and political theorists to this day. However, given the ever-peculiar experience of being a conscious subject, Cartesian doubt remains an attractive starting point for many. Geopolitically, it continues to inform settler-colonial projects like the Israeli occupation of Palestine, for example. Pop-culturally, Descartes’ questioning of the existence of a mind-independent reality can be found in everything from Nineties Hollywood blockbuster The Matrix to a 2020 hit single by pop star Billie Eilish.
And yet, despite this persistent influence, Descartes’ supposedly novel conception of the self had already been subjected to considerable scrutiny in the arts by the time he wrote his Discourse on Method. In fact, the first self-portraits emerged around a century before Descartes’ birth, and by the time his thesis was published, the self had already been openly investigated, even mocked, as an unstable but nonetheless generative concern in various artistic movements.
What is particularly notable about these early self-portraits is that they were often knowing attempts to depict what Jacques Lacan would (much) later call the “ideal-I”. This ideal is formulated during what Lacan calls “the mirror stage”, a process during which a child first adopts an external self-image, and therefore a mental representation of the “I”. But this “I” is an idealisation, in that it is a stable mental conception of an unstable body. Our bodies, of course, change as we grow and age, and though our ideals shift and adapt alongside them, the ideal self remains forever out of reach.
Though Lacan would not theorise the mirror stage until the mid-twentieth century, it is a process analogous to the developments in self-perception that occurred during the European Renaissance, when artists began to consider themselves in an entirely new way. Their ideal selves, newly depicted on canvas, seldom corresponded to reality either, but this often did not matter. To depict an ideal and improve upon reality was instead seen as a virtue by many, as if to be able to paint something in a form more beautiful than nature was evidence of human exceptionalism and our capacity for self-transformation. This belief influenced a new Renaissance humanism whilst, for others, it demonstrated our direct connection to the divine. This attitude was as present in the self-portraits of the era as it was in Renaissance landscapes and still lifes.
The most famous examples of an artist depicting their “ideal-I” can be found in the works of Albrecht Dürer, who produced some of the most notable self-portraits of the 1500s. Though there were self-portraits before his, no other artist produced so many. The writer John Berger went so far as to declare Dürer “the first painter to be obsessed by his own image.”
Among his plethora of selfies, Dürer most famously painted an immensely handsome portrait of himself that was so popular it eventually went on display in the town hall of Nuremberg, Germany, where the artist was born, lived and eventually died. “I, Albrecht Dürer of Nuremberg, painted myself thus, with undying colour, at the age of twenty-eight”, an inscription on the canvas reads. It is a painting layered with unsubtle symbolism and amusing strokes of self-aggrandisement. Not only does Dürer look like a classic depiction of Jesus Christ, his initials – a stylised “AD” – double up as both his signature and an allusion to the calendric label for those years following the birth of Christ: “Anno Domini”, Latin for “in the year of our lord” – though it is unclear who “our lord” is supposed to be: Christ or Dürer himself.
Looking at this portrait today, one might expect Dürer’s self-image to be deemed sacrilegious, and most modern descriptions of the painting do mock it for the artist’s exuberant pride in himself, but in the early 1500s people flocked from far and wide to see the painting after it was put on public display. It may well have reflected Dürer’s hyperinflated sense of self as a master painter, as if he was on a par with God regarding the beauty of his artistic creations, but audiences seemed to agree with him. In fact, as a result of his work’s popularity, people became as obsessed with the man as they were with his paintings. Art historian James Hall notes that Dürer’s cascading curls were so famous that “after his death in 1528 his admirers exhumed his body to take casts of his face and hands, and cut lockets of hair.”
However, beyond these tales of early celebrity, John Berger proposes another, far more interesting reading of the artist’s self-obsessed body of work. He compares Dürer’s Christ-like image to an earlier self-portrait painted in 1498, in which Dürer looks no less regal but perhaps a little more anxious, like a young debutante entering society for the first time. Berger suggests there is “perhaps a slight over-emphasis on his being dressed up,” as if “the portrait half-confesses that Dürer is dressing up for a part, that he aspires to a new role.” Having recently travelled to Italy to see the latest trends within Venetian painting, and having heard the latest ideas shared by Italy’s art critics, Dürer no doubt “came to realise for the first time how independent-minded and socially honoured painters could be.”
Contrary to our modern interpretations of Dürer’s pride, Berger wonders if the young artist wasn’t so much a prima donna but instead the first to depict a new kind of self. He argues that modern viewers give too much credence to their “many complacent assumptions of continuity between his time and ours.” We should instead be humbler and acknowledge that understanding “Dürer historically is not the same thing as recognising his own experience.”
Berger argues that Dürer saw himself as a new kind of European man and was consistently fascinated by the cosmopolitan figure reflected back at him. Indeed, he was one of the very first examples of a “Renaissance Man”, newly aware of the potentials of his own will. “When he looked at himself in the mirror he was always fascinated by the possible selves he saw there”, Berger writes.
Though only beginning to emerge in the portrait from 1498, this is perhaps even more true of his Christ-like appearance in the self-portrait he painted two years later. Berger argues that it could not have been the painter’s intention to be blasphemous; he was a devout Roman Catholic, even after Martin Luther instigated the Reformation in 1517. This makes the painting aspirational rather than self-aggrandising. “The picture cannot be saying: ‘I see myself as Christ’”, Berger argues. “It must be saying: ‘I aspire through the suffering I know to the imitation of Christ.’” If Dürer was so self-obsessed, it was as a true narcissist. He hoped, more than anything, to be transformed.
Though one of the first, Dürer was far from the last artist to see himself in this way. His beautiful self-portrait is exemplary of a growing trend across Europe at that time, when artists were depicting themselves as notable members of society, rather than hired hands serving their rich and famous patrons. As a result, self-portraits took on an aura akin to contemporary headshots of famous actors and celebrities, and their grandiosity served a similar professional purpose as well. They were like all-in-one business cards or curriculum vitae, containing everything a curious patron might want to know about a person. First and foremost, they presented the viewer with both the artist and their skills, but they also occasionally advertised an individual’s social circle, as well as their personal interests and possessions.
In Italy, where self-portraits were especially popular, there developed a trend for artists to paint group portraits of themselves amongst various figures from high society – an early example of a professional portfolio, perhaps, or an antecedent to that Instagram staple, the group selfie, showing off your friends in their hottest outfits before you all hit the town. Less depictions of a group identity, these paintings were made specifically to boost the social standing of the individual painter, who usually occupied a central panel whilst surrounded by studies of his famous friends. It was the beginning of a transitional period, where the social subject, as a member of a community, was transformed into a social self, with a person’s popularity and friendships being adopted as an individual virtue.
The uneasy or exaggerated forms through which these representations of self were made manifest have never really gone away, but this is not a sign of their stability as aesthetic forms. Though they may have tried to adopt an ideal-I, the artists of the Renaissance did not eventually settle into stable identities. Ideal selves remained out of reach for the individuals concerned – although Dürer’s self-portraits were clearly adopted by others, coming to represent him in the popular imagination. For others, the gulf between self and self-portrait both narrowed and expanded. Allegorical self-portraits soon became popular, with artists inserting themselves into imagined scenes, and the psychological depth of such paintings provided further insight into an artist’s experience of themselves as an individual. Such an experience was not always positive. Soon enough, what began as sincere self-aggrandisement slipped into irony and irreverence, not to mention self-critique and self-deprecation.
Working almost a century after Dürer’s rise to fame, Caravaggio was easily the most infamous provocateur of the Italian Renaissance. His depictions of the self are evidence enough of this fact. Though he produced self-portraits in which he looks very handsome indeed, he was not partial to depicting himself as one of the rich and famous, like so many of his peers. All too familiar with the values and expectations of his patrons, particularly the Catholic Church, Caravaggio instead devised bold new ways to subvert them. These subversions did not go unnoticed. Unlike Dürer’s, his paintings were deemed sacrilegious acts, and his various controversies are well-documented.
These include hiring sex workers as models when painting commissions of the Virgin Mary and depicting the dirty soles of saints’ feet. But more interesting than his controversies are his self-portraits, in which he depicted himself in several surprising and even unflattering roles. Whereas Dürer hoped to be Christ reborn, Caravaggio saw himself as the devil incarnate. But his self-portraits were not demonstrations of a kind of pantomime showmanship, playing the villain for shock value and infamy; his unconventional selfies were often sincere and complex attempts at self-critique, even when dripping in irreverence.
In the very last years of his life, Caravaggio used his own likeness to paint the severed head of the giant Goliath, mouth agape and eyes bulging, with blood spurting from his ragged neck. In the original Bible tale, Goliath is a giant Philistine threatening the Israelites in the Valley of Elah. He is confident in his ability to squish all opponents who might try to challenge him, and so he goads the Israelites into sending forth a champion to duel him. A young shepherd, David, approaches the giant with a slingshot and some stones. To everyone’s surprise, David manages to knock Goliath unconscious with his ranged attack, before quickly chopping off his head. It is a tale today synonymous with upset wins, when unlikely underdogs bring down well-established opponents. Though it may not have held the same idiomatic associations as it does today, could we interpret Caravaggio’s depiction of David’s victory, with the painter casting himself as the dead giant, to be an expression of his own careerist insecurities? Caravaggio would die in 1610, and so the painting was one of his last works. Was he afraid some young new talent would knock him from his pedestal? Unfortunately, it seems that Caravaggio’s fears for his own head were far more literal. Indeed, many of his later works take beheadings as their subject matter, and each seems to express either a fear for his life or painterly pleas for mercy.
Caravaggio was in exile for much of his final decade. His reputation for fighting, insolence, and petty crime made him a target for criminals and law enforcement alike. But his reputation was ruined utterly when, in 1606, he killed a man named Ranuccio Tommasoni. One story goes that Caravaggio and Tommasoni had bet on a game of tennis and disagreed on the outcome. Another version of the tale suggests that Caravaggio was jealous over Tommasoni’s relationship with Fillide Melandroni, a local sex worker who had modelled for Caravaggio on several occasions – most famously as Judith in yet another gory painting of an assassination, Judith Beheading Holofernes. Papers released by the Vatican in 2002 seem to confirm the latter tale. Whatever the true source of their disagreement, the pair decided to settle their differences by having a duel. Caravaggio won that duel and attempted to castrate his opponent as punishment. Tommasoni died from his injuries.
Caravaggio had not intended to kill his rival and the fallout from the botched duel was complex. The painter’s life was turned upside down; he went on the run, travelling the length and breadth of Italy, and even spending time in Sicily and Malta. Whilst in exile, he painted himself as Goliath, but this was not the only painting to predict the painter’s imminent demise. He also painted Salome with the Head of John the Baptist, a painting in which the severed head once again looks like Caravaggio himself; Salome also resembles his former mistress, Fillide. The painting was presented to Fra Alof de Wignacourt, the Grand Master of an order of Maltese knights, as a gesture of goodwill. Having fled to Malta, Caravaggio was later driven off the island, either because news of his crimes had reached the Maltese noblemen, or because he still couldn’t behave himself and once again ended up on the wrong side of the law. No doubt already exhausted and tired of life as a fugitive, in repeatedly offering up his head on a painted platter, he longed for mercy. He didn’t get it.
Caravaggio’s self-depiction as Goliath is just one example of his work’s psychological and reflexive depth. Less graphic but no less self-destructive, he also painted himself as Bacchus, the Roman god of wine and fertility, as well as madness and festivity; otherwise known in Greek mythology as Dionysus. Rather than a jovial character, Caravaggio’s Young Sick Bacchus wears a queasy grimace and green skin. He is holding a bunch of grapes, as if ready to keep eating, but resembles a drunk at a party who has had one too many and should really start thinking about going home. Against his own better judgement, he is instead trying to keep up appearances. It is, in many respects, a subversion of Bacchus’s character. Whereas Narcissus may be associated with a kind of self-intoxication, the drunken Bacchus is instead a figure of social abandon. Those who follow him are freed from an otherwise suffocating self-consciousness. However, in Caravaggio’s hands, this Dionysian spirit is less clear cut. Inverting the narcissism of a self-portrait, Bacchus’s detachment from the self is nonetheless depicted as its own kind of sickness. Though carefree and hedonistic, a glutton for pleasure, Bacchus is as grotesque as any of Caravaggio’s other portraits of mythological monsters.
Each of these paintings seems to tell us something about Caravaggio’s sense of his own position in Renaissance society. Rather than exercise the braggadocio of his more well-to-do peers, he painted himself as a vanquished giant or a sickly hedonist, as if his lifestyle wasn’t all it was cracked up to be. Though none of these paintings can be labelled as “self-portraits” in a traditional sense, as allegorical paintings of the self they are arguably even more accurate depictions of Caravaggio’s self-understanding, in that they allude to an inner experience that the rest of the world was not privy to. Those elements that are most like or relevant to Caravaggio himself are hidden, obscured, or just symbolically alluded to. Though they may have been exploratory and reflexive for the man himself, as viewers of his paintings we are only made more aware of our distance from him – the mythologised painter with a bad reputation. It is as if, despite the fact he is wearing a series of masks, Caravaggio’s portrayals of others tell us something more compelling and less vainglorious about the person underneath than Dürer’s self-portraits ever could.
Caravaggio is also said to have painted a now-classic depiction of Narcissus. Some scholars dispute its authorship; it was first attributed to the painter as recently as the twentieth century. The painting was most likely made in the early 1600s, and even if we were certain it was produced by the Renaissance’s chief connoisseur of causes célèbres, it would remain unclear whether he used himself or someone else as a model. Regardless of its true authorship, as an unusual painting of Narcissus it still tells us a great deal about the time in which it was made.
The painting is eerily minimal; a striking example of the tenebroso style. Narcissus is enveloped in shadow and darkness, and we cannot see the world beyond him, only the kneeling figure and his gloomy reflection. Even the riverbank on which he sits seems dead and barren, as if the solitary hunter has become marooned on some terminal beach. If narcissism is an imbalance in the relationship between self and world, Caravaggio’s Narcissus has lost touch with the world altogether. Compositionally, his is a form totally in orbit of itself.
Though Narcissus may be that quintessentially (if extremely) reflexive subject today, for Caravaggio this reflexivity may have had another purpose. As with his paintings of Goliath and Bacchus, with so little of the world around him on display it is the act of looking itself that is the focus of the painting, making it less a comment on the relationship between self and world and more an evocation of that divide between artist and audience.
This may have something to do with how Italian artists and writers understood the myth of Narcissus during the Renaissance. For Leon Battista Alberti, for instance – an influential humanist writing in the fifteenth century – Narcissus was “the inventor of painting”. On the one hand, this may be a reference to the average artist’s self-concern; the thrill of having one’s work admired and loved is, of course, a euphoric and narcissistic high. But there is another interpretation here. It is as if, for Alberti, Narcissus’ fury at his own impotence, his inability to capture and possess the reflection that has so captivated him, a reflection of his own nature no less, is an allegorical retelling of the experience that first drove the human species to paint. “As the painting is in fact the flower of all the arts, thus the whole tale of Narcissus perfectly adapts to the topic itself”, he argues. Narcissus’ metamorphosis is, in this sense, the primal scene of art history. After all, what is it to paint, Alberti asks, “if not to catch with art that surface of the spring?” The emergence of culture is Narcissus’ metamorphosis in reverse. As the flower is transformed into a transcendental object that we cannot know or possess, we attempt to remake it by our own hand.
This reading prefigures the philosophy of Immanuel Kant, perhaps the most famous critic of Cartesianism in the centuries that followed the Renaissance. In his Critique of Pure Reason, first published a century and a half after Discourse on Method, Kant refutes Descartes’ “material idealism”, which he defines as “the theory that declares the existence of objects in space outside us either to be merely doubtful and unprovable, or to be false and impossible.” For Kant, there is a clear relationship between subject and object, and this is true enough of paintings themselves. Objects, Kant observes, affect us. We sense them, and these senses intuitively give rise to understanding. But to understand something intuitively is not the same as being in possession of some rigorous conceptualisation of it. There is an a posteriori understanding that comes directly from experience and not from reason or theoretical deduction. Nevertheless, it is an a priori understanding that we should be striving for – an understanding that arises from scientific reason and analysis, independent of our personal experiences. Kant argues that it is only through experiential understanding that “objects are thought, and from it arise concepts.”
In the present context, Kant’s so-called “transcendental aesthetic” suggests that it was our thinking about the self-portrait as an object that eventually gave rise to the concept of “the self” – and the art-historical timeline certainly supports this reading. Contrary to Descartes’ self-mythologising account, in which the concept of the self was innate to his own mind, and therefore conjured without any influence from the outside world, Kant observes that, whilst we cannot fully know things in themselves – that is, beyond their perception by the human senses – they can nonetheless elicit responses in us that tell us about the world in which we live. Self-portraits, then, as expressions of a posteriori experience, provide the foundation on which to build an a priori account of “the self”.
Intriguingly, this renders Caravaggio’s painting of Narcissus less a depiction of a reflexive subject than a reflexive object in its own right – a painting of the birth of painting. For the art historian Susanna Berger, this makes Caravaggio’s Narcissus a “meta-image”. She suggests that, during the Renaissance, “such self-aware paintings could … thematize the potential fictiveness of visual experience” for the viewer, in the way that their content and structure echo the act of painting itself, or even the act of looking at a painting. “In visualizing acts of observation”, Berger argues, meta-images “turned gallery visitors into representations on display, an effect that would have made the spectators’ identification with Narcissus even closer.” This is to say that paintings like Caravaggio’s Narcissus not only dramatize an artist’s own self-consciousness but raise that same consciousness in the viewer as well. Caravaggio may have been aware of this. Just as he lampooned the habits and values of his patrons on various other occasions, perhaps Narcissus was another knowing nod to our growing obsession with images. Just as the Catholic church betrayed its own narcissism in commissioning grand representations of its own mythology, so too did other patrons of the arts get off on the very act of looking at the objects that they owned.
In his final years, Caravaggio played up to this narcissism explicitly, hoping that, in painting his head on a platter and sending it to someone who could influence his future, he might sate the desires of those who wanted his actual head on a spike. Here the seeds that would eventually bloom into Locke’s liberalism begin to sprout – to “own” a Caravaggio was recognition from the artist that his noble patrons also “owned” the man himself. Whereas Dürer painted his own power, Caravaggio hoped to paint and flatter the power of others, including their power over him, forcing the viewer to reckon with their own cultural influence. John Berger recognised this same tendency in Caravaggio’s oeuvre. If this Roman rebel was so arrested by self-hatred, routinely depicting his own precarity, perhaps that is because he had known the effects of living under this kind of power his whole life. As Berger writes, he was “the first painter of life as experienced by the popolaccio, the people of the back-streets, les sans-culottes, the lumpenproletariat, the lower orders, those of the lower depths, the underworld.” Through first-hand experience, he could avoid “presenting scenes” and instead depict “seeing itself”, as through the eyes of the lower classes. “He does not depict the underworld for others: his vision is one that he shares with it.” But this did not result in a new era of artistic sympathy and representation. As King Oedipus found out the hard way, to “know thyself”, as the Delphic motto demands, is not automatically to like thyself.
The rise of the self-portrait was fraught, in this regard. Though we have repeatedly emphasised the liberal worldview – the self-portrait as an artistic depiction of our experience of ourselves as individuals, a form of painterly encouragement to “know thyself” – our place in a wider social order is never far away. Though they lived a century apart, with very different styles and concerns, Caravaggio and Dürer were both subjects newly aware of their own power and the power of others, and of how that power could be wielded, from within and without.
Discourse on Method is the most common abbreviation of the full and unwieldy title of Descartes’ work, which is Discourse on the Method of Rightly Conducting One’s Reason and of Seeking Truth in the Sciences.
René Descartes, Discourse on Method and Meditations on First Philosophy, trans. Donald A. Cress. Indianapolis and Cambridge: Hackett Publishing Company, 1993, 18.
At the time of writing, the state of Israel’s continued occupation of Palestine and its ethnic cleansing of Palestinian neighbourhoods in East Jerusalem have recently provoked a short but disastrous conflict, in which a dozen Israelis and over 200 Palestinians died. The state of Israel is a perfect example of how liberal politics can lead to atrocities in the twenty-first century. For Zionists, the Israeli occupation of land is explicitly tied to theological and ontological ideals. To be Jewish, they suggest, is to have a home in Israel. As a result, to challenge Israel’s “right to exist” as a nation-state is, for many, to challenge the right of Jews to exist as a people. National sovereignty is equated with individual sovereignty; politics and ontology are fatally entwined; ideology is hidden under a flawed understanding of the very basis of human consciousness and reason. Many critics of Zionism argue that this is a false equivalence – not a truth but a liberal ideal – and that it is very possible to be Jewish without violently claiming ownership of contested land and property. It is telling that it took until 2021 for this view to go mainstream.
Following the publication of his Ninety-Five Theses in 1517, Martin Luther successfully orchestrated a split from the Roman Catholic Church, which he criticised for its political overreach and abuses of power. This included the church’s claims to absolve the sins of wealthy donors – a kind of “cash for absolution” deal that Luther considered fundamentally corrupt and sacrilegious. Luther’s act of “protest” gave its name to the form of Christianity that developed in his wake: Protestantism. It also reasserted the sanctity of the individual in matters of faith. Arguing that contrition before Christ could not be bought and adjudicated by an institution, but must instead come from within, Luther asserted that everyone is responsible for their own repentance on an individual basis. (We might expect this move to have been attractive to Dürer, and research suggests he was politically sympathetic to Luther’s ideas, even wishing to draw him at one point, but it seems that he remained loyal to the Catholic church regardless.) Though useful when dealing with issues of corruption, this central Protestant sentiment was also diluted and spread amongst the lower classes, providing the foundation for capitalist voluntarism and allowing institutions of all kinds, such as employers, to relinquish responsibility for their workers.
Caravaggio’s true cause of death has never been confirmed. Some believe he was assassinated by relatives of Tommasoni, or by one of the Knights of Malta; this was certainly the dominant rumour at the time. But the painter did not die right away, and so it is also thought that Caravaggio succumbed to sepsis after a wound sustained in a brawl became infected. Others believe his death was due to some other disease, such as malaria or brucellosis. Recent archaeological investigations, following the examination of human remains believed to be Caravaggio’s, suggest that both his death and his erratic behaviour in life could be explained by lead poisoning, caused by the lead salts commonly used in paint at the time. However Caravaggio met his end, he died on the run, never having been forgiven for his crimes.
 Leon Battista Alberti, On Painting, trans. Rocco Sinisgalli. Cambridge: Cambridge University Press, 2011, 46.
 Immanuel Kant, Critique of Pure Reason, trans. Werner S. Pluhar. Indianapolis and Cambridge: Hackett Publishing Company, 1996, 288-289.