The Rotten Western (Part 2)

Spoilers for The Last of Us Part 2 from the very start. You have been warned.

← Part One

After the shock of Joel’s horrific death subsides, Ellie and Dina plan their trip to Seattle, where they hope to avenge Joel by hunting down the members of the Washington Liberation Front who are responsible for his demise. What Joel did to deserve such a death is, for the moment, unclear. “Joel pissed off a lot of people,” Ellie admits.

Before heading out, they visit Joel’s house to take one final look at the life they knew — a briefly sheltered life; a brief life with Joel. Inside, what we find is not so much a house as a museum piece. It is unclear how long Joel has been gone — days; maybe a week or two? — but already his home feels like a living memorial. However, this home is very different to the homes we’ve so far seen Joel inhabit… For starters, the Old West nostalgia in Joel’s Jackson house is surprising. Whilst, at first glance, it seems to suit an idealised version of the man we’ve come to know, as I lingered amongst its decorations and detritus I also found it jarring against reality.

It was a moment that took me back to the start of the first game. As Ellie staggers around Joel’s now-vacant house, grief-stricken, I wanted to replay the first game’s prologue, in which Joel’s daughter Sarah staggers around their home half-asleep looking for her father.

Sarah and Joel’s house is recognisably modern. It’s messy too; the banal neglect — no doubt the product of an entwined teenage laziness and single-parent fatigue — is pervasive. It is also strangely haunted by a violence to come, in which we can already predict the surreality of a house in ruins, its present lived-in state foreshadowing an inevitable, soon-to-be looted state-to-come. But the house is lived in, at least. Joel’s house in Jackson feels like it has been laid out all too neatly, like it will be the future home of a Joel waxwork. It is sterile, and haunted by an unpredictable past rather than an all too predictable future.

We could argue that, post-outbreak, the entire world is haunted by the past in this way, but the eeriness of much of The Last of Us’s environments comes from the fact that these pasts are forgotten. As recognisable as the suburbs and cityscapes are to us as players, we become accustomed to seeing them as ancient ruins — that is, we see them through the eyes of the game’s protagonists. The difference between the two is, perhaps, one of grief. Whilst we might grieve the sight of a burned down house in our present, as the sight of it invades our capacity for empathy uninvited, we do not grieve the remnants of ancient civilisations.

The tension between the past, present and future in this regard has been the defining enviro-temporal tension of the Gothic for centuries, but this only makes the design of Joel’s house more surreal. It slips somewhere between the two — between the Gothic and the grief-stricken. Its preservation jars with a narrative wherein life so often ends without legacy.

Most interesting to me, in this regard, are the paintings on the walls of Joel’s two houses. In the first game, Sarah’s room is peppered with posters for bands and films, for instance. As you head into the corridor and, eventually, to her father’s bedroom — it’s the middle of the night and he is, conspicuously, not at home — you see that the walls are decorated with various family photographs and natural vistas.

Much has been said about the snowy landscape “easter eggs” above Joel’s bed and set as his phone’s background, both of which foreshadow the environment later in the game where you first get to play as Ellie. But beyond this, it is intriguing to see the majesty of nature devoid of any presence of the human.

On another wall in Joel’s room, for instance, there is a painting of horses running free. It is that stereotypical image of American natural beauty but it also foreshadows the stampede of infected and uninfected that the player is about to be caught up in. Elsewhere, there are pictures of ducks about to take flight, similarly evoking a natural tranquillity whilst also being a sight you might expect to see on the end of a gun. Humans are nonetheless absent in all instances.

In this sense, the decorations are more reminiscent of a dentist’s waiting room or my grandma’s house than of a modern family home. It inadvertently emphasises some of the critiques of the first game — the player is left feeling more like an observer than an actual participant in the world around them — but, in The Last of Us Part 2, this changes; there are many figures in the landscapes that adorn Joel’s walls, as if the decoration now reflects the forced changes in play style. Actions have consequences. This is no longer (just) about an indifferent nature in-itself. This is a game with a Promethean edge, imploring the player to interrupt the world, even when the odds are not in their favour.

In the game’s next act, this point is made clear almost immediately. Whilst this is true within the context of the game’s new mechanics most explicitly, it is also evidenced by Ellie and Dina’s interactions with their environment. Take, for instance, the musical encounter that has already proved to be iconic in representing the game’s intensified emphasis on player agency and character development.

As Ellie and Dina trawl through downtown Seattle, they chance upon a music shop. Vinyl records fill the bins ready to be flicked through but, perhaps to our surprise, they are not some bygone novelty for the pair; in Jackson, it is shown that they have the capacity to listen to music from the old world and they also watch old DVDs. Instead, confronted with this snapshot of an old way of life, Ellie wonders if there are people out there in the world somewhere who are making new movies. She writes new songs, she says, as well as listening to old ones, so surely there are people out there lucky enough to have the resources and know-how to make new movies too.

Though it may seem like a somewhat naive question, Ellie’s reasons for asking it are quite convincing. In a world so disconnected from itself, you can never account for how good or how bad other parts of the world might have it, and you also can’t account for what kind of cultural artefacts might remain a part of their social fabric. This is to say that, in its abject primitivism, the Fermi paradox is made wholly terrestrial.

As I play through the game, I find myself thinking about this a lot. Joel’s nostalgic nature isn’t something I want to criticise. In fact, it is all too relatable. In his role as father figure, he wants to inspire Ellie with his knowledge and expertise, showing her things about the old world that she can take with her into the new. Whilst Ellie’s excitement and curiosity in this regard is endlessly endearing, Joel’s own melancholy never quite fades into the background. And it is an understandable melancholy too. If I was able to watch old films or listen to old records depicting a world catastrophically destroyed by a zombifying pathogen, I think the cognitive dissonance would soon start to take its toll. For the younger characters in The Last of Us Part 2, however, this disconnect is taken to be a given. They don’t focus much on what has been lost but always push forwards, considering what they can do next. They seem inspired by the old world but only because it shows them the kind of cultural production possible in the new one they hope to build.

Joel is less focused on the future. Whilst this might seem like a cynical appraisal of his character, one look around his house makes it quite clear that, if Joel Miller had a film camera in post-apocalyptia, he’d be making Westerns. Whereas Ellie’s inner songbook contains the works of A-Ha and Pearl Jam, Joel’s starts to feel like a world of reactionary American primitivism — what Leslie Fiedler once termed a “higher masculine sentimentality” — where a rugged music like the blues might suddenly make an ahistoric comeback. After all, there are cowboys everywhere. Joel has even taken up carving them ornately into wood. But this romantic figure of man and horse — seemingly representative of a fraught if nonetheless very human relationship with nature — is far more reminiscent of the life Joel has acquired for himself after the apocalypse than of anything that came before it.

In many ways, this is precisely the function of the Western in popular culture — a way of laundering the present through the romanticism of the past. As Sam Peckinpah, director of The Wild Bunch (among other Westerns), once said: “The Western is the universal frame within which it’s possible to comment on today.” However, in a game like The Last of Us Part 2, this sort of process is most commonly inverted — we launder the present through the horror of the future. As such, it is strange to see the Western’s original polarity contained within the game in miniature; it renders it strangely cyclonic, with overlapping feedback loops, giving rise to a kind of temporal horseshoe of cowboy metaphysics that immediately renders time out of joint.

This strange templexity is only made more apparent by the abundant references and archetypes taken directly from many a classic Western. For example, walking around Joel’s house after his death, I found myself thinking about his previous adventures and general misanthropy — at least in the first game. As I try to picture him as some archetypal cowboy, he starts to resemble Uncle Ethan in John Ford’s The Searchers — the coldhearted horse-riding rifleman.

The Joel we met in the first game — before Ellie eventually thawed him out — was similarly violent and cold, traversing the plains of former downtown financial districts, overshadowed by wrecked skyscrapers not unlike the geological towers of Monument Valley. However, this hardly seems like an existence Joel would want to romanticise after the fact, in the way he has done in Jackson.

But even in a film as revered as The Searchers, the cowboy’s life is deeply disturbing. Ethan the anti-hero, played by John Wayne, isn’t just cold; he’s a horrible and vindictive racist — surely even by the standards of 1956 (and this is apparent from the opening scene). The horror that often greets his actions, painted on the faces of his dysfunctional and god-fearing posse, is tellingly triggered most often by the strange disregard Ethan has for the living and the dead. He mutilates corpses out of spite, for instance; he also has no sympathy for the Indians, allowing them no respite so that they might deal with their dead and wounded after a shootout. This disturbs his fellow travellers even more than the racialised threat of the Red Man. (These attitudes are less scandalous when expressed following a zombie apocalypse, when the Indians are substituted by undead hordes, but we might note that this only normalises Joel’s familiar contempt for the dead.)

Despite all of this, The Searchers, in the popular imagination at least, continues to be upheld as this classic and deeply romanticised representation of the Old West. It is as if the sheer majesty of its location quite literally overshadows the deeds depicted on screen.

Joel seems to romanticise his own life in much the same way. The majesty of the classic Western becomes a way for him to look beyond the violence of his life and revel in nature. It is an understandable compartmentalisation, considering the plant-horror of the cordyceptic pathogen, but still, the extent to which his house starts to feel like a Searchers shrine, with its paintings of gun-toting cowboys in Monument Valley, seems oddly out of place.

Why does Joel retain such a firm grasp on the Old West? Is this just Joel romanticising his own trauma in order to better deal with it? Is this him compartmentalising a life he never knew in the form of old genre tropes many of those younger than him may have never seen? Is a fall back into the Texan stereotype really all it takes to scrub the horror of his life away?

Perhaps this mournful dissonance is inescapable for Joel. After all, he seems to recognise, implicitly, that he lives in a new Rotten West, but the only way he can find hope for himself is by going backwards. Ellie and Dina, retaining a very different (post-)cultural foundation, find the West taking on a very different form — theirs is a postmodern Western, no doubt, but it is far more hauntological in that sense; that is, it is a kind of “good PoMo”, as Alex Williams once put it, compared to Joel’s “bad” form of reactionary pastiche.

I think this is because, whilst Joel has a world to mourn, it is a world that decisively dies with him. Most of what Ellie and Dina know of life is violent political factionalism and the equally violent indifference of nature. Whilst this might resemble the Wild West absolutely, they don’t seem to know that. It’s not an echo of the past for them; just the present that they know. As such, they’re still mournful, but their alienation seems to come from the fact that they don’t actually know what it is they’re supposed to be mourning. They live a hauntological existence precisely because they are mourning their own stuckness.

I’d argue that this position echoes my own (revitalised) version of hauntology quite acutely, but Alex Williams’ old critique is still worth bearing in mind. For Williams, hauntology is always representative of “a cowardly move, lusting after utopias that never were, or which are now unreachable, a retreat into childhood/youth, just as trapped in the endless re-iterative mechanistics of the postmodern as the lowest form of retroism, merely in a hyper-self-aware form.” Because of this, hauntology “cedes too much ground to what it attempts to oppose, because of an a priori assumption: that there is nothing else (at this moment in time at least), that nothing else is possible, and as such we [must] make the best of this (and that the best we can do is to hint at the possible which remains forever out of reach — with all the pseudo-messianic dimensions this involves).”

What we see in Joel’s house is precisely a “making the best of it”. The scenes represented on his walls are representations of the life he already lives, but exorcised of all horror and instead jettisoned to a few hundred years in the past. This temporal displacement is precisely an aesthetic instantiation of the a priori Williams is talking about. There is nothing else at this moment in time at least; ergo, all that is really possible is to return to a past moment, and a past moment that Joel himself has not experienced. It is a theoretical past rather than an observed one; the very definition of the Western as an ideological a priori.

So, what of the girls? Williams’ nod to Badiou in his conclusion is a factor I think most people interested in hauntology and accelerationism have forgotten. For Williams, Badiou’s “analysis of the emergence of the new” — recently discussed — “would entail a more strategic examination of precisely where the pop-musical evental sites and historical situations exist within our current time: those regions which appear, from the in-situational point of view, to be marginal, and properly undecideable.”

This is perhaps where Ellie and Dina lie. Whereas Joel, no matter how loveable, inhabits the reactionary misanthropy of a classic Western like The Searchers, Ellie and Dina personify a more revolutionary kind of homesteader, given the fact that they do not see themselves as some sort of iteration of the past. They respond with vengeance, but only because they are determined to pass through their new world of grief and transform it into a world where the same thing cannot happen again.

It is an intriguing form of the categorical imperative. They act upon the world in such a way as to punish those who live amongst them and think they can act with impunity. But they do so without much consideration for the now-normalised zombie apocalypse. This is, in itself, an intriguing gulf also present in many a genre film. The characters in any Western exist on a knife edge, where the indifference of the desert and the indifference of their fellow human beings produce quite distinct (but also oddly entangled) responses. In the Rotten Western, this already fine line becomes impossibly blurred. Nature and society are no longer false dialectical opposites, as they have been since the Enlightenment — or, perhaps, it is precisely that, but the falseness of this relation now takes precedence, transforming nature/society into a kind of corpse bride, with each mirroring the other and with each causing the other to rot.

It is a gross (but also nihilistic and realist) bastardisation of the relationship that dominates Joel’s house. Whereas he sees the best in this entanglement, represented by the image of a cowboy and his bucking bronco, in a cyclonic relationship that surfs the tension between natural rebellion and societal respect, the flatline construct of body alive and body dead is perhaps a far more honest appraisal of their new reality.

The figure of the survivor on horseback is an apparition; the reality is two humans, survivor and undead, in a never-ending tussle.

The Rotten Western (Part 1)

Below are some preliminary thoughts on The Last of Us Part 2 that I’d like to add to as I keep going with my current first play-through of what is already an incredible game. It should go without saying that this post comes with a big spoiler warning: come back later if you haven’t played it yet.

This post is also part of an ongoing project I’ve mentioned a few times in recent years and which I’m (still) very slowly building behind the scenes: a book I’m calling Frontier Psychiatry. More on that soon.

Every era of modernity has had its own Western. The genre is a cultural weathervane for the United States (in particular but not exclusively) to reflect on, as well as assigning it a trajectory. By morphing and responding to each new phase of the USA’s history, the Western – although modelled on an ideological (and, therefore, also idealised) form of the past – suggests a state of mind in the present and what it sees in its own future.

The Sheriff, in this sense, is a great American imago. In many a classic Western, it is the sheriff or lawmaker who fights off the Red Man, the mad dogs, the robbers and rapists. And yet, he is also often an anti-hero – embittered, traumatized, perhaps a drunk. Indeed, as the genre has developed, along with America’s sense of itself, so too have the archetypes at its heart – and these developments have not always been positive. For instance, the frequently explored subgenre of the Acid Western paints a picture of the Wild West that acutely reflects the anxieties of the 1960s and 1970s. Most importantly, despite the horror of the environment, it is a subgenre that imagines the West as a mythical land that still retains a psychedelic function – that is, it retains its imaginative function as a land on which new (non-capitalist) worlds could manifest.

It is becoming ever clearer that our stories of a post-apocalyptic zombie-infested United States describe a new West for today – a putrescent West, rotting from within. The TV adaptation of The Walking Dead epitomised this new kind of Rotten Western with a distinct lack of subtlety. The show’s sheriff protagonist, Rick Grimes, defined the show as a piece of transitional media in this regard. It walks a midway point between states of mind: between a nostalgia for the frontier and a fear of it, with the zombie hordes functioning a little too well as a racialised native other, at home in death.

Whilst this was an interesting tension in 2010, a decade later it is clear that the show exists in a very different world, in which the show’s internal drive to make a post-apocalyptic America great again takes on a far less melancholic momentum. With this in mind, the (apparent) death of Rick Grimes – the downfall of the great white imago – was long overdue and overwrought. By the time it happened, the show’s audience had begged so long for something new that the change went unnoticed by those who had stopped watching many seasons ago, but it was also unsurprising. For a long time, it had been necessary for the show to put its money where its mouth was.

No character can be afforded plot armour – that was The Walking Dead’s central traumatic assurance to its audience. This often led to grief being used as a plot device, often profoundly, but this rule seemingly began to test the writers’ own resolve as their audience staggered onwards in a brutalised daze. If the show was to stay true to its word, it had to refresh itself frequently. In a way, it was like the show’s narrative could do what much of its cast could not – shedding its skin, healing, becoming-new rather than becoming-rot. For many, it failed in that regard, and Rick Grimes’ lengthy rule as the only sheriff in town was the show’s Achilles’ heel. The sheriff was long past his best when he finally got the axe, both within the narrative of the show and within culture at large.

What has struck me most, in my playthrough (so far) of The Last of Us Part II, is that this franchise seems confident that it will not make the same mistake as its televisual cousin. Not only have characters been refreshed – I found that Ellie’s big nose, no doubt affixed to her face to settle that falling-out with Ellen Page, took some getting used to – but, most controversially, the central character of the first game, Joel Miller, is brutally murdered at the end of the first act. There has been a lot of consternation online about this, and a lot of outright anger, but all I see in these responses is grief, of the sort that any viewer of The Walking Dead should be used to. In a zombie apocalypse, there is no plot armour. Joel, in the first game, demonstrated this in reverse. It was his daughter who died at the very start of that game’s first act, but in the final act Joel saves Ellie from a similar fate – murder, essentially, at the hands of the “state” (loosely defined as a pervasive militarised body) or, perhaps, for the sake of an apparent greater good. (A contentious connection to make between the two characters and one I don’t want to unpack here for the sake of brevity.)

The second game takes this brutality to a whole new level. Indeed, violence is one of the game’s primary USPs. This is a really fucking brutal game. And yet, the fact that the emotional impact of the game matches up to its gory spectacle is commendable. There are enough games out there that are all gore and no heart.

This sort of brutality is one of the defining characteristics of the Rotten Western – and, indeed, the Western more generally. In fact, what we are seeing with The Last of Us as a franchise is that it seems to be building towards some sort of trilogy, like the Spaghetti Westerns – those “operas of violence” – of the Seventies.

In the first game, you have an archetypal story of deliverance, specifically for Joel. It was the big Texan’s reluctant task to (quite literally) deliver an immune Ellie to a militia group, the Fireflies, so that they can develop a cure. But underneath it all, Joel also has to set himself free from the trauma of his daughter’s death at the start of the outbreak which has, at first, made him brutally cold to the world around him. It is Ellie who eventually thaws him out. [1]

In The Last of Us Part II, the tables have turned. The wintery tundra in which the first act of the game is spent tells us one thing only: Joel and Ellie’s hearts may have warmed, but the world is still cold to them – and to us. A fire still burns, however, and it reignites deliverance, turning it into vengeance. [2]

I think it is important that this act of revenge comes following the violent destruction of Joel as the sheriff-imago. In fact, it couldn’t realistically be anyone else. The Walking Dead’s over-reliance on traumatised women and the horrific demise of the Asian-American Glenn, though still traumatic, felt like familiar instances of American dispensability for too many. It is a superficial twist on “the black guy always dies first”, swapped out for “the minority always dies worst”. This is to say that, in The Walking Dead, more abstractly but no less predictably, the less archetypal characters always had less plot armour than the likes of Rick Grimes.

Many have complained that the priorities of The Last of Us Part 2 betray a violent wokeness, through which the teenage lesbian outlives the patriarch, but it seems to me like this is the world that The Walking Dead didn’t have the nerve to inaugurate until its audience was past the point of caring: a world in which the unseen and more nomadic subjectivities embedded within American life fare better than those we are more accustomed to cheer on.

Think again of the Chief in One Flew Over the Cuckoo’s Nest. We have long wrestled with the fact that there is a future that may not be for-us. We might think of that as a world without the human race, or we might think of it as a world without the hegemonic subject of capitalism.

This is the first lesson taught by the Rotten Western.

[1] Westerns often play on deliverance like this, particularly in their video game variety. Fallout: New Vegas anyone?

[2] In fact, this is one of my favourite things about the haven of Jackson – the little frontier town out in the mountains of Wyoming where Ellie, Joel and co. have been holed up since the events of the first game. Whenever it is mentioned, I can’t help but think of June Carter and Johnny Cash singing about how they got married in a fever. Joel and Ellie may not be “married”, but the threat of the characteristic body burn-out of infection certainly cemented their bond.

The Games Industry: Accelerationism and the Hauntological in Microcosm

I’m currently doing a load of research into accelerationism — when am I not — for a new thing. I’ve been digging far back into the blogosphere to try and accurately trace its development from its 2007 beginnings to the present, but without all the distracting retconning of various philosophers who have at one time or other expressed an accelerationist opinion. (I found a very early Benjamin Noys post where he offers a few examples of accelerationist positions and one was a quote from Roland Barthes so I’m left feeling like just about anyone could be a Noysian accelerationist at this point.)

What I’m currently intrigued by is how the accelerationist split first emerged. (Alex Williams’ (at least I think it’s his) old blog is proving to be fascinating reading right now — straight-up red-hot Landianism over there — no surprises he’s since deleted most of it.) In fact, its split is arguably its founding gesture — an appropriate Big Bang moment for the first blogosphere when the first atom split and birthed a whole network of weird social media enclaves that just keep splitting.

Most people should know by now that “Accelerationism” as a term related to political philosophy was coined by Noys but it was arguably Mark Fisher and Alex Williams who made it what it is. (And, credit where due, Steven Shaviro’s blog was arguably the blog where the initial discussion started.) I’ve mentioned this a few times on here and on Twitter but the initial developments came from Noys writing his 2010 book The Persistence of the Negative, in which he critiques Continental philosophy’s obsession with affirming a certain kind of negativity. Fisher, in deftly trollish fashion, then affirmed Noys’ negative critique. In hindsight, this may have been a mistake on Fisher’s part but, for better or for worse, the name stuck and everyone has been confusing Noys’ and Fisher’s versions ever since.

It seems to me — although I’m still untangling this — that Fisher did this to demonstrate that Noys’ position as being somehow above this entanglement of negations and affirmations was a fallacy. In late capitalist society, we affirm negations and negate affirmations every day. The problem is that this process is far from the vaguely similar process first described in Marx’s dialectical materialism. This is to say that, in the 21st century, the dialectic of capitalism’s positives and negatives has become wholly impotent. This was the discussion within the blogosphere. It was not simply about how all the Conties affirm the negative but about how the negative itself was and remains in crisis.

So why not just be positive? Fisher’s argument was that that is what capitalism wants. It wants positivity all day every day. In this sense, the negative takes on a new potency but it has lost its effective charge. The question was, how can the negative produce the new? Accelerationism, in Noys’ hands, as that byword for everything “bad” about capitalism was the perfect sandbox to try this out in. Can we affirm the negatives of capitalism to produce the new?

It wasn’t as simple as that though, because nothing ever is. Accelerationism was also picked up by the blogosphere because it had obvious implications for the various and already well-established discussions around hauntology.

The relationship between the two is quite interesting, I think, and it is also far more nuanced than the usual assumption of accelerationism is fast and hauntology is slow. As Fisher noted in one post, this is not a philosophy of mind-numbing tautologies where what is negative is bad because it is negative and what is positive is good because it is positive. In fact, what seems to really galvanise discussions around accelerationism is that it is seen as the positive cultural charge to hauntology’s negative charge. Taken together, each with their own internal positives and negatives, they describe a strange tension within the 21st century.

The full argument I have about this might get hashed out somewhere else in more detail but I thought of an illustrative example of this relation that is culturally still prevalent (if not more prevalent) over a decade later but which doesn’t fit into what I’m working on: the games industry.

Accelerationism, as hauntology’s hyperactive cousin, was seen by Fisher and others as an analysis of the ever-increasing speed of technological progression under capitalism and how this was affecting human cultural production and the production of subjectivity. These issues are all still pertinent today. In fact, they can arguably be seen most readily in the microcosm of the games industry.

There, technological hardware is being improved at an astounding rate, with new devices, consoles and ways to play appearing with an increased frequency, and yet it is also an industry currently infatuated with remakes of classic games.

Why is this?

In some ways, the reason is practical. The technological innovations far outpace cultural development so that those foundational cultural experiences become lost as the hardware improves. Because we have memories longer than the rapid cycle of a “console generation”, we don’t just desire the new all the time. Sometimes we want the comfort of something we know. So what do you do if you want to play your old games?

There are some obvious answers. People might still own their old consoles, for example, but playing them on modern TVs can be a nightmare. (I, for instance, still lug my N64 with me wherever I go but it is increasingly temperamental.) Do I need to keep time capsules of all my old home entertainment technology if I want to enjoy something? This level of fetishism is commonplace, with people preserving old setups like vinyl nerds, but it’s hardly practical. There are other workarounds and emulators, of course, but the industry itself seems like it is only just coming to appreciate its tandem responsibilities — not only pushing out new products to feed the desire for the new and improved but also its responsibility to archive and retain access to past experiences that are in danger of being left behind and lost to the casual player who doesn’t sideline as an amateur games historian.

The main reason why this is an important consideration is that it is arguably one not shared by any other medium. Although films do get remade with a depressing frequency, a film doesn’t need to be entirely remade to be easily enjoyed in the way that a game does. For games, it is a question of accessibility as much as aesthetics. This is to say that a remake is not always just a money grab but a way to celebrate the existence of something technologically maligned and also to remind ageing gamers of the foundational gaming experiences that they might want to enjoy for a lot longer than the rapidity of technological development may allow. Still, speed is a factor here. We’re not talking about experiences from decades ago. One decade might be all it takes for the remake treatment to become feasible. This timescale might shrink in future if nothing changes.

Here’s the problem of capitalist speed and cultural drag in a nutshell. The quick fix of just remaking old titles and making them shiny again is one way to do it but it doesn’t always solve the practical problem.

There is a further side effect from this, however. I wonder, considering how precarious gaming culture is, with technological progression and cultural instability leading to what we have at present — a frenzied stasis — isn’t it also this precarity that has led to a largely reactionary culture within the gaming community? One that salivates over superficial progression (graphics!) whilst hating real change? Is this not the very same issue that we see everywhere in society, albeit on a micro scale? That is to say, isn’t it precisely this capitalist acceleration, independent of human culture, which only causes it to drag, that leads not to a frustrated capitalism but to an increasingly reactionary subjectivity? Isn’t the fact that gamers are often such sensitive small-c conservatives a result of a sort of cultural-subcultural negative feedback loop? Stasis becomes a demand left oddly unfulfilled because capitalism cannot help but speed ahead of the lifespan of our desires.

“Well done, Xeno”, I hear you say. “You’ve demonstrated an obvious point about late capitalism using a really annoying example.” But part of me also feels like, if gamers could see themselves as the microcosm of neoliberalism that they are, maybe they’d be less sensitive about incompetence in their industry and more sensitive about how that incompetence mirrors the wider world around them.

Biden is Bethesda, you guys. Will you think a bit more about politics now?

We Have the Unconscious We Deserve: Notes on Resident Evil and the 21st Century’s Machinic Unconscious

It’s funny thinking back to how we used to play video games as kids. When I first started playing games, progression wasn’t really the point. Games — all games, irrespective of their design or style — were what you made of them.

Before the gaming market became overrun by the open-world “sandbox” genre, that’s precisely how I’d play even the most linear of titles: I’d complete a level and clear out all the bad guys, then I’d just hang around for a bit, role-playing, running about and getting to know the level’s layout, exploring every nook and cranny, and making up my own additional narratives whilst doing so. (I’d be curious to know if the often “mature” sandbox genre was not directly inspired by underage players like this, playing games in ways that undermined developer intentions.)

I remember doing this very explicitly with the Spyro the Dragon series. That was the main thing I loved about those games. When playing the recent remakes, I was struck by how small most of the levels were compared to my memory. I didn’t have the patience to play it like I remembered, spending hours in a single level just being Spyro and pretending I had extra quests or things to do, like he was an action figure to whom I granted an infinitely unfolding internal monologue as I threw him about for hours and hours in the mud.

I only really thought about this difference in playing styles when watching a friend’s child play Mario (and a few other things) recently. It was interesting to see this same approach but from an adult perspective. He was naturally adept at playing the game and using the controls but he didn’t necessarily understand how to read the game’s environmental prompts for progression. Instead, he treated it like a virtual toy box, developing an object-relation with the character on screen and playing out his own storylines as he saw fit, like an illiterate kid “reading” a picture book, making up their own narrative based on the pictures before them, wholly ignoring the written guide and having no sense of the ways in which they’re usurping the object’s intended use.

Believe it or not, the mansion in the original Resident Evil was another example of this kind of sandbox for me. So was the Raccoon City portrayed in the series’ second and third outings.

It’s weird to think back to these games now in this context — to think that I was playing them at an age when I was young enough still to be toyboxing them — but my parents really did not seem to understand age restrictions. Thankfully, I was also aware of my own limits too. I loved Resident Evil but I left Silent Hill well alone until I was a bit older.

Just like in Spyro, these enclosed and claustrophobic environments felt really expansive within my imagination, and this was only exacerbated by the pervasive sense of fear they provoked. These games were so terrifying that I spent hours trying to buck up the courage to make the slightest bit of progress. The puzzles were also often way out of my league. Somehow, as a kid, I had the patience to play the game around them.

This is probably why, when my Dad took me to see the Resident Evil film adaptation in the cinema the year it came out, I had no idea what was going on. Where the fuck did all this technology come from? Why was this Gothic adventure, set explicitly in the 20th century, somehow more 2001: A Space Odyssey than Night of the Living Dead?

As familiar as I was with the backstory of the Umbrella Corporation’s genetic engineering and supersoldier creation — I loved to draw my favourite “character”: the Nemesis — I just didn’t care about any of that when playing the games how I wanted to play them. I really just liked the mansion and the overrun metropolis. Those were two of my favourite gaming environments ever.

When the HD remake came out on the GameCube — which I recently played again, in its further remastered edition, on the PS4 — I remember playing it a lot differently. After all, I was older; I was a teenager who better understood what he was in for when he loaded up that weird little mini-disc.

I felt like I knew that mansion like the back of my hand — at least its initial sections — and I remember feeling weirdly disappointed when I got to the point of going underground and entering the Umbrella Corporation’s labs. The same was true last year when I escaped the police station and made it underground in the brilliant remake of Resident Evil 2. (As indelibly as the police station was marked on my consciousness, I never made my way past it in the original PS1 version of the game.)

I remember finding the anachronism so jarring. I remember suddenly being aware that in most narratives like this, the opposite trajectory usually unfolds: you start in the futuristic hi-tech lab and then go down to uncover some ancient conspiracy. This was particularly true when your progress took you underground — doesn’t down mean backwards? The subterranean connoting the past?

This was also the moment of hubris found within just about every action/adventure or horror film I loved growing up: The Thing‘s primordial alien, lying in wait; Indiana Jones’ combination of Nazis and ancient relics; the Tomb Raider series of films and games, which also had various storylines in which ancient powers were naively harnessed through modern technologies. There was a similar lesson within each version of this story: the future is not the master of the past; the planetary unconscious is eternal and it will bite you if you try to stick a lead on it. But reversing the polarity of this Kurtzian expedition does strange things to that narrative. It doesn’t reverse the lesson; it just convolutes it… The linearity of travelling from present to past does not work in the same way when travelling from past to future. In hopping over the all-important present, the machine jams.

Nevertheless, there are obviously a few great examples of this anachronism put to good use (and it is worth emphasising that these are very much recent affairs): Cabin in the Woods might be the most perfect example for this context; Westworld is another. But Resident Evil still sticks in my craw as a jarring instance that doesn’t work so smoothly.

These games have a very particular way of dealing with their anachronism — a subtlety that any and all film adaptations have wholly missed (the Tomb Raider film adaptations have also dealt with this techno-relic combination pretty poorly, it must be said). The films lose the video game’s sense of downwards progression.

I think the absence of puzzles in all film adaptations actually has a lot to do with this. Puzzles in survival horror games aren’t just quaint novelties but function as a vector for this templexity — the templexity of Gothic sliding bookshelf puzzles being made functional by technological cunning.

What does it mean that these haunted house puzzles, that would typically be the hobby of some eccentric eighteenth-century polymath in more familiar media, are instead part of a megacorp security system? It is a small instance where this time slippage makes sense. Puzzles are timeless; keys are universal, but they allow for a seed to be inserted where the polarity of your usual haunted house narrative is inverted.

Maybe this is purely cultural… When I first started thinking about this kind of survival horror anachronism, I thought: is it just a Japanese thing? Or maybe it’s just a Japanese-view-of-America thing? But then I considered the fact that the shoddy anachronisms of their uber American film adaptations are exacerbated primarily because of a shift in medium.

This kind of anachronistic cybergothicism makes sense in a video game, precisely because the medium progresses along with the latest advances in computer technologies. For many, advances in film CGI will never not be an intrusion — nothing will ever look as good as 2001‘s hand-made models or The Thing‘s bubblegum gore. The strength of film as a material for horror is the way in which it expresses materiality. (As a sidenote: of course it was David Lynch who would first make digital cameras work in the context of cinema by affirming their uncanniness in INLAND EMPIRE.)

So, given that video games are inherently machinic — a coded medium — perhaps it makes perfect sense that their horror matches the immateriality of the format itself: if you dig down beneath the surface aesthetics of a familiar Gothic, you’ll find circuitboard hardware and coded software.

But this isn’t Blade Runner, in which robotics becomes a screen — the machinic unconscious of video games is all too immanent. To dig below the haunted house you know into the megacorp you don’t is to reach into the corporation in your head. It is to tinker with the unconscious of now.

It was hard not to think about all this whilst playing through Capcom’s streamlined but lacklustre Resident Evil 3 remake under quarantine last week. What’s more, I was reminded of Felix Guattari’s introduction to The Machinic Unconscious:

We have the unconscious we deserve! […] I would see the unconscious … as something that we drag around with ourselves both in our gestures and daily objects, as well as on TV, that is part of the zeitgeist, and even, and perhaps especially, in our day-to-day problems. … Thus, the unconscious works inside individuals in their manner of perceiving the world and living their body, territory, and sex, as well as inside the couple, the family, school, neighbourhood, factories, stadiums, and universities… In other words, not simply an unconscious of the specialists of the unconscious, not simply an unconscious crystallized in the past, congealed in an institutional discourse, but, on the contrary, an unconscious turned towards the future whose screen would be none other than the possible itself […] Then why stick this label of “machinic unconscious” onto it? Simply to stress that it is populated not only with images and words, but also with all kinds of machinisms that lead it to produce and reproduce these images and words.

There is an intriguingly philosophical reason why all the Resident Evil games after RE4 and before RE7 were shit. RE4’s European adventure had a novelty to it, dipping into the viral cultural-unconscious of European (that is, proto-American) ancestry — a limited view of history, no doubt, but a culturally effective one.

However, as soon as the series went to Africa, it stopped exploring that which was under-acknowledged and instead stumbled over a century’s worth of post-colonial tropes of new savagery — ebola zombies in a land left ravaged by America that only America could fix. In this sense, these games dealt all too firmly with America’s conscience rather than its unconscious. It was clumsy and ill-fated.

RE7 brought the original cybergothic intrigue back to proceedings, injecting a contemporary class consciousness and fear of the bayou with a little bit of state military-industrial complex — echoing the rhizomatic unconscious of the Swamp Thing.

But, at its best, this series has always interrogated the new unconscious emerging at the dawn of the 21st century — the unconscious we newly deserved; an unconscious dragged from film to video games and transformed through the process, from screen to codes and circuitry. Once we dig down beneath the old horrors we know, we find they have a new constitution — and it is hypercapitalist, thoroughly corporate, and tellingly computational…

The real horror is that, once you master this, there is no Infinity Rocket Launcher to help you out of it…

Weed Picker

I haven’t played Animal Crossing: New Horizons yet, although I’d quite like to. (I need a Switch first.)

I can’t help but feel a certain dread when I think about it though.

As I scroll past other people’s clips and screenshots on Twitter, showing off all the fun they’re having with the game in quarantine, I just want to know one thing:

Do you still have to pick all the weeds?

I was a big fan of the original Animal Crossing, even before I’d played it. For a long time I’d wanted to try and get the Japanese import of the N64 version back in 2001 but rumours of a port were so rife I held out for years until they finally announced a European version of it for the GameCube in 2004.

I don’t know why but I had a thing for the quaint Japanese lifestyle simulators. Harvest Moon was another one. They were like a tonic to the drudgery of the school day. Goldeneye 64 was cool and Perfect Dark was bad ass but, sometimes, even a tween just wants to relax, you know?

The day it came out I was so psyched for what felt like a long, long holiday. After about a year of it I couldn’t go near it again.

The game was enchanting in all the same ways I’m sure the newest version is — albeit with fewer bells and whistles. But there was something else to it…

There was a pressure to it; a dark underbelly that tried to get inside your head. It was almost like living in Blue Velvet: the shiny veneer of suburbia held dark secrets and, just like in David Lynch’s unsettling masterpiece, those secrets weren’t just in dark alleys but waiting for you on your lawn.

It was nothing to do with Tom Nook. There were no communist memes wanting to send him to the guillotine back then. He was a little demanding and hard-nosed, sure, but he was also fairly easy to pay off and, once your house was as big as it could be, there was nothing but RNG and the game’s reliance on an external calendar stopping you from collecting everything and donating to your town’s museum.

This reliance on real time was novel and kept things interesting, but it was also its downfall. By relying on a schedule of real-world events, the game ingratiated itself into my daily routine. Even if it was just for 15 minutes, I felt the need to pop into my town every day to see what was new. But after a while, those 15 minutes weren’t enough.

It was because of the weeds. You always had to dig out the weeds.

I was fourteen years of age but by the time I was fifteen I felt like I knew what the life of a salaryman was like. My loyalty was to the company, or rather the village green preservation society. It was as tyrannical as any sovcorp.

My perceived lack of loyalty to the town brought a real sense of shame to my tiny abode. I even started receiving hate mail in my little bouncing postbox. As life outside the GameCube took over, the townsfolk refused to let my neglect go unacknowledged. Ironically, it ended up feeling like what real life — adult life — would soon become.

Again, Tom Nook wasn’t necessarily the enemy here. He was just an opportunist; cunning but as naive as the rest of them. The problem was that the daily tasks and little quests had made incisive impositions upon the management of my time outside the context of the game. Soon enough, I wasn’t working to pay him off but simply to hold together the fabric of this little society. If I didn’t do that, this community of animals quickly turned on each other.

I vividly remember loading up my save after a month-long exam period at school and finding that no one would talk to me. Everyone in my town would be angry and miserable. It was always because I wasn’t keeping up with the weeds. Drudgery was enforced by a needless moralism and an inequality of time. No one else took responsibility for their surroundings, after all. It was all left down to the new human whilst the animals in my midst leeched off of my initial pride and turned it against me. It was like school had become my recreation and Animal Crossing was my job.

There was no escape. Choosing not to participate in the game began to feel like losing it. Breaks were allowed but you still had to play catch up. If you gave up entirely, good luck trying to get back in everyone’s good books.

I couldn’t play the game any longer. The demands it placed upon the player — the sense of responsibility — were too much. It was real life inverted. Soon, the therapeutic tranquility of Mattville, tainted by the drudgery of required labour, faded into the pixelated twilight until all that was left was darkness and disgruntlement.

I’d still try to load up the game on a Friday if I could bear it, so I could hear what song K. K. Slider had for us that week but, in the end, it felt like that dumb bohemian dog was just taunting me.

I thought that was the life I was going to live: roaming the towns, playing songs, swapping fossils, living carefree… It was all a dream — a futile, naive dream. That was K.K.’s role and his alone — the privileged nomad; the weekend hippy…

I was the weed picker. I had always been the weed picker and I always would be…

Criticism After Gaming: Notes on Pewdiepie, Cancel Culture and Reactionary Aesthetics

Is anyone else confused by the latest Pewdiepie drama? (Does anyone even care?) 

I have a long draft from 2018 on the topic of Pewdiepie lurking somewhere in the bowels of my WordPress account. I watch his videos a lot and I’ve talked about him a couple of times on the blog before — most extensively in the second part of my “All Roads Lead to Alienation” series.

Controversies aside, I’ve long admired Felix Kjellberg’s transparency about his mental health issues and I find him to be an interesting weathervane for the shifting hot air of online subcultures. Any insights that might be drawn from this, however, are often lost to his unfortunate nature as a pretty stereotypical Scandinavian millennial who lacks any rigorous media training, coupled with a naive perception of his own whiteness that, from experience, seems to be pretty common amongst Northern Europeans. (Not that this excuses his worst outbursts but it seems to be a bigger issue than just him.) 

Beyond this, I think his strange position — as the world’s most popular YouTuber who is nonetheless seen as the figurehead for an otherwise marginal fanbase — says something interesting about our online spaces (although what exactly is being said is hardly clear from the outside). His relatable nature seems to come from the fact that he is a product of early 21st century online cultures rather than the kind of movement leader he is often heralded as, given the size of his audience.

This is to suggest that, although he is regularly mentioned in the same breath as Infowars and Breitbart Media, it seems to me like he has been influenced more by a broader Silicon Valley neo-neoliberalism than he has spread an ideology of his own. As far as I can tell, this is simply because he’s online, echoing a worldview that I do not share but one which is incredibly prevalent outside of mainstream discourses; there are likewise many other cultural pies that suffer from the same issues and in which he has his pewdie fingers. I feel like his latest controversy epitomises this.

After reaching a mind-boggling 100 million subscribers, Kjellberg made a video unboxing an award sent to him by YouTube, during which he committed to donate $50,000 to the Anti-Defamation League as a (somewhat fleeting) gesture of goodwill following his previous controversies, during which he has been adopted as a meme by alt-right and white supremacist figures, as well as more personal accusations of anti-semitism. 

Soon after this video went up, there was apparently an enormous fan backlash and, in a follow-up video, Kjellberg decided to rescind his gesture, saying he would donate the money somewhere else.

I didn’t see any of this backlash personally. (That’s an alien part of Twitter and Reddit to me and the part I’m in is weird enough without heading over there.) All I saw were the videos themselves. But my girlfriend and I nonetheless spent much of this morning trying to figure out what was going on, after even she became aware of some sort of controversy brewing once the story reached the front pages of a lot of national news websites (such as The Guardian and the BBC).

Knowing I watch his videos, she messaged me to see if I knew what the problem was — was the controversy that he was donating to the ADL in the first place, with their history of equating antisemitism with criticism of the state of Israel and other dubious political stances? Or was it that he cowed to the pressure of his supposedly alt-right fans? 

What’s interesting to me is that it seems to be a mixture of both, and I see this tension as being central to his online existence. It likewise seems to be a sticky situation that only he occupies.

This is part of the reason why I find myself following Pewdiepie’s content so closely. His high profile under the YouTube spotlight means he often appears caught in the middle of our contemporary culture wars. He’s not a part of LeftTube, nor is he explicitly a part of YouTube’s extremism problem. His personal politics certainly seem to lean to the right but he also seems to be at the mercy of both left- and right-wing cancel cultures, and the latter is a form of cancel culture that few in the media seem to fully understand the dynamics and concerns of. Controversies such as this make that explicitly clear, illuminating a broad cultural crisis that is seldom acknowledged.

First, perhaps we need a quick gaming culture recap… The current toxicity of gaming culture seems to be a result of the long shadow left by #GamerGate, the ultimate cultural backlash of the 2010s, and this has routinely been a topic that mainstream news outlets have dedicated summative essays to, trying to explain the controversy to non-gamers. It will never not be strange to me how a billion-dollar industry can still be considered to be so politically niche.

I can’t remember if I’ve ever written about this on the blog before but #GamerGate was something I fell foul of back in 2014 when it first exploded. On an old personal Twitter account, I made the mistake of criticising many of the women who were expressing pro-#GamerGate opinions on the hashtag by using the hashtag myself and throwing into the fray some sort of half-baked 140-character missive about the blatant false consciousness of these female #GamerGaters.

The tweet destroyed my mentions for a whole 48 hours, with the subsequent reactionary pile-on leading to a complete shutdown of my social media pages for another week afterwards as I let things blow over. Commenting on female false consciousness was perhaps, more broadly, not a good look, but it was only #GamerGaters who took issue with it, and being targeted caught me completely by surprise.

I didn’t have any sort of platform at that time. I’d simply waded into the hashtag without thinking and managed to successively piss off all the wrong people as the tweet got shared by hundreds across a combined network of thousands upon thousands. I remember at first reading all the responses whilst sat in my car under the Humber Bridge — a frequent hangout spot when I was living in Hull between 2014 and 2016 and dealing with a very difficult situation at home. I counted the rapid-fire notifications with incredulity as I was hit with literally hundreds of replies a minute. The onslaught lasted for hours. It was terrifying and induced repetitive panic attacks for days. I had to completely unplug to get away from it. No phone or laptop. I almost threw the former straight into the River Humber. I went into complete digital isolation.

In hindsight, it was a telling experience. Here was a broadly reactionary cause emboldened by the same pile-on tactics that the left has now become most infamous for and, after later experiencing it from the other side of the political divide, it is clear to me that this is a contemporaneous and generational issue rather than a fault in any singular political movement — a symptom of rampant neoliberalism with its risk-averse politics of individualism. As such, it seems to be a dynamic that has defined all of politics over the last five or so years, despite its predominant association with the left.

This is worth emphasising, I think, and paying closer attention to. There is a sense that, in some corners of the internet, this latest Pewdiepie drama is a damned-if-he-does-damned-if-he-doesn’t scenario, straddling both sides of the political divide, as well as the divide between gamer culture and its outside.

That’s not to say Kjellberg’s “gamer” fanbase isn’t something of a pressure pot for certain types of politics — it evidently is — but that is precisely why it is interesting to me. The question is: why? And the answer, I think, is not simply online echo chambers and the rise of an online right. It is symptomatic of a broader cultural — and even aesthetic — moment that should concern us all.

Here is an industry where a broadly reactionary fanbase has the sort of clout that the left has with other mediums. And what is interesting is that this is compartmentalised as an alt-right issue by the wider media when, in fact, gaming seems to have just been mutated by a broader cultural industry at large. 

When Mark Fisher, Simon Reynolds and the rest of the early ’00s blogosphere were in the midst of their hauntological moment, for instance, they were considering the ways in which old and established mediums were mutating due to the feedback loop of late capitalist cultural production. Reynolds’ book Retromania, in particular, explored the ouroboros of 21st century music cultures that were endlessly recycling themselves. However, whilst this is where much of the discussion remains, the same cannibalistic dynamics can be seen at play in cinema and visual art as well.

But gaming seems to have been overlooked throughout these discussions because, rather than suffering a slip into retromania, it instead came of age in that moment. This is to say that the gaming industry has internalised a broader cultural retromania to a far more insidious extent, making it the cybergothic industry par excellence today, with its accelerative attitude towards technological innovation but a largely reactionary view of its own broader cultural development.

I think a large part of this has to do with the gaming industry’s own attempts to critically legitimise itself through a development of its own modes of criticism — and this was the central focus of the #GamerGate controversy, lest we forget. This is important because criticism is — and always has been — political, but the gaming industry’s rushed attempts to give itself critical legitimacy have led to a general naivety about criticism’s role in their own culture. This, again, is due to the time in which gamers and their medium came of age — at a time when everyone was becoming a critic and criticism itself had supposedly been de-rigorised and democratised, for better and for worse.

Despite (or perhaps because of) this, video game criticism and journalism still have a long, long way to go in terms of their cultural standing and, like the industry itself, it finds itself speeding ahead as it tries to retroactively apply outdated critical standards to its own development in order to legitimise itself. (It is a critical forestalling that we’ve seen before — I have a whole other theory about this, for another time, exploring how photography went through a similar in-grown period of critical development which has only worsened its internal elitism today as an art form.)

To explain what I think the impact of this is, I want to foreshadow a future post I have in the works, returning to my current favourite literary critic Leslie Fiedler. 

I recently discovered the YouTube archive of the long-running political talk show, Firing Line with William F. Buckley, Jr., on which Fiedler appeared in the 1970s. It’s a brilliant conversation that Buckley and Fiedler have, and at one point Fiedler even echoes a kind of proto-K-Punk perspective on popular modernism and the divide between high and low cultures, noting how the very emergence of this divide can be documented in tandem with the emergence of literary criticism as a whole.

Fiedler explains that it was in the middle of the eighteenth century that literary criticism first began to “assume its dominance”, at that time when

class had assumed social and economic power that was culturally insecure, and the new middle class, the new bourgeoisie, wanted people to write dictionaries to tell them how to spell words, etiquette books to tell them which fork to pick up, grammar books to tell them they weren’t supposed to say ain’t anymore, and critical books to tell them whether it was okay to read novels to begin with and, if so, which novels were more okay than others.

Those people were sent off to school to study the classics but then came back to talk to their masters about what they should read of current literature, especially to talk about the form that was invented at that moment in the middle of the eighteenth century — what has become the dominant literary form — the novel. And then, after a while, what happened was the people who were entrusted with writing guidebooks … began to get very high and mighty about what they were doing…

Critics were getting high on their own perceived authority but, at the same time, much of society was also ignoring their critical appraisals and indictments. Fiedler highlights, for instance, the persistent popularity of pornography, explaining how literary pornography has always been popular with all socioeconomic classes but always read privately and shamefully — the first (and still our biggest) guilty pleasure.

Later, moving forwards to the 20th century, Fiedler describes how the centrality of the university and pedagogic institutions more generally perpetuated the bourgeois elitism of criticism — an issue he, notably, also points out as prevalent in academic Marxism. He notes that

the determination of what was literature got turned into the question of what is taught in classes is literature — literature is what is taught in classes in literature — what is taught in classes in literature? Literature. It’s a perfectly circular definition which gets you no place.

What interests Fiedler about this is the extent to which this capture of criticism by the academy has made criticism easier to ignore, so that the quote-unquote “trash” that critics malign nonetheless persists and becomes a major cultural reference point for people. He continues:

If [Fiedler’s students] say to me as a critic: ‘Why do you think Dracula survived although it doesn’t come up to specifications in terms of its language and form and so forth?’ …

Certain books, which may be pretty low on instruction and don’t even ‘delight’ in the ordinary sense of the word… Some of those books may do something else to you, which is to say they may touch that essential archetypal mythological material which is in all of our minds and which is the one thing that keeps us together. This is a populist line I’m giving you…

Sometimes it seems that … all our conscious ideas separate us. You and I, if we discussed many things about politics for instance, might find we disagree but if we were to swap nightmare stories I bet we would discover that there are places where we live in the same region.

It is in this sense that horror and the gothic, in their ostensibly pulp modes, persist within our cultural imaginations despite their distance from contemporary critical trends. Horror movies occupy the same position — critically trash but culturally central. Such is pop culture more broadly, and such is gaming culture most explicitly today.

As a long-maligned medium that nonetheless attracts and is popular with millions, gaming culture has attempted to develop its critical thinking in reverse and, as a result, has dragged along much of the reactionary thought that defines criticism historically, albeit inverting it in apparently counter-intuitive ways. It has led to a kind of inverse elitism, where academicism and capital-C criticism are blocked from having too much of an impact on the medium itself, and it is in this bind that the main figureheads of gaming popularity, like Pewdiepie, find themselves, caught between a reactionary fanbase and an old-style critical media discourse.

What needs to be considered in more detail is the way in which a reactionary culture — common to the “fanbases” of countless mediums — is conflated with a reactionary politics. This is more obvious to us now with the popular genres of yesteryear. I’m reminded, for instance, of Noel Gallagher’s recent trashing of Jeremy Corbyn and his nostalgia for Britpop’s tandem ascension with New Labour.

With gaming culture, the Venn diagram between reactionary culture and politics seems to reveal a considerable overlap, but the two are not mutually exclusive. We should pay closer attention to the ways in which seemingly innocuous aesthetic nostalgia is wrapped up with the rise in reactionary politics because there is a sense in which those critical institutions attempting to hold Pewdiepie to account are more responsible for the present situation than they dare allow themselves to be aware of. The gap into which he falls is more a direct result of a persistent subcultural retromania than of any alt-right movement grown in a vacuum.

This isn’t to shift responsibility but I think a more nuanced awareness of where these issues have arisen from will give us a far better chance of combatting their increased presence. Clambering around the surface of impotently spherical definitions of our cultural trends and warring cancel cultures is not going to get us anywhere.


Last week, I promised that when I hit 3000 Twitter followers I would show you all my Minecraft world to celebrate.

I thought that would take a few weeks but the recent viral tweet led to an extra 100+ followers overnight so here we are. Kicking off at 1600.


Recommended by Robin via his son. A 16-part Let’s Play series which explores a strange and seemingly unfinished PS1 game called Petscop, in which you play a cop who catches pets… At least, that’s the initial set-up until you find the way into the netherworld.

As one YouTube commenter suggested, this is Animal Crossing meets Silent Hill — and it’s glorious.

Non-Normative Gothic (or, Stuff I Like)

I get asked for film and TV recommendations a lot on CuriousCat and I’m never really sure what to say. More often than not, I ignore them, because it ultimately feels quite arbitrary.

I watch everything. Or try to. I used to literally watch everything and my threshold for liking things was low. I paid my dues with French New Wave or Polish Slow Cinema or whatever else. My favourite directors were Kieślowski, Bergman and Lynch but I don’t really want to be the guy who still recommends that stuff at the drop of a hat into his late 20s. (Although, of course, I still think they’re all great.)

If 18-year-old Film Bro me were to give you a list of films that were really influential for me, it would look like this:

A Short Film About Killing (1988, Krzysztof Kieślowski)
Hour of the Wolf (1968, Ingmar Bergman)
The Devil Probably (1977, Robert Bresson)
The Sentinel (1977, Michael Winner)
The Silence of the Lambs (1991, Jonathan Demme)
The Thing (1982, John Carpenter)
Apocalypse Now (1979, Francis Ford Coppola)
Possession (1981, Andrzej Żuławski)
The Night of the Hunter (1955, Charles Laughton)
INLAND EMPIRE (2006, David Lynch)
Vertigo (1958, Alfred Hitchcock)
Invasion of the Body Snatchers (1978, Philip Kaufman)
Kwaidan (1964, Masaki Kobayashi)
Who’s Afraid of Virginia Woolf? (1966, Mike Nichols)
Don’t Look Now (1973, Nicolas Roeg)

I’d still stand by that list, I reckon, but I’m wary of saying it is definitive because I haven’t seen most of these films (except The Thing, which still gets frequent outings) within the last 5 years — 10 years for some. As such, I could just keep going. I’ve seen a lot of films and I’ve liked a lot of films because I was a teenage sponge and there comes a point where a list just becomes redundant because it’s whatever comes to mind first. I don’t want to equate my taste with the effectiveness of my memory. Nowadays, if I watch something and it makes me feel something out of the ordinary, I’ll probably find something to write about it right here.

Beyond this connoisseur-appropriate list, I’ve also really liked The Hunger Games trilogy, Denis Villeneuve’s Prisoners and David Fincher movies — Zodiac, Alien 3 and The Girl with the Dragon Tattoo. I really like Michael Mann’s Collateral — the first (and only) movie I ever saw on a plane! I like the most recent run of Marvel movies — which have finally found their stride, I think, after a load of money-grabbing. The last three films I saw and really liked were The Favourite, Lady Bird and Three Billboards Outside Ebbing, Missouri.

I could have just said all this when an anon asked earlier if I could recommend some “Gothic media essentials” and, whilst I’d otherwise be happy to, it felt like a good opportunity to offer some broader thoughts on taste and xenogothic media. Because not all of these things are recognisably Gothic, and making a list doesn’t really do enough to clarify what I actually think about the Gothic (and why this blog is called Xenogothic).

I like finding the Gothic in all the telly I watch. My view of the Gothic isn’t that normative because I don’t think the Gothic is — or, rather, it shouldn’t be — that normative. At its best, it ruptures itself. The best examples of the Gothic, for me, are often thrillers and murder mysteries rather than horror movies. More often than not, I end up chatting about the latest murder mystery on Netflix rather than the latest jump-scare-athon. I’m a big fan of Robin Mackay’s writing on yarnwork in this regard and Robin might be the person I talk to about TV and films the most. (In fact, we shared a folk horror kick last year, watching Blood on Satan’s Claw at Urbanomic’s Cornwall HQ.) He once wrote:

The international thriller and the detective story … present us with a localised object or event that stands out from the ground of normality, suggesting forces as yet unaccounted for. At the same time they transform that vision through abrupt shifts in perspective — the ‘plot twists’ that are the stock in trade of such narratives. This continual interrogation appeals in part because it models the predicament of finite, situated cognition and its aspirations toward universal purchase.

Gothic media essentials are, then, a misnomer for me. It’s about rupturing normality, not finding the best examples of a norm. What I’m interested in is being attuned to the weird as we can find it in the here and now, and the now and then. And there are plenty of examples of media that do that, albeit not readily seen as “Gothic”. American Horror Story never quite got my vote, for example, because it felt so invested in heavy-handed genre tropes. I much prefer the neo-baroque of Hannibal, for instance, or, most recently, I liked that new adaptation of The Haunting of Hill House. Another series I can’t stop thinking about is Children of the Stones, particularly for the way that Mark wrote about it on the Hyperstition blog, tapping into a vigilant and militant dysphoria.

I’ve been interested in finding this sort of thing in all kinds of films — most recently, planning to find the American Gothic in Westerns.

Books are the same. (I’ve written about recent likes here.) Games too. (Here.) All media is the same.

Non-normative gothic is the most gothic.

Notes on Resident Evil 2

I’ve been struggling with a cold for a week now and it’s mutated into a horrible throat infection so the Reza posts are on hold until I feel like I’ve got the brain power to move forwards with them.

However, speaking of mutating viruses, Resident Evil 2 has arrived in the post and I’m gonna be sinking the limited energy I do have into that game over the next week or so whilst these antibiotics kick in.

I wanted to write about it because, before this cold got worse, I’d promised to stream it. I’ve decided against that now because I can’t talk and I don’t want to hold off on playing it for the sake of a video I’ll probably never finish, so I thought it’d be better to write up my thoughts on the blog instead.

(I’ll get round to finishing one of my gaming video essays one day — the Bloodborne one stalled months back but it remains promising…)

I’ve had a bumpy relationship with the Resident Evil games. I was reminded of my love for them when I was back home over Christmas, digging around all my long-forgotten childhood things and finding a complete run of the Resident Evil games released on the first and second generation PlayStation consoles: that’s the first game, the “Director’s Cut”, number two, number three, Survivor, and Code Veronica.

Not counting the Gamecube remaster of the original game, there is an abrupt stop in my engagement with this franchise after this point.

This abrupt stop is no doubt down to the PS2 becoming the console for Silent Hill games whereas Resident Evil had ruled the PS1. The first Silent Hill on the latter platform was mythically horrific to my childhood brain and I didn’t play it until a few years after it came out — when I felt “ready” for it. I remember gaming magazines would talk about it in the same sort of terms as a snuff film. What’s even more memorable is that, when I finally did play that first Silent Hill game, I remember it far exceeding the horrors conjured up by my imagination. It was, at that time in my life, quite literally more terrifying than I could imagine.

It scared me in a way that the Resident Evil games had never managed to do. Zombies were fun and they remain my favourite pop horror archetype but Silent Hill got deep inside my head. And so, Silent Hill 2 and Silent Hill 3 ruled my PS2 from there on out because all Resident Evil games after Code Veronica were trash as far as I could tell and they never got a look-in. (Although I regret that I’ve still never played Resident Evil 4.)

I think things went sour for me after the release of the Resident Evil movie adaptation. Stylistically, the film was grotesquely over-influenced by The Matrix. I remember leaving the cinema (having snuck in underage to see it) and feeling like I had recognised nothing of the experience I hoped to see replicated. And then the games following the movie seemed to echo its approach to the franchise’s universe.

However, my distaste for this overly influential cinematic divergence might also be down to the fact that, in my head, I’d always downplayed the role of the Umbrella Corporation — that’s the evil pharmaceutical company at the heart of the franchise, responsible for creating the zombifying T-Virus, a bioweapon intended to make invincible soldiers, which leaked out from their headquarters beneath Raccoon City and went on to infect 95% of the local population. If that makes Umbrella sound like a hard thing to ignore in this series, you’d be right, but I’d nonetheless get fixated on the environments and the zombie killing and ignore the story altogether, as is no doubt a common tendency amongst kids playing way below the advised minimum age on their games. For me, back then, the story was background noise to the thrills I was there to receive.

Don’t get me wrong though: I think the idea of a mutated virus is good. It’s noumenal and taps into a historic human fear — a kind of Black Death irrationalism where illness is, in many ways, seen as a haunting inevitability and the things done to resist it are rooted more in superstition than medical science. It’s where that line blurs that zombie apocalypse movies really hold their own, and so of course it’s the most common cause of zombie apocalypses throughout popular culture. (The Walking Dead‘s first seasons captured this atmosphere and its existential despair best, if I remember correctly.)

However, whilst making the source of this noumenal virus the stupidity, greed and recklessness of corporate America isn’t a bad message in and of itself, it always felt really lame to me; cartoonish and unnecessary. Zombies are, on their own, more than enough. Adding Big Pharma to the equation both waters down and constipates the symbolism. It makes Umbrella a largely unseen enemy, reducing the zombies themselves to an eternally irritating smokescreen that persistently distracts you from the threat at large. You can’t get to Umbrella because you’re constantly hampered by the mess they’ve made. In this way, the series’ downfall was always inevitable. It set itself up for a fall into lame action archetypes when it made its main enemy largely untouchable — an unsustainable premise in the long run: the games had to become more corporate in themselves.

Saying that, Resident Evil 7 was an incredible experience. It played up to the haunted house vibe that made the original so good and made the Umbrella-infused finale far less like corporate espionage and a lot more Lovecraftian — a genuinely satisfying and supernatural conclusion that resisted the errors of previous instalments, which made Umbrella the central part of the plot overall.

However, even today, the very existence of Umbrella just doesn’t interest me. Personally, I don’t need to know the cause of the terrors on screen. It’s the not-knowing that makes it so unnerving in the first place and I don’t actually want that taken away from me. Plus, building a franchise around the outbreak’s narrative cause — the military-industrial complex no less — was always, in my opinion, a weak move that reeked of bad Hollywood action movies. (It’s the same reason why Aliens is the worst Alien movie — don’t @ me — there’s just something about a premise of “mindless drones versus mindless drones” which doesn’t appeal to me.) I’m not here to have my masculinity massaged by my undead killing spree; I’m here to have my very sense of humanity unsettled.

That’s what’s so interesting about the premise of the very first Resident Evil game. You have a very (very) stereotypical 80s/90s Action Hero cast — made up of precisely the kind of testosterone bozos found in James Cameron’s attempt at a Big Dick Energy Alien movie — who are then thrown into what is a very Japanese haunted house scenario; a place where folklore and modern society rub up against each other uncomfortably.

There is a sense that these bum boils of American masculinity travel through a kind of time warp and that was what made the game so scary: this sense of utter displacement — the silent, arcane, folkloric mansion being intruded upon by a cyberfascist futurity (and that’s in reference to both the goodies and the baddies, FYI).

In many ways, it feels reminiscent of 1977 cult classic Hausu, the Japanese haunted house horror film directed by Nobuhiko Obayashi. Obayashi was primarily a director of film and television ads before making Hausu, and the film would be nowhere near as surreal were it not filmed in the cinematographic language of advertising. Resident Evil is the same — transplanting the language of the American action movie into the Japanese haunted house for a similar effect.

These games have always been fun to play despite these very personal plot issues that I have with them, so generally it always feels overly nerdy to get hung up on them. I only mention all of this now because the remake of Resident Evil 2 is the first game in this franchise not to make me cringe when Umbrella becomes the main plot focus over and above your own basic survival.

Umbrella remain a constant presence during the game’s final two thirds, but something about the presentation here changes things. Whether this is rose-tinted (read: HD) nostalgia, an improvement or just a long-held grudge with this series thawing out, there’s something really interesting about this game and its plot — particularly what its bizarre cultural cross-pollination has to say about the world(s) in which it is set. Whereas the Silent Hill series was set in various quintessentially American locations, probing the inside of the American psyche in the process, Resident Evil 2 transplants what feels like a quintessentially Japanese perspective into an American(ised) location where it jars in fascinating ways, precisely because you have this same transplanting of fears, conspiracies and cultural signifiers across cultures.

Now that I’ve got those nostalgic reflections out the way, in the next post I want to talk about just what this remastered perspective says about this seminal Japanese view of an Americanised crisis; of a sovcorp dissolved into a zombie nation…

Yes, I might use Resident Evil 2 to critique Moldbuggian patchwork

To be continued…

UPDATE: I sort of want to eat my words from yesterday. When I wrote that post, I was in bed, having just completed the first four hours of the campaign and about to play as Ada in the sewers.

Now, having just completed the sewers, I’m left with a gross taste in my mouth.

I was slightly taken aback by this sequence because I remember hearing something somewhere about Ada’s character being “fixed”. Perhaps that’s just because of this initial costume teaser.

What appears like an improvement on paper comes across as a ham-fisted film noir homage in reality, and it deteriorates from there onwards when, after entering the sewers, she loses the coat and ends up navigating shit streams in a very short and very tight red party dress and a choker…?

We’re all used to seeing women on screen in action roles wearing high heels throughout the entirety of their ordeals, magically without breaking any ankles, but this was really gratuitous, especially during the scenes where she was side by side with Leon, the rookie, with all his hip pouches, tools and weapons. Ada is meant to be this superior and mysterious FBI agent but she comes across like some really bad cosplayer.

Then, when Leon and Ada seem to fall in love as they enter the belly of the beast, the cringe peaked. It’s the sort of bad dialogue you expected from these games in the late 1990s but, updated to this level of technical and aesthetic beauty, the outdated narrative comes across even worse than before — even in 1998 you could at least laugh at it.

Here’s hoping my play-through of Claire Redfield’s narrative is more palatable.

UPDATE 2: I finished the game in a reasonable 6.5 hours from my sick bed. Unfortunately, I still agree with my childhood self — the police station is one of the best survival horror locations ever and, whilst the gameplay remains fun, the locations that follow it aren’t a patch on where you start. All in all, a bit disappointed.