Game designer at Naughty Dog, software engineer, Canadian abroad

The Exit 8 Is The Best $4 You Will Spend Trapped In The Japanese Subway System


Welcome to OCD Hell, I hope you like checking tiles.

The post The Exit 8 Is The Best $4 You Will Spend Trapped In The Japanese Subway System appeared first on Aftermath.

Read the whole story
2 days ago
Santa Monica, California
Share this story

Tournament Design as Game Design


Diagram of the Swiss stage of the Worlds 2023 tournament, described in detail in this article. (Diagram via LoL Esports)

Back in 2011, a friend gently guided me through the steep learning curve of League of Legends, and I’ve been playing it on and off ever since. For about as long, I’ve been casually following the competitive LoL esports scene, especially enjoying the Worlds championship tournament that caps off each year. Worlds 2023 has been particularly great thanks to a big change: Riot Games replaced the traditional “groups” stage with a “Swiss” format stage instead. Examining this change through a game design lens helps reveal why it has been so successful and impactful.

For years, the essential structure of the Worlds tournament has been the same. In the “groups” stage, teams are randomly drawn into smaller groups (usually four groups of four), and the top two teams from each group advance to the “knockout” stage quarterfinals. In 2014, the concept of “pools” was added so that teams would be seeded for groups based on their regional standings. This structure has worked reasonably well and created some terrific memorable tournaments, but one persistent awkward issue has been the randomness of the initial groups drawing.

For game designers, randomness is a tool and not necessarily a bad thing. It’s also not a unitary property; there are many different flavours of randomness, as I explored in a previous article about card design in Hearthstone. For designing a tournament, there are many upsides to randomness: it has the potential to create exciting matchups and surprising upsets, and a random draw is “fair” to all competitors.

However, a tournament has two primary objectives: to crown the best team based on skill, and to showcase exciting matches for the spectators. The fact that the initial random drawing for groups has an outsize impact on the overall outcome is inimical to both of these goals.

Because of the large skill discrepancies between teams and regions, the chance placement into groups can effectively predetermine who will advance. In mathematics this is called “sensitive dependence on initial conditions”. For instance, sometimes four strong teams are drawn into the same group. This creates a fiercely competitive “group of death” with a hard road for any of them to advance to quarterfinals. The opposite is an unbalanced “group of life” with two strong teams and two weak teams. This produces boring one-sided matchups and a practically foregone outcome. Even worse, it can set up pointless matches for teams that have already been mathematically eliminated.

From a game design perspective then, what is the fundamental flaw here? To quote Mark Rosewater: “randomness cannot be the destination; it has to be the journey.” Randomness is fun when players have a chance to respond to it. An ideal tournament format would not sharply delimit the range of possible outcomes with the initial random draw, but instead empower teams to change their destinies through their hard fought victories and defeats. From the spectator’s point of view, it should also produce close matchups and provide high stakes.

For comparison, let’s examine the “Swiss” stage which replaced the “groups” stage at Worlds 2023. The structure of this format was developed in the 1890s for chess tournaments. In the first round, teams are randomly paired (against a team from a region other than their own). In each subsequent round, teams are randomly paired against teams with the same win-loss record. This continues for up to five rounds, until each team either advances (with 3 wins) or is eliminated (with 3 losses); elimination and advancement matches are best-of-3.

For example, in round 3, there will necessarily be four teams with 2 wins, eight teams with 1 win / 1 loss, and four teams with 2 losses. A random draw will pair each team against another team with an identical win-loss record. A 2-0 team that wins advances to the next stage; a 0-2 team that loses is eliminated. All other teams continue into round 4 with either a 2-1 or a 1-2 record, and those records determine the next random pairing.
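
To make that arithmetic concrete, the bracket can be sketched as a tiny simulation. This is a hypothetical sketch, not Riot’s actual draw procedure: match outcomes are coin flips, and real Swiss draws also avoid rematches and same-region pairings, which this ignores.

```python
import random
from collections import defaultdict

def swiss_stage(teams, rng=None):
    """Simulate a 16-team Swiss stage: each round, teams are randomly
    paired within their win-loss bucket, until every team has either
    3 wins (advances) or 3 losses (eliminated)."""
    rng = rng or random.Random()
    records = {t: [0, 0] for t in teams}   # team -> [wins, losses]
    advanced, eliminated = [], []
    for _round in range(5):                # five rounds settle 16 teams
        # Bucket the still-active teams by identical record.
        buckets = defaultdict(list)
        for t, (w, l) in records.items():
            buckets[(w, l)].append(t)
        for group in buckets.values():
            rng.shuffle(group)             # the random draw
            for a, b in zip(group[::2], group[1::2]):
                winner, loser = (a, b) if rng.random() < 0.5 else (b, a)
                records[winner][0] += 1
                records[loser][1] += 1
        # Resolve advancement (3-0, 3-1, 3-2) and elimination (0-3, 1-3, 2-3).
        for t in list(records):
            w, l = records[t]
            if w == 3:
                advanced.append(t)
                del records[t]
            elif l == 3:
                eliminated.append(t)
                del records[t]
    return advanced, eliminated
```

However the coin flips land, exactly eight teams advance and eight are eliminated; only *who* ends up where is random, which is the sense in which the randomness is the journey rather than the destination.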

This structure provides a beautiful mix of positive and negative feedback that any game designer would recognize approvingly. Teams that win get closer to advancement, but also place themselves into the more competitive “group” of other winners (negative feedback). When winning teams qualify for the next stage, they no longer stick around to act as spoilers for the remaining matches (negative feedback). On the other hand, weaker teams eliminated early no longer provide “easy wins” (positive feedback).

Note that both formats rely heavily on random drawings for determining matchups. The critical difference is that the “groups” format fully resolves all randomization before the first match is played. In the “Swiss” format, something teams can control (their win-loss record) is fed back into the randomization. Teams are no longer trapped in groups of life or death, but chart a course through “fair” matchups driven by their skillful play.

Note that this format does not eliminate “luck” in draws; compare NRG’s easy path through the Swiss stage with KT Rolster’s slog. Recall, however, that a tournament has a second primary goal: to create excitement for the spectators. In this regard, the Swiss format has two significant benefits.

Firstly, the format optimizes for matchups between teams that are close in skill. The strongest and weakest teams quickly diverge into separate trajectories, reducing the odds of a one-sided stomp as the tournament advances. Secondly, it ensures that every match is meaningful to the outcome of the tournament. There are no lopsided groups or predetermined outcomes; every match is a fight to advance or a fight to survive.

Games that incorporate randomness need to do so with great care and intentionality, and it’s been fascinating to see a “metagame” of tournament design incorporate those same design considerations. Riot Games deserves credit for shaking up their longstanding competitive structure, and I’m excited to continue following Worlds 2023 through the knockout stage.


The poster’s guide to the internet of the future


For the last two decades, our social networking and social media platforms have been universes unto themselves. Each has its own social graph, charting who you follow and who follows you. Each has its own feed, its own algorithms, its own apps, and its own user interfaces (though they’ve all pretty much landed on the same aesthetics over time). Each also has its own publishing tools, its own character limits, its own image filters. Being online means constantly flitting between these places and their ever-shifting sets of rules and norms.

Now, though, we may be at the beginning of a new era. Instead of a half-dozen platforms competing to own your entire life, apps like Mastodon, Bluesky, Pixelfed, Lemmy, and others are building a more interconnected social ecosystem. If this ActivityPub-fueled change takes off, it will break every social network into a thousand pieces. All posts, of all types, will be separated from their platforms. We’ll get new tools for creating those posts, new tools for reading them, new tools for organizing them, and new tools for moderating them and sharing them and remixing them and everything else besides. 

All that change could be hugely exciting, but it raises a complicated question. If you’re a person who posts — and by “posts,” I mean creates everything from tweets to TikToks for lulz or for a living — what do you do now? For two decades, the answer has been relatively straightforward: if you want to post somewhere, you log in to that platform, use its tools, and click publish. Going forward, in a vastly more open and decentralized world, how do the posters post?

POSSE and the future of posting are also the subjects of the most recent Vergecast episode.

The answer, I think, lies in a decade-old idea about how to organize the internet. It’s called POSSE: Publish (on your) Own Site, Syndicate Everywhere. (Sometimes the P is “Post,” and the E is “Elsewhere”; the idea is the same either way.) You, the poster, should post on a website that you own. Not an app that can go away and take all your posts with it, not a platform with ever-shifting rules and algorithms. Your website. But people who want to read or watch or listen to or look at your posts can do that almost anywhere, because your content is syndicated to all those platforms.

There have been people talking about POSSE, and practicing it on their own sites, for years now. (If you want a good example of how it works, check out Tantek Celik’s blog — Celik is one of the early POSSE believers in the IndieWeb community, and his website shows what it looks like in practice.) But as platforms grew and raised their garden walls ever higher, the open web gave way to centralized platforms in a big way. In the last year or so, though, particularly after Elon Musk’s Twitter acquisition alerted users to how quickly their platforms can change or die, POSSE has gotten some traction again alongside ActivityPub and other more open ideas.


In a POSSE world, everybody owns a domain name, and everybody has a blog. (I’m defining “blog” pretty loosely here — just as a place on the internet where you post your stuff and others consume it.) When you want to post something, you do it to your blog. Then, your long blog post might be broken into chunks and posted as a thread on X and Mastodon and Threads. The whole thing might go to your Medium page and your Tumblr and your LinkedIn profile, too. If you post a photo, it might go straight to Instagram, and a vertical video would whoosh straight to TikTok, Reels, and Shorts. Your post appears natively on all of those platforms, typically with some kind of link back to your blog. And your blog becomes the hub for everything, your main home on the internet.

Done right, POSSE is the best of all posting worlds. “As someone publishing, I want as much interaction as possible,” says Matt Mullenweg, the CEO of Automattic and one of the most important people working on WordPress. (Automattic also owns Tumblr, another of the internet’s biggest posting platforms.) “So why are you making me choose which network it goes to? I should post it once, ideally to my domain, and then it goes to X and Threads and Tumblr and all the other networks that can have all their own interfaces and network effects and everything like that. But my thoughts should go to all those places.”

Image: Tantek Celik / David Pierce

POSSE makes sense, both philosophically — of course you should own your content and have a centralized home on the web — and logistically. Managing a half-dozen identities on a half-dozen platforms is too much work! 

But there are some big challenges to the idea. The first is the social side of social media: what do you do with all the likes, replies, comments, and everything else that comes with your posts? POSSE is a great unifier for posting but splinters engagement into countless confusing pieces. There’s also the question of what it means to post the same thing to a dozen different platforms. Platforms have their own norms, their own audiences, their own languages. How often do you actually want to post the same stuff on LinkedIn and on Tumblr? And if you do, at what point are you indistinguishable from spam?

The most immediate question, though, is simply how to build a POSSE system that works. POSSE’s problems start at the very beginning: it requires owning your own website, which means buying a domain and worrying about DNS records and figuring out web hosts, and by now, you’ve already lost the vast majority of people who would rather just type a username and password into some free Meta platform. 


Even those willing and able to do the technical work can struggle to make POSSE work. “When I started,” says Cory Doctorow, an activist and author who has been blogging for decades and recently set up a new POSSE-ified blog called Pluralistic, “I literally had an HTML template in the default Linux editor. I’ve got Emacs key bindings on and I just literally would open that file and resave it with a different file name, append the day’s date to it, and then write a bunch of blog posts in this template. And then I would copy and paste those into Twitter’s threading tool, and Mastodon, and Tumblr, and Medium, one at a time, individually editing as I went, doing a lot of whatever, and then I would turn it into a text file that I would paste into an email that I would send to a Mailman instance where I was hosting a newsletter. And then I had full-text RSS as well, and Discourse for comments, which has its own syndication for people to follow you on discourse.” 

Doctorow estimates that, for a long time, he spent less time writing his posts than he did figuring out where they’d go. “And I made a lot of mistakes.” Now, he has a more automated system, but it still involves a lot of Python scripting, dozens of browser tabs, and far more manual work than most people will do to get their thoughts out to the world.

In a post-platform world, there might be an entire industry of tools to manage cross-posting your stuff all over the web. But we’re still living on platforms — and will be for some time. So for now, the best we have are tools like, a six-year-old platform for cross-posters. When you sign up for, you get your own blog (which the platform offers to connect to your own domain) and a way to automatically cross-post to Mastodon, LinkedIn, Bluesky, Medium, Pixelfed, Nostr, and Flickr.

Manton Reece, the creator of, says he thinks of POSSE as “a pragmatic approach” to the way social networks work. “Instead of waiting for the perfect world,” he says, “where every social network can communicate and talk to each other and you can follow someone from Threads to Mastodon to Twitter to Facebook to whatever, let’s just accept the reality, and focus on posting to your own site that you control — and then send it out to friends on other networks. Don’t be so principled that you cut your content off from the rest of the world!”


One thing hasn’t figured out is the engagement side of things. Reece says he’s interested in building tools to aggregate and make sense of replies, likes, comments, and the rest, but it’s a much harder prospect. This, too, might someday be an industry unto itself. Reece mentions a tool called Bridgy, which both allows cross-posting and aggregates social media reactions, attaching them to posts on your site. This will forever be a fight with the existing platforms, which largely have no incentive or tools for getting engagement data out into the broader web. But some folks think they can solve it.

When it comes to maintaining many different networks, Mullenweg thinks, ultimately, POSSE is a user interface problem. And a solvable one. “I’ve been thinking a lot about what’s the right UI for this,” he says. “I think there might be something like, the first step is posting to my blog, and the second step is I get some opportunities to customize it for each network.” Where POSSE has gone wrong so far, he says, is by trying to automate everything. “I’m really into this two- or three-step publishing process to get around this.”

POSSE is really just one piece of the new social puzzle. Before long, we might have a slew of new reading tools, with different ideas about how to display and organize posts. We might have new content moderation systems. We might have an entire industry of algorithms, where people compete not to make the best posts but to show them in the most interesting order. Modern social networks are not a single product but a giant bundle of features, and the next generation of tools might be all about unbundling.

When I ask Doctorow why he believed in POSSE, he describes the tension every poster feels on the modern internet. “I wanted to find a way to stand up a new platform in this moment,” he says, “where, with few exceptions, everyone gets their news and does their reading through the silos that then hold you to ransom. And I wanted to use those silos to bring in readers and to attract and engage with an audience, but I didn’t want to become beholden to them.” The best of both worlds is currently a lot of work. But the poster’s paradise might not be so far away.


Mastodon Is the Good One


The Subversive Genius of ‘Far Cry 2,’ 15 Years Later


He speaks with a familiar, unnerving sureness. “Men have this idea that we can fight with dignity, that there’s a proper way to kill someone,” says the arms dealer known as the Jackal, the central antagonist of Far Cry 2. “It’s absurd. It’s an aesthetic. We need it to endure the bloody horror of murder. You must destroy that idea, show them what a messy, horrible thing it is to kill a man.”

For all the gruesome, wince-inducing ways that virtual bodies have been designed to meet their demise, the idea that violence can create a bigger kind of mess is one strangely lacking in video games. Not so in Far Cry 2, which depicts the repercussions of killing on both micro and macro scales. Down an enemy in this open-world, first-person shooter, and they may not actually go down. Instead, they’ll convincingly writhe in agony before pulling their sidearm on you. Destroy an entire encampment, and another will simply take its place, respawning because so little—least of all conflict—can be solved with a gun. The world that Far Cry 2 presents, an unnamed African country ripped apart by civil war, is one in which violence seems to have its own libidinal energy. The only way out is to turn a weapon on yourself—to engage in an act of self-annihilation.

Released 15 years ago Saturday, Far Cry 2 is not an easy game to love. Its sub-Saharan setting burns with deep orange hues while also sinking into a swampy morass of muted greens and browns—evocative, if not straightforwardly beautiful. Neither is it a straightforwardly “good time,” but a brutal, sparse experience in which you’re suffering from the effects of malaria. In fact, very little is straightforward about Far Cry 2, a game whose simulationist mechanics paired with its hostile open world caused it to feel like a particularly intense fever dream upon arrival in 2008. It can be slow and tedious before then-cutting-edge fire technology, an AI friendship system, and reactive environments cause it to crackle into capricious life. Just as significantly, Far Cry 2 seeks to disempower the player rather than offer a soothing war hero fantasy, a point reinforced by the game’s grim, morally murky story. It’s as if every system in the game, including the story, is in a feedback loop with everything else. You’re caught in this maelstrom, just trying to survive.

Far Cry 2 boldly pointed toward an alternative future for the first-person shooter, a path that diverged from the jingoistic, popcorn spectacle of Call of Duty 4: Modern Warfare that had taken the world by storm a year before. For Austin Walker, former editor-in-chief at Vice Media’s gaming vertical, Waypoint, and now IP director at game studio Possibility Space, it was an “affirming” game—not only for him, he opines via video call, but for a cohort of “young critics and developers.” At the time of its release, Walker was 23 years old, living in New York, and working as a trademark researcher while attempting to make his way into games media. For him, Far Cry 2 was a watershed moment, imparting the “sense that you could do real thematic work in the first-person shooter that wasn’t just, ‘Rah, rah—I’m the guy with the gun.’”

This is 2008: Indie games were only just blowing up, and so any questioning of blockbusters was still mostly coming from the inside. “We [were] really bound into the triple-A space discursively,” stresses Walker. But here was Far Cry 2, whose propensity to provoke its audience aligned it more with the arthouse than the mainstream. No wonder Far Cry 2, the sequel to an interesting if hokey 2004 original, charted only 18th in NPD’s software-sales charts for October 2008, way below the likes of Fable II and Fallout 3. By 2009, the game had shipped 2.9 million copies, hardly a bust but far from the megahit that Ubisoft was perhaps hoping for, and which it had scored a year prior with the original Assassin’s Creed (which sold 8 million copies in a comparable time frame). As Walker says, “[Far Cry 2 didn’t have] many fans, but [it had] lots of interest from those fans. A real ‘everyone-who-heard-this-album-started-a-band’ kind of game.”

Chris Remo, who was the editor-at-large of Game Developer (formerly Gamasutra) at the time, and would go on to codesign Firewatch and Half-Life: Alyx, recalls how the game’s release coincided with the launch of the Idle Thumbs podcast. Idle Thumbs (which Remo cofounded alongside fellow Firewatch and Alyx developer Jake Rodkin, and which also often featured another Firewatch and Alyx creator, Sean Vanaman) waxed lyrical about the game for years, coining as close to a meme—“grenades rolling down a hill”—as you’re likely to get for the emergent design that Far Cry 2 trailblazed. In the anecdote, Vanaman relates how his AI companion died in a blaze of fire caused by the explosion of his own grenade. It’s emblematic of a game whose systems, says Remo, were “unusually good at generating moments of extraordinary serendipity, tragedy, success, or any sequence of these things, sometimes very rapidly.” What effect did this have on the then 24-year-old? “I felt energized by what it was attempting,” Remo says. “And that significantly outweighed the surface-level challenge of playing it.”

Remo and Walker were far from alone in feeling activated by the game. A new wave of young game critics excitedly interrogated the game’s marriage of politics, hostile design (like its famously jamming weapons), and systems-driven gameplay, subjecting it to the kind of scrutiny reserved for only a hallowed few. One of those select titles was 2007’s BioShock, a creepy sci-fi shooter that doubled as a heady meditation on Ayn Rand’s philosophy of objectivism. Chief among the critics of that game was Clint Hocking, the creative director of Far Cry 2 himself, who in the years prior had built a reputation as a smart, considered public thinker on video games through his blog, Click Nothing. He referred to BioShock as a “disturbing” example of “ludonarrative dissonance,” arguing that its theme of Randian rational self-interest was at odds with its narrative, which charged you with helping another character.

Far Cry 2, then, was Hocking’s answer to the thorniest of conceptual problems, an attempt at creating a cohesive, unified vision of story and systems. In authoring this synthesis, he was putting his own critiques into creative practice (think video games’ own Paul Schrader), all while giving the most switched-on players an inside look at a medium mutating in real time.

BioShock may have been the title that allowed Hocking to articulate “ludonarrative dissonance,” but as he explains on a video call from Ubisoft Montreal’s office, the subject had long been on his mind. The creative director wanted to avoid any vagueness in a game that depicts “a particular kind of a conflict,” one that’s “pretty bleak and pretty sinister.” This was the mid-aughts: The battle of Mogadishu between the United States and Somali forces had taken place not much more than 10 years prior and the Iraq War was lurching from bad to worse, all while films like Hotel Rwanda, The Constant Gardener, and Blood Diamond explored, to various degrees, the effects of Western intervention on developing nations. “We kind of stepped into this,” Hocking says. “It was challenging because you don’t want to make disaster tourism, right? You don’t want to make a game exploring people’s misery and suffering for shits, giggles, and headshots.”

In March 2005, Hocking had just wrapped production on Ubisoft’s Tom Clancy’s Splinter Cell: Chaos Theory, a game on which he’d held three senior positions at once: creative director, lead level designer, and scriptwriter. He was broken by the process but nonetheless satisfied enough with the results to move forward (“I’ll never make a better one than this,” he recalls thinking). Ubisoft had published the first Far Cry and later acquired full rights to the IP, so Hocking switched franchises, assembled a small pre-production team, and started to hash out design ideas. “Very quickly, we decided we wanted to make an open-world, first-person shooter,” he says, stressing the scale of the challenge they were about to embark on: “That had never been done before.”

This genre hybrid—open-world, first-person shooter—became the foundational idea of Far Cry 2. The team looked to both 3-D Grand Theft Auto games and Bethesda’s landmark 2002 first-person RPG, The Elder Scrolls 3: Morrowind, for inspiration. But the task and design brief that Hocking and his team were wrestling with was unique, one that sought to go beyond the large, open, but nonetheless discrete levels of the original Far Cry and 2007’s S.T.A.L.K.E.R.: Shadow of Chernobyl. “Can you have a first-person shooter that isn’t a linear, authored, narrative, level-designed experience but that gives you the freedom of these games?” Put another way, is it possible to make a game that “has the momentum of a first-person shooter but takes place in an open world?”

In pursuing this unprecedented design goal, Hocking and his team at Ubisoft Montreal devised a new, stranger form of momentum. This is what struck Tom Bissell, author of Extra Lives: Why Video Games Matter and a writer for both games and television (including Gears 5 and the upcoming Andor Season 2). Typically, Bissell explains, first-person shooters would follow an intensity waveform: skirmish, skirmish, big battle, skirmish, etc. “You play Far Cry 2 and it’s not like that,” he says. “Just because you made it through a battle doesn’t mean that two more jeeps aren’t just going to roll up. … Suddenly, there’s 90 seconds of the most fucking incredibly intense conflict you’ve ever experienced in virtual form.”

What’s striking about Far Cry 2’s development story, as Hocking tells it, is how quickly the fundamentals of the game coalesced. Africa wasn’t chosen as the game’s setting because the team necessarily wanted to make grand proclamations about colonialism and interventionist foreign policy (other options included the Appalachian Mountains and central China). Rather, they were searching for what Hocking calls an “iconic” setting, something as evocative as the original’s box art, which depicted palm trees, a sandy shore, and blue water. As for mechanics, the “first play” (an internal vertical slice) delivered to Ubisoft executives at Christmas 2006 already contained a working “buddy system,” a handful of missions, jamming weapons, a wound animation, and the player suffering from the debilitating effects of malaria.

The narrative, the glue to bind all these potentially discordant elements, came together even quicker. “I wish it wasn’t so—almost—cliché,” Hocking says. “Even before we settled on Africa, it was apparent that the story of the original Far Cry is The Island of Doctor Moreau—it’s just a retelling. And once we decided to go to Africa, we immediately realized that The Island of Doctor Moreau is almost the same story as Heart of Darkness. … It was an obvious leap, and then we were able to relook at Apocalypse Now and reread the book. It was a kind of map for us.”

For all the laudable grit that Far Cry 2 brings to its conceit—namely, its depiction of civil war as a wretched and oppressive phenomenon—treating Africa as a “composite … a mélange of different places” is the aspect that holds up least for former journalist Walker. He says the game parallels Western media’s portrayal of African conflicts, never really explaining what the heart of the conflict is about (“Oh, they’re both just basically warlords,” he quips). Walker, however, commends the game for the areas in which it was miles ahead of its time—for example, the way it lets you choose from a diverse cast of characters, all of whom hail from countries touched by the pernicious hand of colonialism. More importantly, Far Cry 2 brought unambiguous “cynicism” to the idea that the player’s “presence here could ever mean anything good,” Walker says. “That’s not a perspective we’ve often seen elsewhere. Far Cry 6 is about you showing up as an outsider and helping a revolution. At the end of Far Cry 2, you save some people—but your presence is endemic of a disease that anyone like you is here at all.”

For as long as Hocking had made games, he’d been fascinated by the interplay of virtual space and narrative. As a youngster, he experimented with the level editor of the 1983 platformer Lode Runner (“painting with eight different pixels,” he calls it), saving his work using the primitive Famicom Data Recorder. “There was a cassette drive which you’d put a cassette tape in,” he says. “I would take an old Billy Idol tape, put some masking tape over the songs, and write over it.” Soon after, he started programming, making a handful of games for the VIC-20 home computer before getting into Dungeons & Dragons. This introduced a more intricate matrix of elements: complicated rules; a more freeform experience; a stronger sense of narrative; play that could careen in any direction. It’s precisely this dynamic, almost volatile cocktail of elements that continues to hold appeal for Hocking—“a really expressive thing for me,” he says.

But Hocking wasn’t just a child of nerd culture. Born in 1972, he spent the first few years of his life in Southern Ontario before moving to Vancouver. Like a lot of Gen X kids, he was raised by his single mom, who worked during the day and attended school by night, eventually scoring a job as a production accountant in the film industry. Before that, Hocking grew up “pretty poor.” He paid his way through college, starting with an undergraduate course in visual fine arts at Langara College in Vancouver, where he mostly studied drawing. (“Not like comic book drawing, but open, freeform, messy, artistic drawing,” he says.) By his own admission, Hocking was “terrible” at it, but the degree gave him a crash course in taking criticism, invaluable for the BA in creative writing he obtained at the University of British Columbia, which served as a springboard to a master’s degree. Hocking maintained his omnivorous creative ventures while working as a copywriter for web companies during the dot-com boom of the late 1990s. He was part of the Unreal modding scene, contributing a level to a mod called Strike Force (credited as “Clint ‘Cmdr_Greedo’ Hocking”). He made independent films. He even played in a punk rock band.

It’s tempting to read an abrasive energy similar to punk rock’s running through Hocking’s often confrontational work. The opening level of Splinter Cell: Chaos Theory contains a distressing torture scene. In Watch Dogs: Legion, released in 2020 (Hocking’s first game for Ubisoft after returning in 2015 following a five-year absence), you learn that migrants in the game’s dystopian version of London are being sold into slavery and harvested for organs. Hocking might be employed by Ubisoft, wholly embedded within the studio system, but he has a propensity to ask contentious questions through the two elements that have captivated him since he was a child: virtual space and narrative. Yet Far Cry 2 is different from these games: It rouses intense, physiological reactions—prickling hair, a rapidly quickening heartbeat, a thin film of cold sweat—through long-form play rather than discrete moments. More than any other game of Hocking’s, it’s holistic—and the noise it generates is often overwhelming.

When Far Cry 2 was released on October 21, 2008, it received admiring if not unanimous praise. For Eurogamer, Christian Donlan wrote that “Far Cry 2 is unforgettable rather than perfect; brilliant, frustrating, somber and comical.” Chris Dahlen asserted for Variety that “gunfights are brisk and unpredictable, but the mission framework falls short.” In an 8-out-of-10 review for Game Informer, Matt Miller said that “Far Cry 2 is one of the most ambitious game releases in years. … Sadly, it’s also plagued by a combat system that rarely elevates itself past basic gunplay.” Fifteen years later, Remo echoes the critical consensus, albeit more generously: “In a lot of ways, Far Cry 2’s reach exceeds its grasp,” he says. “But I think its reach is so interesting and compelling that even the ways in which its ambitions were not fully realized are themselves interesting. And when those ambitions are realized, it’s almost sublime.”

Bissell, who would go on to work on three Far Cry games for Ubisoft in the early 2010s (all of which were canceled), believes the criticism Far Cry 2 received upon release was tough for the development team to stomach: “I was explicitly told not to bring up Far Cry 2 overmuch when talking to my superiors [because]—and this is my read on it—the studio really loved what they’d made.” Bissell says Far Cry 2 was a “dirty word” within the studio, but not because the game was perceived as a creative failure. Rather, he suggests, the studio was suffering from “collective trauma” knowing it had made a game that was “special” yet wasn’t received by the majority of its audience “with anything that resembled recognition of its greatness.”

Still, Far Cry 2 was received rapturously by a small stratum of people. Another of its early champions was Ben Abraham, a 22-year-old critic who took it upon himself to play the entire game with a single life in response to one of Hocking’s blog posts. It was, Abraham explains, an exercise in getting into the game’s “headspace” more intensely, akin to “speedrunning for narrative.” He recalls how “scary” the playthrough was at first, but that he mediated the experience by utilizing “degenerate strategies”—long-range rifles, explosives. Thus, a sense of “boredom” began to set in, at least until Abraham hit the brutal difficulty spikes of Act 2. How did the intrepid critic record this endeavor in the era before Twitch and the video game live-streamer? Abraham wrote a nearly 400-page novelization called “Permanent Death,” which has been downloaded close to 30,000 times. Hocking described it as a “complete oddity.” For Bissell, the document stands as a “breathtaking exercise in taking love for a single game to an almost maniacal place.”

How, then, to assess the legacy of Far Cry 2 beyond the straight line that exists between the game itself and the criticism it inspired? It’s been argued that the plausible, simulationist mechanics of Far Cry 2 left their mark on the survival genre that exploded in the 2010s with the likes of DayZ. Another possible vector of influence is battle royale games like PlayerUnknown’s Battlegrounds, which seem to embody something of Far Cry 2’s tense, fraught, and emergent approach to combat (“the battle royale is a ‘grenades-roll-down-the-hill’ genre,” Walker suggests). There are also a few games that seem to more closely share Far Cry 2’s systemic, open-world DNA: Metal Gear Solid V: The Phantom Pain; The Legend of Zelda: Breath of the Wild; Death Stranding. But this might just be a case of convergent evolution—there are, after all, a lot of ways to arrive at expansive, systems-driven gameplay. Viewed from another angle, the game’s influence can be said to extend to players for whom it crystallized what they wanted out of a game. Walker loves Breath of the Wild partly because Far Cry 2 “pushed” him in that direction.

We can be more specific, however, and point to Far Cry 2’s influence on the work of a few designers. Harvey Smith, creative director of the Dishonored series, enthused about the game for Penny Arcade in 2012 and described the recently released open-world shooter Redfall as “what you’d get if you blended the Arkane creative values with Far Cry 2.”

Remo, meanwhile, is unequivocal about the effect that Far Cry 2’s “uncompromisingly first-person nature” had on his own Firewatch, a first-person drama set in the hills of Wyoming. Take, for example, the map in Firewatch that works just like the one found in Far Cry 2. Your character pulls out and holds in-game objects in such a way that they take up the vast majority of the screen; the game doesn’t pause; the map is diegetic, not viewed inside a menu. Firewatch’s walkie-talkie dialogue system works similarly, with conversational decisions playing out on the fly as you traipse about the wilderness. Remo declines to call the approach an overarching “philosophy.” Instead, he describes it as a “method of design thinking” intended to ensure the game remained “grounded.” The goal was simple yet strict: refrain from breaking the player’s “immersive viewpoint.”

But Firewatch and Redfall are edge cases, rare attempts at internalizing and developing Far Cry 2’s experimental design principles. It’s the opposite of a slight to say that the game remains one of the most daring titles ever made at blockbuster scale. “What I like about Far Cry 2 is that it tried to create a bunch of its own conventions,” Bissell says. “What’s interesting is how few of them, despite being elegant, interesting, and audacious, caught on in the wider design community of games. That is an achievement in and of itself.”

Not even the Far Cry series itself (which boasts in excess of 50 million unit sales since Far Cry 2) has run with many of the design ideas laid out by the second installment, beyond the open-world, first-person-shooter structure. Each of the four subsequent mainline entries, none of which Hocking worked on, has “sanded down” the spiky, subversive magic of Far Cry 2, says Walker, as if Ubisoft asked itself: “How do we make this our Modern Warfare? How do we make this our huge breakout first-person shooter?” It would do so by forgoing the disempowerment part of the power fantasy. To play a recent Far Cry is to engage with a clear upgrade path and enjoy supersoldier-esque powers, like the ability to tag enemies (thus basking in the knowledge of their location at all times). When you open the map, the game pauses, and when you set a waypoint, giant holographic arrows show up in the virtual environment, leaving practically zero chance of getting lost. Crucially, says Walker, the narrative has shifted from one where you play as a “morally questionable, bloodthirsty mercenary” to an “average Joe who’s gotten swept up in something bigger than themselves.”

For better and worse, the series has come a long way, and its current identity is best summed up by Sandra Warren, the new vice president and executive producer of the Far Cry brand (who also worked as lead animator on Far Cry 2). “We want you to go on holiday and basically throw away all the travel books you can think of, get out of the touristic landmarks, and discover the eeriest things a location has to offer,” she writes via email. “It will make you adventurous, uncomfortable, free, afraid at times. And no matter what, if you survive, when you come home, you will have absurd stories to tell your friends.”

In these terms, the Far Cry series has metamorphosed into precisely the “disaster tourism” that Hocking set out to avoid. In Far Cry 5 and 6, real-world issues like the rise of fundamentalism in the U.S. and revolution in Central America are mobilized in service of something that curves closer to conventional video game “fun.” Certainly, we’re a long way from games that show players “what a messy, horrible thing it is to kill a man,” and not just because the death animations in these games lack Far Cry 2’s macabre eye for detail.

Now, the franchise features snappy, feel-good gunplay of the Modern Warfare school, while its tongue-in-cheek storytelling is firmly rooted in the postmodern mode popularized by Grand Theft Auto. One can understand Ubisoft’s swing for a broader tone in its pursuit of the sales that make the economics of these hugely (and increasingly) expensive entertainments feasible. During his short stint at Ubisoft Montreal in the 2010s, Bissell caught a glimpse of the monumental labor, and thus the monumental stakes, of such productions. “It’s the only game studio I’ve ever been in that had a Starbucks in it,” he says. “It was so big, even then, it didn’t seem possible. I looked in the Assassin’s Creed room and there were 500 fucking people.” At that moment, Bissell saw the direction of blockbuster game development—the way it was becoming “potentially unmanageable.” It would only get more unwieldy: “What seemed like big teams at the time just became colossi of largeness.”

This is precisely the environment Hocking works within today. He lists some of the elements that make up a modern open-world blockbuster: collectibles, skill trees, inventory management, a small country’s worth of non-player characters to talk to, main quests, side quests, vehicles. “It’s just fucking enormous, and so to build on top of that, to progress that forward, you have to have templates that you build with,” he says. “It’s challenging for me because I’m a person who would prefer in an ideal world to always cut everything from full cloth. But that’s just not a reality of modern, triple-A game development. You don’t cut anything from whole cloth.”

Far Cry 2 was made just before this blockbuster horse truly bolted. It’s the product of 150 people rather than 1,500; it took three years to make rather than six; it was made on a single continent rather than many (and without any outsourced labor). Hocking is upfront about the kinds of games he enjoys making (those with “big reach and big budgets”), and the upcoming Assassin’s Creed: Codename Hexe (reportedly set in Central Europe during the 16th century) may be his highest-profile game to date. He has little choice, then, but to adapt to the new, supersized conditions of production while noting, with a little wistfulness, the passing of an era in which blockbusters could take “chances and risks … when you could really bring a high concept to a triple-A game.” The year or two surrounding Far Cry 2’s debut brought the releases of Assassin’s Creed, Grand Theft Auto IV, BioShock, Dead Space, and Mirror’s Edge, games of both considerable ambition and a system of production that was yet to spiral out of control. At the risk of indulging in regressive nostalgia, there’s perhaps a case for thinking about these years as something of a golden age for innovation in blockbuster video games.

On X, Hocking refers to himself as “usually the most cynical person in the room,” but there is no doubting his sincerity when talking about what he and his team achieved with Far Cry 2. “I sometimes lament privately in my darkest hours that it may be the best game I ever make in my life,” he says. “I’m very proud of it. I think it’s a very good game. I think it’s an important game.”

Perhaps more significant than even the game itself is the nature of the conversations it’s sparked, the back-and-forth between audience and creator that Hocking holds dear and that’s been the invisible lifeblood of the projects he has steered at Ubisoft for the best part of 20 years. “That’s what I need. I guess some people get their reward from having sold 50 million copies of something, and that’s great, but having only sold 50 million copies doesn’t mean anybody liked it. It doesn’t mean that the game changed anybody’s life or their perspective. It doesn’t mean that I’ve communicated.”

“Communication goes two ways,” he adds. “If it’s just broadcast—I make a thing and 50 million people play it—I can just fucking chuck my game into the sea and say, ‘Look, there it is.’ But it’s when it comes back to you, and you understand how people felt, and not just, ‘That game’s fucking wicked: 10 out of 10.’ If they can talk about what was important to them, what moved them, what changed their perspective about the world, this kind of conflict, these kinds of people, or this kind of play experience, that’s what matters. It’s the echo, right—the feeling that you’ve meaningfully contributed to someone’s experience. That’s why we make things.”

Lewis Gordon is a writer and journalist living in Glasgow who contributes to outlets including The Verge, Wired, and Vulture.

This is 52: Neko Case Responds to The Oldster Magazine Questionnaire

From the time I was 10, I’ve been obsessed with what it means to grow older. I’m curious about what it means to others, of all ages, and so I invite them to take “The Oldster Magazine Questionnaire.”
Here, musician, music producer, writer, and newsletter writer Neko Case responds. -Sari Botton


Neko Case. “A photo on tour below my birth year.” Photo by Mike Bulington
Neko Case’s bio, in her own words: “American Musician, Music Producer, Writer, Ding-Dong” (Editor’s note: You guys, she’s a rock star.)

How old are you? 

52.

Is there another age you associate with yourself in your mind? If so, what is it? And why, do you think?  

19. I think it’s because I always have the feeling like I’m at the brink, or the crossroads of something happening. Like I’m that age that’s about to go off to college or into the military and anything could happen. 

Do you feel old for your age? Young for your age? Just right? Are you in step with your peers?

I feel very young for my age. I’m in step with older and younger people but I feel like I have a real shortage of peers other than my bandmates. Likely because I live in a rural place and my job doesn’t allow for a lot of socializing. (Very unlike the myth of being a musician. Haha!) It’s a TON of work so there’s not a lot of extra time lying around. 

What do you like about being your age?

Having the confidence to say “no,” and really meaning it without guilt or trepidation. (I even experience a bit of thrill now and again. Haha!)

What is difficult about being your age?

I don’t recognize myself sometimes. I think that’s to do with perimenopause, which is only acknowledged by about five people. I feel like a bit of a werewolf. 

I’m not experiencing the age-based “feebleness” I mistakenly associated with being my age back when I was 25. That was a bit of an age-biased myth I believed.

What is surprising about being your age, or different from what you expected, based on what you were told?

I’m a stronger singer at 52 than I have ever been. I think it’s because I have paid attention to my body and know what I need to do to keep it going. Thank goodness! I shout it to everyone I know who plays music. You don’t have to retire at 40. Granted, I do realize I am privileged to be in good health, and do not hold myself above other people for it. Anything could happen at any time.

What has aging given you? Taken away from you?

I have come to terms with my very poor education in regard to money and staying afloat. I am ill-equipped and I have gone through so many catastrophes I’m now very interested in learning how to go slower and set myself up. I have a very bad relationship with money because I don’t respect it. As in, I think it’s corrupting. I am ready to find a way to “respect” it in a way that does not compromise me or my values, but I will hopefully know where my next meal is coming from? Does that make sense? 

Being 52 has really shone a spotlight on how much of my life was wasted in the tractor beam of the male gaze. I mourn that loss of time and experience and freedom like mourning the passing of a loved one. 

How has getting older affected your sense of yourself, or your identity?

I care less about my appearance, which is something I pretended to do for so long! And I find my physicality much funnier. Human bodies have always been funny to me in an affectionate way. The things we expect of our bodies are ridiculous. Now that the curtain is kind of falling away it gets funnier and I love myself more, like my body is an older dog who is my favorite companion. I treat my body with much more affection.

What are some age-related milestones you are looking forward to? Or ones you “missed,” and might try to reach later, off-schedule, according to our culture and its expectations? 

I didn’t get to have a 50th birthday party. I’m not one for birthdays but that would be a nice party to have. 50 years on this Earth is really a gift! There would be cake and pie!

What has been your favorite age so far, and why? Would you go back to this age if you could?

35 was really good. I was in good physical shape and doing all kinds of things. I wouldn’t go back though because 35 was a small island between really hard lessons and pain I’d just have to go through all over again. I definitely would not end up where I am now which would be a tragedy as I have such a great partner and a daughter whom I love so very much. I ended up with a real family, which I would not trade for anything. 

“At work.”

Is there someone who is older than you, who makes growing older inspiring to you? Who is your aging idol and why?

The women in my neck of the woods in eastern Vermont are professional agers. They do it with great style and can do more chin-ups than I will ever be able to. Well, at least until I’m 70… I have something to work toward! I don’t have a particular aging idol but I have noticed that people with animals or a gardening habit often keep their activity higher and that is just so good for you and your body and your dog or horse or llama. People are meant to interact with the world, we aren’t just some blessed species for whom all else are secondary. We are healthiest when we are a part of the whole. We need and are needed by the earth, this reciprocity is crucial. 

Also: Angela Bassett!

(Editor’s note: I stumbled upon this conversation about songwriting between Rosanne Cash and Neko Case at Rodney Crowell's "It Starts With a Song" Songwriting Camp on August 27, 2022, in Nashville, TN, and I had to share it with you:)

What aging-related adjustments have you recently made, style-wise, beauty-wise, health-wise?

I dye half of my hair. I like the silver at the front, but I missed the red so I compromised. I don’t really know where I’m at style-wise. I feel like I have never really “found it.” I have always been a bit liquid in that regard. It would be a hell of a lot easier if I did have a particular style! It took me 42 years to figure out I hated wearing dresses! I have always eaten pretty healthy, but staying off sugar and bread seem to do me the most good. I have a weaker resolve since covid though, so I have to build it back up again. 

“At work.”

What’s an aging-related adjustment you refuse to make, and why?

I refuse to say I’m “old,” even if I’m just joking around. I have not earned that. It is a term of respect used as a slight and I want to use it sparingly and with the gravity it deserves. I am not yet an elder, but I sure hope to be a good one one day. 

What’s your philosophy on celebrating birthdays as an adult? How do you celebrate yours?

I don’t really care too much about birthdays. I’m more of a celebrate-good-things-as-they-come person. But as I said before, I still plan to have a 50th birthday party. Maybe when I’m 60 I’ll have time. Haha! 

