Part 3: Conclusion
Welcome back! So far we’ve established that the auteur theory posits that a single visionary drives the artistic success of a work and takes the credit (that is, answers) for it. We’ve also decided that Cage is not the auteur in shining armor the gaming industry’s been waiting for. Which raises the question: if not Cage, then who?
We might first look at indie developers, whose small studios allow them to become better-known names in the industry. The person who immediately comes to mind for me is Phil Fish, one of the developers of Fez and the man you either know from a) Indie Game: The Movie, b) that Innuendo Studios short, or c) haters on the internet.
Phil Fish is not the only person who worked on the game Fez. Shawn McGrath brought the initial idea to the table but left the project because of a conflict of vision, after which Renaud Bédard came on as a programmer for the game. Yet only Fish gained a notable level of celebrity from the project itself, the documentary coverage, and his own unflinching outspokenness.
If you’re readying your typing fingers to let me know that Fish is self-centered, overly critical, or a huge douche, never fear: I am well aware of his reputation. I would argue, however, that the likability of a developer should not factor into whether or not they play the auteur role. Instead we might look at how developer-player interaction, and more specifically gamers’ perceived ownership of the things they play, affects the development and artistic direction of games.
Watching a film is a mostly passive experience, an opportunity to ‘veg out,’ even when we’re thinking critically about the costuming or the actors’ performances. When you play a video game, on the other hand, even the most repetitive tasks you’ve done a thousand times before require some level of engagement (as anyone who has tried to eat dinner while playing can attest).
What sets gaming apart as a creative medium is its interactivity. Cage’s ideal of a story-based, innovative work can apply to anything from a song to a poem to a comic, but in order for something to be considered a game most agree it should involve some level of player agency, which comes with a necessary feeling of connection to and power over the story.
Consequently, when a developer’s vision or persona deviates from player expectations, players react differently than movie-goers or music fans might. In fact, sometimes that feeling of ownership fosters entitlement that can manifest in not just hurtful but also dangerous ways: for better or for worse, Phil Fish is no longer in the industry.
And sometimes, as seems to be the case with Half-Life 3, developers’ fear of player reactions prevents pieces of art from ever making it to market. (For the record, I don’t think that means we should stop being critical of games or lower our standards, although we may need to rethink the way we convey those criticisms to those who create and produce the games.)
So is expecting a single person to answer for the entirety of a game, especially one much larger than Fez like Half-Life or Mass Effect, beneficial or even reasonable? Games are traditionally made by studios: teams of people working together on project after project. Films are also made by crews, but film crews typically go their separate ways after a release, while studio teams stay together.
Yet studios themselves grow and change, shifting workers from one project to another, sometimes giving away projects altogether; Call of Duty has moved studios three times since its inception, while 13 different studios have contributed content to Halo games over the past 15 years. While many are known for certain types of games, few have a singular style as distinct as the ‘auteurs’ of old. And I wouldn’t say that’s a bad thing.
Cage has said in a lecture for the British Academy of Film and Television Arts that he looks forward to the day when game designers can use camera algorithms to mimic the cinematic styles of Scorsese or Tarantino, capturing emotion in a way that he feels is superior to the traditional perspectives of game cameras. But simply imitating the shots of a famous director does not create a meaningful story.
Beyond that, game developers do not have sole or even primary control of the camera. That tends to belong to the player in most games, and players can shape the story in other ways too, from character creation to dialogue choice to world building in games like Minecraft. Yet players certainly wouldn’t be given credit for the music in the Elder Scrolls games or the art direction of Life is Strange.
Which makes me think that we’re asking the wrong questions about games. With roots in interactive fiction, tabletop gaming, and many, many other forms of media, video games resist the categorization that film criticism offers. They do not need auteurs to achieve artistry, and our efforts to transplant film theory onto game analysis are neither as simple nor as productive as many, Cage included, seem to think.
As a form of art, games are academically and critically under-examined, but that’s part of what makes them so exciting. The industry is rapidly changing, and the potential for creativity among players, actors, designers, programmers, and everyone else involved is huge. And as games continue to evolve, this potential for growth and innovation extends to game criticism, journalism, and analysis.
Thanks for reading! Where do you think gaming, and game analysis, is headed? Do you believe auteur theory holds up? Let me know what you think in the comments.