For those not familiar with the term, “dogfooding” is the idea that when you produce a product you should “eat your own dogfood” and use the product you make. It appears to have originated in software development, and is a particularly common exhortation in the video game industry. Companies frequently ask their employees to play their games regularly, even when not on the clock, and a lot of companies with live games make significant play of that game a major factor in hiring decisions (some go so far as to tell applicants that don’t play the game to not even apply).

I had recently wondered whether this was only common in the world of core games: I asked friends whether they thought that developers of Barbie games, for example, were expected to dogfood. When you provide the universe with a question like that, it seeks to provide an answer, and within a week I’d met someone who had been a designer on a Barbie game. Yes, they were expected to dogfood. The entire team was exhorted to “be Barbie” in all of their design decisions.

The idea seems to be pretty obviously good, on the surface, particularly in the applications-development world in which it originated. If developers on Microsoft Word are secretly using WordPerfect at home, that says something pretty problematic about the utility of the software. But I’m not convinced it’s as universally laudable a goal in game development. Before I go into my reservations, I’ll list a few areas where it is a great idea, if used for a particular purpose:

  • Decision makers (producers, leads, and anyone else that can allocate development resources) should play the game at least casually, particularly during betas and major feature pushes. They should attempt to see the game as a player would, and also try out anything that’s suggested to them internally as worthy of their attention. They’re not necessarily playing the game to get ideas for things that should change, so much as to experience pain points that their employees have been trying to get resources to fix. Often, bugs and feature requests that seem low priority when you’re not playing the game escalate precipitously when you are. Playing the game can serve to align the expectations of players, designers, and decision makers.
  • Anyone at the company should play parts of the game relevant to a feature/content he or she is working on. When a task is on your plate, it’s often been flensed of context to make it easier to implement. But that context is still important in how the addition is going to be perceived in the game, and may control how you configure the feature for future expansion. For example, if you’re adding a new creature and thinking about whether to give it a bunch of knockback abilities, it might be important to know how common pits and cliffs are in the areas where that creature will appear.
  • Everyone should obviously play their content to make sure it’s free of obvious bugs before turning it over to QA to look for the subtle ones.
  • When players are complaining about something, it’s worth trying to get as many internal eyes on it as possible to see for sure whether the players are right (or are just squeaky wheels; I’ll talk more about this in a later post).

But, with all of those in mind, I’m not so certain about the general idea of dogfooding, insofar as it implies you should be playing your game all the time, even with no specific ends in mind. There are a few really obvious reasons, mostly related to how video games are not the same as software applications:

  • Games don’t have a clear competitor in most cases, and definitely don’t have a clear use case. It’s embarrassing if the MS Word developer prefers WordPerfect, and it’s strange if he does all of his writing in Wordpad, but, meanwhile, it’s less embarrassing if the Smite designer plays a lot of League of Legends (he may prefer the setting/fiction) and not even all that odd if he doesn’t feel up to playing MOBAs in his spare time.
  • Games can require a lot of time, and are ostensibly entertainment. Particularly when someone is off the clock, it’s a little peculiar to expect them to relax by playing the game they’ve been working on for at least 40 hours a week (usually lots more). They probably do not ask the NCIS cast and crew to spend an additional dozen-plus hours a week watching reruns of the show as their leisure activity. If you produced a product of a type that your employees were going to use anyway, you could expect dogfooding. But in the games industry it tends to feel a lot like assigning your employees lots of extra work for which they are not being paid. (If you expect them to play the game during a set number of hours in the week for which they are being paid, great! That’s actually a fine way to do dogfooding.)
  • Game developers are not typical users, and it can be dangerous for them to play the game pretending that they are. This last point is obviously my central one, and I’ll unpack it more for the rest of the article.

I know very few game developers that play games like a normal user, particularly on games they’re working on. Everyone that does the job for a living keeps up some level of running commentary, while playing, about things they would have done differently, and this can be very difficult to turn off in order to just enjoy the game. Artists are constantly critiquing the game’s art, people who work on missions and story are always second-guessing the unfolding of plots, those that work on systems are often offended by “choices” that are just exercises in finding the one correct option, and don’t even get me started on UI designers confronted with someone else’s UI.

And this is just the kind of thing that happens to you when you’ve been doing it as a job; just like people that work on films watch them much differently than normal viewers, the process of working on games changes your priorities and makes you hyperaware of the craft that went into the art. But on top of that, many developers are unusual even before adding in the years of behind-the-scenes knowledge. These are people that were so passionate about games that they decided to make them a career instead of the many more lucrative things they could be doing. They are very likely to have stronger opinions about things than a normal player.

There’s some good argument that you should make a game for one person, so at least you know that someone will find it fun (and hopefully more people like that person will also find it fun). But, importantly, that one person shouldn’t always (possibly ever) be you. If games were only made for game developers, there would be a lot of audiences (largely composed of the world’s less obsessive people) left without games. And when your company forces you to play your own game, ostensibly for fun, it provides a subtle pressure for you to figure out how to make the game fun for you (to preserve your own sanity), and you could easily lose sight of the fact that things you change to improve your own enjoyment might make it less entertaining for your actual audience.

But what about the companies with live games that only hire existing fans? Surely that solves the issue, since you’re getting an employee that is already part of your audience? Not necessarily. Remember, developers are obsessively passionate. Even if they love your game, they probably love it for a different reason than your average player. If their fannishness is manageable, expect some ramp up time where you have to explain to them why parts of the game they dislike and never use are worthy of development time. Worst case, they could quietly and unintentionally work to bend the game to favor the niche they were part of, reducing the fun for the majority of your players.

Ultimately, my worry about dogfooding in the game industry is that it sets up an expectation that you should only work on games you find fun, and that you should work to make your game as fun as possible for you. But games, particularly large games, are targeted at an audience wider than yourself, and I feel like a fundamental skill as a developer is being able to make things that aren’t for you, but that you still understand how to make for someone else. I’d much rather work with someone that can make a good case for doing something based on metrics, comparison of multiple sources of user feedback, and established best practices than someone who plays the game regularly and is really passionate about his or her own experience.

There are lots of good reasons to play your own game. And if you’re lucky, it’s even a game that you, yourself, find fun to play. But there shouldn’t be shame if it isn’t, and narrowing down your staff as policy to those that love to play the game or can fake loving to play the game has a lot of potential pitfalls. As an art, you should absolutely make games you’re passionate about and love. But, as a profession, I object to the stigma that if you don’t love what you’re making you can’t work on it. Sometimes a game job is just a paycheck, but that doesn’t mean you won’t use your skills to make it the best game for someone else that you can.