Always On: Off With Its Head

You know what drives me nuts? Trips to the local drugstore to pick up blades for my razor. Invariably, these blades are locked behind a flimsy-looking piece of plexiglass that can only be opened by some pimply-faced teenager with a set of keys. Usually there’s a sign nearby that reads something along the lines of “This product has been secured for your convenience.”

The sign on that case is to shopping what always on requirements are to single player video gaming: a steaming load of horseshit. Those overpriced razor blades aren’t locked up to help the consumer; they’re locked up because jerks like to steal them, and that hurts the company’s bottom line. Claiming otherwise–that said impenetrable lock guarded by a difficult-to-locate and usually less-than-friendly employee is of benefit to my razor blade purchasing experience–is disingenuous at best and a dirty lie at worst. It is not for my convenience; it wastes my time and pisses me off, ruining whatever tiny savings might trickle down to my wallet from reduced shoplifting losses. It’s too much of an imposition on the customer, and it’s driven many of us to shop elsewhere instead.

Always on requirements for single player games work the same way. Any potential tangible benefit to the customer is quickly rendered moot by rampant inconvenience. Internet connectivity is reasonably reliable in urban areas, but it becomes sketchier the further one lives from a big city–at least here in the States. Reliable connection or not, many recent releases have proven that video game developers and publishers can’t be trusted to build and maintain reliable servers. The launches of SimCity and Defiance were plagued with lag, long wait times, and enough patches to build a whole new game. Even Blizzard–a company that really ought to know how to deal with server connectivity, given its history with World of Warcraft–couldn’t keep its servers working properly for the release of Diablo 3.

The cloud may be the future. It may also be the next bubble that sends us screaming back to local storage and processing while tech pundits rage against those who thought a return to the mainframe days of yore was a good idea. Regardless of the cloud’s ultimate fate, there’s one cardinal rule to remember about every hip new technology: applying it to everything you possibly can willy-nilly is stupid. Technology is a tool like anything else, and different tools are good for different jobs. If you’ve got a drill and a hammer, you wouldn’t use a drill to pound in a nail just because the marketing department is drooling all over itself about how much cooler and newer the drill is. You’d just use the fucking hammer. Likewise, there’s no need to force cloud saves given how reliable local storage technology has become, especially in a situation where a gamer is never going to use another device to access that save. There’s no need for server-side processing of a single player game given the beastly pieces of equipment that sit atop most of our desks. And there’s no need to constantly check if a game is legit when simpler tools like one-time registration codes will do the job just as well at a fraction of the price and annoyance to the consumer.

Here’s what it all comes down to: annoying the people who want to give you their money or who have already happily paid for your product is one of the best ways to sabotage your own business. The cloud’s great and all, but if I want to play a single player game by myself, I should be able to do so wherever and whenever I want. Removing functionality is removing functionality regardless of what pretty language you dream up to try to make it palatable to the masses. Excising a feature that’s been essential to in-home gaming since the medium’s inception–one that hasn’t proven obsolete or broken–is a dumb move that’s just going to piss off a large percentage of the customer base. Cloud features in and of themselves are fine where they make sense and provide an actual benefit to the consumer, but a single player game should never be rendered unplayable due to connection issues. If a single player game’s cloud services break, gamers should still be able to play it without those services.

Unfortunately, I’m not convinced the people who make decisions about the direction of the gaming industry understand or care about any of this. All they see is the money they think they’re losing to piracy and the used games market. There’s a gaping hole in that logic, though: what makes industry bigwigs think that each and every game pirated or bought used would’ve translated to a full-priced sale had those two cheaper options not been available? It’s impossible to prove that Joe Blow who downloaded a crack would’ve spent $60 to buy a game new if said crack weren’t available. If Joe Blow doesn’t have $60 to give you, he doesn’t have $60 to give you. If Joe Blow has $60 but doesn’t think your game is worth it, he isn’t going to give you $60 for it. None of this excuses Joe Blow if he chooses the piracy route, but it guts the “we need always on DRM to protect our sales from the pirates!” argument important people with important titles seem to love so much. One pirated copy does not always equal one lost sale.

There’s nothing wrong with protecting your investment, but there are right ways and wrong ways to do it. Lying to your customers about what you’re doing and why you’re doing it is wrong. Inconveniencing your install base is stupid. Implying that anyone who doesn’t accept your ridiculous DRM is ignorant is childish and short-sighted (I’m looking at you, Adam Orth). If developers and publishers want to include copy protection in their games, they should find a way to do it that’s as invisible to the paying customer as possible. Anything less is just uncivilized.
