When GTA V came out, many of my friends raved about it, saying it was one of the best gaming experiences they’d ever had and that I had to play it. Eventually I got round to playing it, and the first thing I noticed the moment I picked up the controller was “Damn, this is laggier than a 300ms ping in a match of CS:GO”, and it’s not the first time I’ve had that reaction either. I’m used to playing BF3 at well over 100FPS on my PC (even if I only see it at 60Hz) and it runs butter smooth, but the Xbox version was almost unplayable because of the frame rate and responsiveness. As someone who games on the PC but spent many years playing on consoles with less trouble, I have to wonder what on Earth is going on in the game development world.
Firstly, I do see the appeal of consoles to those who own them: a nice box that you can just pick up, plug in and play, with an even playing field thanks to everyone using a controller. But just because they play on a console, should they have to settle for 30FPS and 900P? We don’t on the PC. Ubisoft seems to think so, though, and now Bethesda does too. It was recently announced that neither The Evil Within nor Assassin’s Creed Unity would run at 1080P and 60FPS on consoles, and indeed upon their release they do not. Assassin’s Creed Unity runs at just 900P and 30FPS, whilst The Evil Within runs at 30FPS with a 2.35:1 aspect ratio (nothing has been said about Unity on PC, but Bethesda will allow the aspect ratio and FPS to be changed via console commands). That isn’t what bothered me most, though; it was what Nicolas Guérin (the World Level Designer on Assassin’s Creed Unity) said in an interview with TechRadar:
“At Ubisoft for a long time we wanted to push 60 fps. I don’t think it was a good idea because you don’t gain that much from 60 fps and it doesn’t look like the real thing. It’s a bit like The Hobbit movie, it looked really weird.”
“And in other games it’s the same – like the Ratchet and Clank series [where it was dropped]. So I think collectively in the video game industry we’re dropping that standard because it’s hard to achieve, it’s twice as hard as 30fps, and it’s not really that great in terms of rendering quality of the picture and the image.”
And the Creative Director (Alex Amancio) said that “30 was our goal, it feels more cinematic”, and, most mind-blowing of all, that it “feels better for people when it’s at that 30 fps.”
The problem with these statements is that they’re very contradictory. They say they wanted to push 60FPS, so they know it’s something we want, but then they claim that 30FPS is better and that we like it more. If we really did like 30FPS more, 60FPS would have been dropped years ago, GPU reviews wouldn’t be checking whether games can stay above a 60FPS baseline, and nobody would complain when a game was released at 30FPS.
The main argument I see for 30FPS is that it’s closer to the movies, and this, quite frankly, is a terrible argument. A film is recorded: it’s mastered at that speed, every frame carries the natural motion blur of the camera, and it never deviates from that speed either. You also don’t interact with a film, which in my opinion is what makes most of the difference. We saw at the end of the last generation of consoles that they couldn’t keep a steady 30FPS, let alone 60, and variation in frame rate is something you notice far more at a lower frame rate. If you drop 5FPS at 30FPS, you’ve lost a full sixth of your frame rate and it’s very noticeable (in a lot of cases it will feel like the game has actually slowed down), whilst at 60FPS you’ve only lost a twelfth, which is much less noticeable. That gives you a lot more breathing room in particularly action-packed areas, and when your frame rate does vary it doesn’t have as much of an impact on the immersion. The other reason we want our games at 60FPS is that anything lower simply isn’t smooth: the image becomes jerky and fast movements don’t transition smoothly across the screen. It’s just a nicer, more immersive experience playing at a higher FPS. If you don’t believe me, cap your frame rate at 30FPS after playing at 60 for a while; the game will feel very different.
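To put some rough numbers on that sixth-versus-twelfth point (this is just my own back-of-the-envelope arithmetic, not anything from Ubisoft or any developer), here’s a tiny Python sketch that converts those frame rates into frame times:

```python
# Back-of-the-envelope sketch (my numbers, nothing official): what a 5FPS drop
# costs at 30FPS versus 60FPS, both as a share of the frame rate and as extra
# milliseconds added to every frame.

def frame_time_ms(fps):
    """How long each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

for base, dropped in [(30, 25), (60, 55)]:
    lost_share = (base - dropped) / base                      # fraction of frame rate lost
    extra_ms = frame_time_ms(dropped) - frame_time_ms(base)   # added delay per frame
    print(f"{base}FPS -> {dropped}FPS: lose {lost_share:.0%} of the frame rate, "
          f"each frame takes {extra_ms:.1f}ms longer "
          f"({frame_time_ms(base):.1f}ms -> {frame_time_ms(dropped):.1f}ms)")
```

Dropping from 30 to 25FPS adds roughly 6.7ms to every single frame (33.3ms up to 40ms), whereas dropping from 60 to 55FPS adds only about 1.5ms – which is exactly why the former feels like the whole game has slowed down while the latter mostly goes unnoticed.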
I also noticed that Ubisoft can’t do 60FPS because it’s “twice as hard as 30FPS”. I see the logic, I really do: you can choose graphics or FPS, and while a high FPS makes your game play better, is that going to make any difference when you’re advertising it? No. Graphics, on the other hand, will sell your game, because they let you show off screenshots, in-game trailers, rendered trailers and so on. But that doesn’t make them the right choice; it’s short-sighted. Sure, you’ll make more cash up front, but when word gets out that your 1080P 60FPS game (rather than the lower-resolution, lower-frame-rate one you could have made) runs smooth as butter, it’ll start selling like hotcakes and you’ll find the trade-off was well worth it, because right now this is a big topic of conversation. Reduce your texture quality, turn down your AA a bit and bump up the FPS. Just look at PC gamers: they will almost always turn down the graphics in favour of a smoother experience.
I guess the overall point I’m trying to make here is that we want our games to play well more than we want them to look good, and one of the key parts of that, especially on the PC, is a high and consistent frame rate (remember the Oculus Rift is going to need high frame rates to be smooth on PC, and its rivals will need the same on their machines). But what you think is what’s important here: could you live with gaming at 30FPS, or is 60FPS an absolute must for you? And if developers begin to run more of their games at 30FPS, what do you think the effect will be on the PC ports of future games? Let us know.