Post a Comment On: C0DE517E

"Tell the internet that you're not a moron..."

21 Comments

Anonymous Anonymous said...

For a game that already provided 60Hz feedback to the user, I'm frankly quite surprised that so many people thought internally that halving the frequency of visual feedback was a good idea. It's a fighting game, for goodness' sake. Visual feedback is crucial for reaction/timing for the player.

March 7, 2011 at 10:02 PM

Blogger DEADC0DE said...

Read what I wrote. If you still think you know better, read it again.

P.S. the visual feedback is measured by its latency from the input, not the rendering frequency. I can render at 2000000fps, then have a 2 seconds buffer between the input and gameplay, and still the game will have a 2 seconds lag, quite obviously.

March 7, 2011 at 11:56 PM

Anonymous FreakyZoid said...

That's okay, I assume the internet is a moron (especially the bits that are vocal about changes in games, as you mention), so we're even on that score.

March 8, 2011 at 1:23 AM

Blogger Poidokoff said...

Almost the same thing as with the C++ discussion. People come in swinging, mostly based on their emotional knee-jerk reactions, without ever stopping to consider the ramifications of their opinions or bothering to listen to differing views. I think it's got everything to do with a cognitive bias called Illusory Superiority.

Personally I don't care about the FPS as long as the gameplay feels fluid and fun (and I'm not viewing a slideshow). Fight Night happens to be a series that renewed my faith in that genre of games. It looks and feels awesome and is fun to play. Kudos to the makers.

March 8, 2011 at 5:40 AM

Anonymous MJP said...

When it comes to the denizens of the Internet who like to talk about graphics, they tend to latch onto any numbers they get. Because everyone knows that if some number is higher then it's better, right?

Framerate obviously gets this treatment, and for the past few years it's been resolution as well. If any console game comes in below the 1280x720 mark, it's automatically worse looking than any game that hits that res and the programmers who work on it are all incompetent and/or lazy.

March 8, 2011 at 10:18 AM

Blogger Unknown said...

I'm sorry that you've been demoralized by the internet.

This is quite the interesting outcome of user testing your game at 30hz.

It correlates with many complaints around the tv/movie industry about the "soap opera effect", in which the between-frame interpolation of 30hz (video tape framerate, which soap operas were recorded at, hence the name) and 24hz (movie frame rate) differ so much that people complain about it.

I wonder if this is due to the fact that either 1) viewers are used to the interpolation of 24hz movies and subconsciously expect that to be the norm, or 2) movies are conveniently at a 'sweet spot' for human vision, in that they align well with our natural rate of processing images.

I would hazard a guess that if the framerate was dropped to 24hz, people would like it just a hair better than the 30hz one. Closer to film is the industry trend, right?

March 8, 2011 at 10:23 AM

Blogger Mads Andreas Elvheim said...

I respect you for going with the decisions from your playtesters. If they favored the change, of course you should stick to it. Maybe it works fine for your particular game. But for games in general, I disagree with the claim that lower framerate does not matter.

I think it matters because the visual framerate decides how fast you can react. This is more important in some games than others. It doesn't matter if your game logic and user input can detect 1000 button presses per second if the game renders at 30 fps. The player is limited by the slowest part of the system.
It can be network latency, direct user input, render frequency or the frequency of the game loop. If a part of the system is slower than the others, it can not be remedied by the other systems. In my opinion, this is why it makes the most sense to render at a higher or equal frequency compared to the game loop or input polling. Everything else is just lost potential.

March 8, 2011 at 4:12 PM

Blogger DEADC0DE said...

Mads: again and again and again. Frequency != Latency. FREQUENCY IS NOT LATENCY.

So yes in general FPS are meaningless from a reactiveness standpoint.

Of course the minimum latency would be achieved in a 60fps game that goes from the pad input to the screen with no buffer, but in general the fact a game is at 60 does not mean it will have a lower latency than a game that goes at 30.

I'll say it again: _FREQUENCY_ _IS_ _NOT_ _LATENCY_
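To illustrate the distinction with a toy model (my own numbers, purely hypothetical, not from any shipped game): end-to-end latency is the frame time multiplied by how many frames sit between input sampling and display, so a heavily buffered 60fps pipeline can easily lag behind an unbuffered 30fps one.

```python
# Toy model (illustrative numbers only): latency is the time from input
# sampling to the frame that shows that input, not 1/fps.
def end_to_end_latency_ms(frame_ms, buffered_frames):
    # input is sampled at the start of a frame; the result appears after
    # that frame is rendered plus any frames queued in between
    return frame_ms * (1 + buffered_frames)

# A 60fps game that buffers 3 frames between input and display...
lat_60_buffered = end_to_end_latency_ms(1000 / 60, 3)  # ~66.7 ms
# ...is less responsive than a 30fps game with no extra buffering.
lat_30_tight = end_to_end_latency_ms(1000 / 30, 0)     # ~33.3 ms

print(lat_60_buffered > lat_30_tight)  # True
```

The buffer count, not the frequency, dominates here; that is the whole point of the frequency-vs-latency distinction.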

March 8, 2011 at 6:17 PM

Blogger DEADC0DE said...

NickC: Demoralized by the internet? I? No, that won't happen.

The "soap opera effect", I wonder if that's only a problem with the cheap temporal resampling; I wonder if the same would be noticeable using optical flow... Anyway it's a bit of a different issue because there you are resampling...

Also the outcome of our testing is not soooooo surprising: http://www.insomniacgames.com/blogcast/blog/mike_acton/1503082

March 8, 2011 at 6:20 PM

Comment deleted

This comment has been removed by the author.

March 8, 2011 at 7:45 PM

Blogger Mads Andreas Elvheim said...

Honest mistake, I didn't mean network latency obviously, but the number of updates sent per second over the network.

I'm very aware that the internal frequency a game runs at doesn't directly reflect the latency in the system. For my example, I obviously disregarded that the system does any buffering.

So again to be clear, assuming that there is little to no buffering, the overall system is limited to the subsystem with the lowest frequency. Do we still disagree?

March 8, 2011 at 7:58 PM

Blogger DEADC0DE said...

Mads: if there is no buffering other than the gpu ring buffer, we agree, 60 will be more responsive. Now go and try to find a game that runs at 60 with no buffers. I dare to say you never shipped an AAA game this gen...

March 8, 2011 at 8:08 PM

Anonymous Anonymous said...

But still, isn't 1/fps the lower bound of latency?

Consider this sequence of events:
t=0ms a frame is rendered
t=5ms the physics/game/whatever engine spits out an event
At 60fps, the user is notified at about t=16ms, at 30fps the user is notified at about t=33ms. Isn't this worse?

Or are the numbers such that computation takes so much longer than rendering as to make this argument irrelevant?
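The arithmetic in the question can be checked with a quick sketch (a toy model that ignores all buffering; the t=5ms event is the question's own example):

```python
import math

def display_time_ms(event_ms, frame_ms):
    # an event becomes visible at the first frame boundary after it occurs
    return math.ceil(event_ms / frame_ms) * frame_ms

# event produced at t=5ms, as in the example above
print(round(display_time_ms(5, 1000 / 60), 1))  # 16.7 (60fps)
print(round(display_time_ms(5, 1000 / 30), 1))  # 33.3 (30fps)
```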

I am not a game developer, so I don't know the orders of magnitude involved (especially, I am wondering what the reaction time of the player is).

To clarify: I am not asking to question your decision in the development of your game, I am genuinely curious of what the numbers are for a modern game on modern hardware.

March 9, 2011 at 8:20 AM

Blogger DEADC0DE said...

1/fps is indeed the lower bound. But knowing the lower bound is meaningless, really. Do games in practice achieve their lower bounds? Nah, it's left as an exercise to the reader to see why we could and usually do have a few frames of latency (hints: controller inputs and grammars; why is the gpu always at least one frame behind the cpu? Why does it need to buffer?)
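One way to see where the extra frames come from (an illustrative model, not a description of any specific engine): the CPU typically prepares frame N while the GPU renders frame N-1, and the swap chain may queue one more, so the pipeline is a few frames deep before gameplay-side buffering even enters the picture.

```python
# Illustrative pipeline depth (hypothetical counts, not any real engine):
# 1 frame to simulate/build the frame, plus CPU->GPU buffering, plus
# frames queued for presentation in the swap chain.
def frames_of_latency(cpu_gpu_buffered=1, swap_chain_buffered=1):
    return 1 + cpu_gpu_buffered + swap_chain_buffered

frame_ms = 1000 / 30
print(frames_of_latency())                       # 3 frames deep
print(round(frames_of_latency() * frame_ms, 1))  # 100.0 ms at 30fps
```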

March 9, 2011 at 8:50 AM

Anonymous Anonymous said...

So, why does the game engine run at 60 Hz? (And the physics at 120 Hz?)

Who cares if you determined the position of the ball at instant 1,2,3 and 4 if what you can show is just instant 4?

Am I to assume that you use the extra points so that the player's actions can be applied between frames? Isn't it a bit unfair to the player that (s)he has to mentally extrapolate the situation of the next frame without knowing what other events could have happened in the meantime? Or is this considered part of the game?

March 9, 2011 at 11:42 AM

Blogger DEADC0DE said...

Anon: Because that provides a lower latency, of course!

Now there are some technical details that I don't want to talk about (and it's not only laziness but also a matter of non-disclosure) but your reasoning is silly.

It's not that you show things at "instant" 4, as there is no "instant" 4. There is input, then gameplay, then rendering, and you want to minimize the lag. The input, let's say, might get your pad state at time 0.1, the game can compute the output at time 0.2, the render can finish at time 0.5, and you get a 0.5-0.1 = 0.4 latency.

If the "game" part runs slower, you add more latency, right? Plus you can do other things; for example you might want to poll the input at 10000hz to detect some very slight movements and, dunno, average or recognize a move or whatever. Then the game can run at 100hz because it wants to be very accurate at updating objects in small steps and, dunno, precisely recognize collisions and physics response. Then the render can go at 30fps because that's what your game needs: you tested it and it looks and plays great.
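Using the hypothetical rates in the comment above (10000hz input, 100hz gameplay, 30fps render), a quick sketch shows how many ticks of each faster subsystem fall inside one rendered frame:

```python
# Rates taken from the hypothetical example above; any real game would
# pick its own numbers.
INPUT_HZ, GAME_HZ, RENDER_HZ = 10000, 100, 30

def ticks_per_render_frame():
    render_dt = 1.0 / RENDER_HZ  # one rendered frame, in seconds
    return {
        "input_polls": round(render_dt * INPUT_HZ),  # samples to average/recognize
        "game_steps": round(render_dt * GAME_HZ),    # fixed simulation steps
    }

print(ticks_per_render_frame())  # {'input_polls': 333, 'game_steps': 3}
```

Each subsystem runs at the rate its job needs; the render rate only decides how often the accumulated results become visible.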

Hope this helps.

March 9, 2011 at 1:36 PM

Anonymous FreakyZoid said...

" Isn't it a bit unfair to the player that (s)he has to mentally extrapolate the situation of the next frame without knowing that other events could have happened in the mean time?"

Not really, people do it all the time (imagine trying to catch a thrown ball, for example).

Based on your knowledge of the system you can make very accurate educated guesses as to what is likely to happen in the next half a second.

March 10, 2011 at 4:02 AM

Anonymous Biagio said...

Quite interesting point.

I'm wondering how the game engine design is affected by latency requirements, in particular multi-threading issues.

March 14, 2011 at 10:48 AM

Blogger DEADC0DE said...

Biagio: it can and it should be.

March 14, 2011 at 2:44 PM

Anonymous Ashe said...

"Of course nothing of this is true"

it IS true, no doubt about that - not even arguing.

30 fps is 100% okay for me, but you can't deny 60 fps is an improvement over 30 fps just because thousands of people don't think so.

I agree other parts are more important (input lag, physics), but there's just no way going from 60 to 30 isn't a downgrade.


another topic would be 'good' 30 fps versus GT5-ish 60 fps (massive tearing and framedrops all over the place)

in that case I understand why developers and gamers prefer the stable 30 fps :)

March 27, 2011 at 9:08 AM

Blogger DEADC0DE said...

Going from 60 to 30 would be a downgrade, but that would have been plain stupid. We evaluated 30 with motion blur against 60 without, knowing that we could achieve both easily. We compared the two things side by side and we preferred 30 with motion blur. How can a better thing (at least in the eyes of the people who had the opportunity to test it against the alternative) be considered a "downgrade"?

March 27, 2011 at 10:12 AM
