Originally posted by Freezman13
For example, a dip from 420FPS to 300FPS sounds dramatic, but in terms of frametime you're going from 2.38ms -> 3.33ms (roughly a one-millisecond increase). By comparison, going from 300FPS -> 140FPS is a 3.8ms frametime increase. We care a lot about maintaining smooth performance, so we don't want to see a lot of swings here, but 1-3ms deviations when activity is happening aren't unacceptable or unexpected.
The change in wording does nothing except minimize the perceived impact. Saying 'oh, it's just a one-millisecond increase' seems disingenuous. Performance dropped by a third; it doesn't matter how you word it.
Regardless, the real question for me is: if your normal FPS is ~140, does it drop by the same percentage? That I'd personally consider a big hit.
Then again, I'm not much of an FPS player, and right now getting 40 frames in the big fights of my survival game is a blessing.
That's a good question, and it sort of explains why I went into frametime.
Let's say we introduce a change that makes shooting take 4ms longer on ALL configs. For someone playing at a 2.5ms frametime (400FPS), that is a BIG FPS hit (6.5ms total, 153FPS).
Now let's look at someone playing at 6.94ms (144FPS): they drop to 10.94ms (91FPS), a much smaller FPS loss from the exact same 4ms cost.
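To make that arithmetic concrete, here's a minimal Python sketch of the FPS/frametime conversion (frametime in ms = 1000 / FPS) and of what a fixed frametime cost does at different baseline framerates. The function names are just for illustration, not anything from the game's code.

```python
def fps_to_ms(fps: float) -> float:
    """Convert frames per second to frametime in milliseconds."""
    return 1000.0 / fps

def ms_to_fps(ms: float) -> float:
    """Convert a frametime in milliseconds back to frames per second."""
    return 1000.0 / ms

def apply_cost(baseline_fps: float, extra_ms: float) -> float:
    """FPS that results from adding a fixed frametime cost to a baseline."""
    return ms_to_fps(fps_to_ms(baseline_fps) + extra_ms)

# The same 4ms cost applied to two very different configs:
for fps in (400, 144):
    before_ms = fps_to_ms(fps)
    after_fps = apply_cost(fps, 4.0)
    print(f"{fps}FPS ({before_ms:.2f}ms) + 4ms -> "
          f"{fps_to_ms(after_fps):.2f}ms ({after_fps:.1f}FPS)")

# 400FPS (2.50ms) + 4ms -> 6.50ms (153.8FPS)
# 144FPS (6.94ms) + 4ms -> 10.94ms (91.4FPS)
```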
Frametime gives us an absolute measure of how fast or slow the game is running, whereas FPS drops are relative.
If you told me "I lost 40FPS this patch", that's a meaningless number without knowing what your FPS used to be.
If you told me "I lost 10ms of frametime this patch", that's far more useful (although the context of your prior frametime is still relevant).
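As a quick hypothetical illustration (the baseline numbers below are made up, not from any patch), the same "-40FPS" can correspond to wildly different frametime changes depending on where you started:

```python
def frametime_delta_ms(before_fps: float, after_fps: float) -> float:
    """Frametime increase in milliseconds implied by an FPS drop."""
    return 1000.0 / after_fps - 1000.0 / before_fps

# The same reported "-40FPS" from two different baselines:
print(f"{frametime_delta_ms(400, 360):.2f}ms")  # ~0.28ms -- barely noticeable
print(f"{frametime_delta_ms(100, 60):.2f}ms")   # ~6.67ms -- a serious hit
```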