Original Post — Direct link

The antialiasing currently present in Dbt doesn't really work well. As far as I can judge from what I see, it's (a variation of) FXAA or something similar, done entirely in post-processing. It fails to smooth a lot of things, and it doesn't actually improve the precision of edges, so when you're moving the camera you see little jumps / flickering along the edges.

To give some comparison:

QC antialiasing is about as bad as Diabotical's (if not exactly the same).

QL can be forced to run with proper MSAA using driver settings, and then it looks good. (This makes the comparison a bit harder, since there's no way to enable it in the game itself.)

CoD Warzone uses a different post-processing technique, not true MSAA, but they somehow manage to make it look good: lines are smooth and don't flicker during movement. Could Dbt use this, perhaps, if MSAA requires too many changes to the rendering architecture?

about 4 years ago - /u/GDFireFrog - Direct link

Originally posted by Field_Of_View

Temporal AA + sharpening filter is the only way to remove aliasing from modern engines fully. Unfortunately it can lead to ghosting / smearing artifacts and it must be very complicated to implement since so many games have bad implementations. I wouldn't hold my breath for this feature.

Yes, that's right. I would like to avoid temporal techniques (which, judging by the description, is probably also what CoD Warzone, cited as an example in the OP, uses), since they add a bit of visual "half-latency" that doesn't really fit the genre: what you are doing is blending the last two frames while jittering the camera to detect edges. It's not that it's complicated to implement properly, it just has that blurring effect. It is used a lot nowadays because it's actually very neat: it gives you good performance and good static quality, and its downside, the blur, may even turn into an upside for your game if you like the effect. DB requires a snappier, old-school 1:1 frame-to-presentation temporal ratio, IMO.
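The frame-blending described above can be sketched as an exponential history accumulation, the core idea behind most TAA resolves (the names and the blend factor here are illustrative, not Diabotical's or Warzone's actual code):

```python
def taa_resolve(history, current, alpha=0.1):
    """Blend the current frame into the accumulated history.

    alpha is the weight of the new frame; a smaller alpha means more
    smoothing but also more ghosting on moving objects.
    """
    return alpha * current + (1.0 - alpha) * history

# A hard edge whose sample flickers between 0 and 1 on alternating
# (jittered) frames settles near its true coverage value (~0.5).
pixel_history = 0.0
for frame in range(100):
    jittered_sample = float(frame % 2)  # alternating edge sample
    pixel_history = taa_resolve(pixel_history, jittered_sample, alpha=0.1)
```

The "half-latency" the reply mentions falls out of this directly: the presented pixel is mostly old history, so the image always lags slightly behind the newest frame.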

Now, regarding MSAA proper, it should be no problem to do that. I can't think off the top of my head of any feature we've added over the years that would conflict with it, but maybe I'm forgetting something; I'll come back to this thread and update with progress. An MSAA implementation actually exists inside the engine code, though it has surely rotted at this point, and it would just be a matter of bringing it back and updating some shaders. Note that SMAA is better than MSAA in some regards: SMAA will smooth out anything on the screen, whereas MSAA will only detect certain categories of aliasing accidents, although the things it does detect, it detects reliably. Having an MSAA+SMAA option is something we could do to get the benefits of both, and it should still be faster than SSAA (more on this below).

The reason we just have SMAA is that it was the better choice at the time: performance is a priority, and SMAA is faster and looks just OK (some people even prefer the crispness). This is why SMAA was used in so many games of a certain generation even when MSAA was also available. This engine was actually started a long time ago, and at least on the system where I tested different techniques back then, MSAA performed terribly compared to SMAA. They are probably much closer on current GPUs.

In the meantime, and this isn't helpful for most people, note that if fragment rendering performance is not an issue for you (i.e. you have a really good GPU and play with FPS capped), you also have SSAA (supersampling) in the game, which will give you perfect anti-aliasing. This is done simply by setting the video scale setting to more than 100% (ideally 200%). Some people play like this. It may work for you until we add MSAA. Keep in mind SSAA will always be superior to MSAA, as it's perfect anti-aliasing: it takes into account the real state of each pixel because it renders the whole thing, so I expect that in the future (especially seeing the numbers for the RTX 3000 series) it will be a preferred default for people with high-end PCs who are uninterested in uncapped FPS. I realise while typing this that we should add an SSAA entry to the anti-alias options that simply doubles the scale, in case people don't realise this.
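What 200% video scale amounts to can be sketched as rendering at twice the resolution in each dimension and then averaging each 2x2 block down to one output pixel (a hypothetical box-filter resolve for illustration; the engine's actual downsampling filter may differ):

```python
import numpy as np

def ssaa_downsample(hi_res, factor=2):
    """Average each factor-by-factor block of the supersampled image
    into one output pixel (a simple box-filter resolve)."""
    h, w = hi_res.shape[0] // factor, hi_res.shape[1] // factor
    blocks = hi_res[:h * factor, :w * factor].reshape(h, factor, w, factor)
    return blocks.mean(axis=(1, 3))

# A hard diagonal edge rendered at 2x resolution (8x8) resolves to a
# 4x4 image whose edge pixels take fractional coverage values,
# instead of being forced to fully on or fully off.
hi = np.zeros((8, 8))
for y in range(8):
    hi[y, :y] = 1.0  # fill everything left of the diagonal
lo = ssaa_downsample(hi)
```

This is also why it costs roughly 4x the fragment work at 200% scale: every output pixel is backed by four fully shaded samples.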

about 4 years ago - /u/GDFireFrog - Direct link

Originally posted by apistoletov

It would be really strange if modern engines are all such a step back in terms of picture quality.

It is counter-intuitive, but this is true in a sense. To understand when things became worse in this area, check the Wikipedia article on "deferred shading"; it's pretty good.

A high-level philosophical summary of the technological regression would be that as the sophistication of engines increased, their inner workings became opaque to the GPU, which could no longer discern the information the engine was working with. Information was lost to the GPU because the engine assumed some of the responsibility and started sending the GPU pre-rendered snapshots that the GPU just needs to put together.

It's a phenomenon analogous to game streaming: a new innovation allows a controller to assume some responsibility previously belonging to a terminal component. The terminal component now has less information. This allows better performance in some regards, but it results in a loss of functionality and some people wondering why we are moving backwards.

about 4 years ago - /u/GDFireFrog - Direct link

Originally posted by apistoletov

Thanks for the explanation! But then, do you think Diabotical can eventually get something at least similar to MSAA?

Yes, it should be no problem to implement MSAA. I don't want to promise it 100% off the cuff because I need to test it: there are certain limitations with multi-sampled buffers, and I may be doing something in some part of the engine that conflicts with them which I'm forgetting about. I'll update this thread when I've tested.

about 4 years ago - /u/GDFireFrog - Direct link

Originally posted by apistoletov

I understand that just rendering everything twice as big in each dimension and scaling down the result is basically what SSAA does, and it has to look good, but it's way too expensive -- basically 4x the work if I understand correctly, and my comparison confirmed that. In contrast, MSAA is supposed to only compute extra samples at the edges of geometric primitives, which is usually a small % of screen space.

I play with capped FPS, but even with the same FPS there's a difference in latency depending on how fast a single frame can be drawn.

If you re-read what I said carefully, you'll find I've addressed all of that :) Yes, SSAA is not a proper solution for most people. And yes, there's a difference even in that scenario, but the point is that in that scenario we're thinking of setups that can run the game at very high FPS, so the extra time added to each capped frame is very small, not the kind of latency that makes things feel bad. So I still expect more people to just use SSAA as the years go by. I do agree with the sentiment; I wouldn't want any extra latency either. In any case, we should be able to implement MSAA, so you can choose.
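The latency argument can be made concrete with some back-of-the-envelope arithmetic (the timings below are illustrative assumptions, not measured Diabotical numbers):

```python
# Illustrative frame-budget arithmetic for SSAA at a capped frame rate.
fps_cap = 250
frame_budget_ms = 1000.0 / fps_cap       # 4.0 ms between presented frames

render_ms_native = 0.8                   # hypothetical fast-GPU render time at 100% scale
render_ms_ssaa = render_ms_native * 4    # ~4x fragment work at 200% scale

# As long as the supersampled render still fits inside the budget, the
# cap is what paces frames; the only added latency is the extra render
# time per frame, not a lower frame rate.
extra_latency_ms = render_ms_ssaa - render_ms_native
fits_in_budget = render_ms_ssaa < frame_budget_ms
```

Under these assumed numbers the added per-frame latency is about 2.4 ms, which matches the reply's point: real, but small on the kind of high-end setup being discussed.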

about 4 years ago - /u/GDFireFrog - Direct link

Originally posted by apistoletov

Sorry, I replied to this comment before I saw the other comment

Np :)