Will "brilinear" filtering persist?

October 26, 2003 / by aths / page 2 of 2 / translated by 3DCenter Translation Team


   A statement of profit and loss

The maximum fill rate has always been a highly inaccurate figure for estimating real performance. Rendering 3D graphics is a complex process; virtually no game comes close to using the full theoretical fill rate of a graphics card. The extra work caused by trilinear filtering is handled internally by the chip and is mostly done with ease. Conversely, switching to bilinear filtering by no means guarantees twice the frame rate.
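
To illustrate the last point, here is a minimal back-of-the-envelope sketch in C. All figures are purely hypothetical; the point is only that texture filtering is one stage of the rendering process among several, so halving its cost does not halve the frame time.

    /* Hypothetical numbers: how much faster does a frame get if the
       texture filtering share is cut in half (trilinear -> bilinear)?
       Geometry, bandwidth and CPU cost are unaffected. */
    #include <stdio.h>

    int main(void)
    {
        double frame_ms       = 25.0;  /* total frame time (assumed)       */
        double filter_share   = 0.30;  /* share spent on texture filtering */
        double filter_speedup = 2.0;   /* bilinear needs half the samples  */

        double filter_ms = frame_ms * filter_share;
        double new_frame = (frame_ms - filter_ms) + filter_ms / filter_speedup;

        printf("trilinear: %.2f ms (%.1f fps)\n", frame_ms, 1000.0 / frame_ms);
        printf("bilinear : %.2f ms (%.1f fps)\n", new_frame, 1000.0 / new_frame);
        /* roughly 18 percent more fps - far from the 100 percent the raw
           fill-rate figures would suggest */
        return 0;
    }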

"Brilinear" is, in practice, only giving a marginal boost in performance. However, the main problem is that there is no reasonable alternative to compensate the disadvantages. Let’s remind ourselves of texture improvement by supersampling. Nowadays textures are improved by anisotropic filtering, being far more efficient than supersmpling. Whereas supersampling would use 16 samples per pixel, anisotropic filtering can achieve the same quality with a maximum of 4 samples.

There is no such option for the “brilinear” filter. Even at higher resolutions, its basic problem remains: susceptibility to “bow waves”. These generally occur whenever filtering is not fully trilinear; how visible they are is another question.

To be a bit more precise: in principle this artifact also exists with trilinear filtering, since it is caused by MIP mapping, upon which trilinear filtering is based. But trilinear filtering stretches the “bow waves” out as far as reasonably possible, so that they virtually disappear. We have prepared some synthetic screenshots here:


Bilinear filtering is no longer up to date.

Trilinear filtering delivers much better transitions.

When “brilinear” filtering is used, the transitions between detail levels become more visible. In motion this effect is more distinct than in a still picture.
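
The difference between the screenshots can also be expressed as a small sketch of the blending weight between two adjacent MIP levels. The model of “brilinear” filtering used here is our assumption for illustration; in particular, the band width of 0.25 is an invented value, since the figure Nvidia actually uses is not public.

    #include <stdio.h>

    /* trilinear: the next MIP level is blended in linearly over the whole
       LOD range, which stretches the transition as far as possible */
    static double trilinear_weight(double lod_frac)
    {
        return lod_frac;
    }

    /* "brilinear" (assumed model): purely bilinear over most of the range,
       blending only inside a narrow band around the level change */
    static double brilinear_weight(double lod_frac, double band)
    {
        double lo = 0.5 - band * 0.5, hi = 0.5 + band * 0.5;
        if (lod_frac <= lo) return 0.0;
        if (lod_frac >= hi) return 1.0;
        return (lod_frac - lo) / (hi - lo);
    }

    int main(void)
    {
        /* lod_frac: 0 = exactly on MIP level n, 1 = exactly on level n+1 */
        for (double f = 0.0; f <= 1.001; f += 0.125)
            printf("lod_frac %.3f   trilinear %.3f   brilinear %.3f\n",
                   f, trilinear_weight(f), brilinear_weight(f, 0.25));
        return 0;
    }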

What helps against the artifacts of bilinear filtering also helps against “brilinear” filtering: anisotropic filtering. An upcoming article will address that topic in depth, so here we make only a short excursion. AF pushes the MIP level transitions further “away” (i.e. into the background), so artifacts caused by MIP banding become less visible. On the other hand, AF consumes a large amount of performance. Now, one could argue that nobody plays without AF today anyway, but two objections remain:

The first is the fact that Nvidia also “optimizes” anisotropic filtering. The degree of anisotropy is reduced both depending on the surface angle and in general. 8° AF should mean that every pixel receives as much anisotropic filtering as is needed to improve quality, with up to 8 AF samples, and only then is any remaining aliasing avoided through blur. With Nvidia's “optimizations” enabled, full 8° anisotropic filtering is applied only to very small areas. The MIP transitions are therefore pushed less far into the background than would otherwise be possible.
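
What this means for the MIP transitions can be sketched roughly as follows. The LOD (and thus the distance at which the next, blurrier MIP level is used) can follow the short axis of the pixel footprint as long as enough AF probes cover the long axis; if the granted degree of anisotropy is reduced, the LOD rises and the transition moves closer to the viewer. The footprint and the reduced degree are invented values for illustration.

    #include <stdio.h>
    #include <math.h>

    /* LOD for a pixel footprint with the given long/short axis when at most
       'degree' AF probes may be spent; a higher LOD means a blurrier MIP
       level, i.e. the level transition sits closer to the viewer */
    static double lod(double major, double minor, int degree)
    {
        double ratio = major / minor;
        if (ratio > degree) ratio = degree;       /* granted anisotropy        */
        double effective = major / ratio;         /* rest is covered by probes */
        return effective > 1.0 ? log2(effective) : 0.0;
    }

    int main(void)
    {
        double major = 8.0, minor = 1.2;  /* hypothetical floor-texture footprint */

        printf("no AF           : LOD %.2f\n", lod(major, minor, 1));
        printf("full 8 deg AF   : LOD %.2f\n", lod(major, minor, 8));
        printf("reduced 2 deg AF: LOD %.2f\n", lod(major, minor, 2));
        return 0;
    }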

Now, the angle optimization is not mandatory in most drivers. For some time, however, the user has been forced to accept a maximum of only 2° anisotropic filtering on all but the primary texture stage. In Max Payne, where certain base textures sit on stage 1, this is clearly visible as blur. On the GeForce4 Ti series this could be reconfigured via the registry or, more easily, with tools like RivaTuner or aTuner. This makes sense as an option (unfortunately, newer drivers no longer allow it), but it should always remain an “opt-in” choice.
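
To make the stage behaviour concrete, here is a toy model; the function and its parameters are invented for illustration and do not reflect how the driver is actually implemented.

    #include <stdio.h>

    #define NUM_STAGES 4

    /* AF degree actually applied on a texture stage: with the "optimization"
       active, everything beyond the primary stage is clamped to 2 */
    static int effective_af(int stage, int user_degree, int stage_optimization)
    {
        if (stage_optimization && stage > 0 && user_degree > 2)
            return 2;
        return user_degree;
    }

    int main(void)
    {
        int user_degree = 8;   /* what the user selects in the driver panel */

        for (int opt = 1; opt >= 0; --opt) {
            printf("stage optimization %s:\n",
                   opt ? "active (driver default)" : "disabled (e.g. via aTuner)");
            for (int s = 0; s < NUM_STAGES; ++s)
                printf("  stage %d: %d deg AF\n",
                       s, effective_af(s, user_degree, opt));
        }
        return 0;
    }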

The second objection concerns the smaller GeForce FX graphics cards, which offer only a limited amount of fill rate. High AF levels are out of the question on these cards anyway, so the shortcomings of “brilinear” filtering cannot be concealed as easily.

For quite some time now, Nvidia has been forcing FX users to cope with “brilinear” filtering in UT2003. According to our tests, the actual quality difference is marginal. In this column we admit to taking a less technical point of view and argue that any option which sacrifices a tiny amount of image quality for a noticeable performance increase is preferable per se. The most important word in that sentence is “option”.


   Forced to higher frame rates?

Nvidia advertises the FX series with “Cinematic Computing” and “Engineered with passion for perfection”, yet does not allow true trilinear filtering in any Direct3D application. We consider that ridiculous. Nvidia dictates how the customer has to use his “Cinematic Computing” hardware: performance-optimized. The user is deprived of the option to get the best possible quality.

For a demanding game like UT2003 it can make perfect sense to lower the filtering quality, and MIP banding is virtually eliminated by “brilinear” filtering in UT2003. But why should this, according to Nvidia, be the only way to play UT2003? And why is this paradigm extended to all Direct3D applications and games?

ATI aggressively marketed 16° AF in the days of the original Radeon and the Radeon 8500. If anisotropic filtering was turned on, trilinear filtering could not be used due to a hardware limitation (up to and including the Radeon 9200), so the user had to make do with bilinear anisotropy. This, like the extreme angle dependence of Radeon AF, was justified as a reasonable limitation: important areas would still get anisotropic filtering, and trilinear AF would cost far too much performance for such a small gain in quality.

That cannot be left unanswered. GeForce3 users, too, often used only bilinear AF, but in less demanding games they could enjoy trilinear AF. The comparatively slight angle dependence of anisotropic filtering on GeForce3-class cards also contributed to good texture quality. And at ATI, nobody ever thought of taking plain trilinear filtering (without AF) away from the user. Correctly trilinear-filtered textures are possible on every Radeon – this is the minimum a decent 3D graphics card has to offer.

Let us summarize our concrete reservations about Nvidia's new filter: the quality gained by trilinear filtering is partially destroyed, and there is no resource-friendly option that could correct this shortcoming. Furthermore, we doubt the point of these “optimizations”: fast graphics cards gain frames you don't need anyway (while sacrificing quality you would probably want), and the entry-level cards are not fast enough to render modern games fluently at high resolutions in the first place.

“Shader power”, largely irrelevant until now, will become far more significant in the future. Precisely because the GeForce FX does not have very fast shaders, there would still be enough time to at least use high-quality texture filters. After all, this is the only remaining advantage of Nvidia's graphics cards: in shader power and anti-aliasing quality they are clearly beaten by the competition. Nobody else offers anisotropic filtering of such high quality as the GeForce3 (and its successors), yet Nvidia does nothing with this advantage. Of what use is AF that is based on the “brilinear” filter and inherits its disadvantages?

We greatly appreciate it per se when the user is offered options, even if individual options do not make much sense. In certain situations “brilinear” filtering could provide the crucial extra bit of playability at the cost of barely visible quality losses, so we do see a place for this option. In the longer run, however, we expect high-degree trilinear anisotropic filtering – or even better techniques (there are better techniques than trilinear filtering) – to prevail. In our opinion it is somewhat paradoxical to develop sophisticated pixel shaders while continuously lowering texture filtering quality.

Thus a stale aftertaste remains: the GeForce4 Ti is currently being slowed down artificially by the drivers (the AF stage optimization no longer works and, apart from some benchmarks, early Z occlusion has been deactivated), while the FX series is pushed by means of forcibly degraded image quality that the user cannot disable even if he wanted to. It is somewhat paradoxical that Nvidia advertises the new drivers with the release highlight “enhanced image quality for both anisotropic filtering and anti-aliasing”.

Being forced to do without true trilinear filtering is unacceptable in the year 2003. In order not to appear nitpicking, we regard the various degradations of AF quality as “merely” objectionable (especially since ATI plays the same pointless games). Depriving the user of trilinear filtering altogether, however, looks like an act of desperation on Nvidia's part. The introduction of “brilinear” filtering, planned long ago, was surely never intended as the maximum available quality. If Nvidia does not correct this very quickly, its products can no longer be regarded as worthy hardware for a computer gamer.





