Detonator 52.14 Test & 52.10 Re-Test

October 8, 2003 / by Leonidas / page 7 of 7


   Conclusion

Finally, let's draw the conclusions from everything we found out in this article. First of all, let's look at the discovered "optimizations" of the Detonators 45.23 and 52.10/52.14, restricted to the GeForceFX series.

  • The Detonator 45.23 shows exemplary filter quality in both OpenGL and Direct3D. However, an application-specific "optimization" was found for Unreal Tournament 2003, which can be deactivated by using the Application mode.

  • The Detonators 52.10 and 52.14, by contrast, show a whole array of "optimizations" in Direct3D filtering, but apparently neither an application-specific "optimization" nor any "optimization" in OpenGL. In Direct3D, generally all texture stages are filtered with a faked trilinear filter, regardless of whether the filter setting is forced via the control panel or left to the Application mode (a simplified sketch of this faked filter follows below this list). In addition, there is a further "optimization" in Control panel mode (not in Application mode): texture stages 1 through 7 are filtered with at best a 2x anisotropic filter.
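To illustrate what this faked trilinear filter means technically, here is a strongly simplified model. This is our own sketch for illustration, not nVidia's actual driver or hardware code, and the width of the blend band is a made-up parameter: true trilinear filtering blends the two adjacent mip levels over the whole distance between them, while the pseudo-trilinear filter stays with plain bilinear filtering for most of that distance and blends only in a short ramp before each mip transition.

```cpp
#include <cmath>

// Weight of the next-higher mip level for a given level of detail (LOD).
// 0.0 = sample only mip floor(lod); 1.0 = sample only mip floor(lod) + 1.

// True trilinear: blend linearly across the whole distance between mips.
float trilinearWeight(float lod)
{
    return lod - std::floor(lod);
}

// Pseudo ("faked") trilinear: bilinear almost everywhere, with only a
// short blend ramp just before each mip transition. 'band' is purely
// illustrative; the value the driver really uses is unknown to us.
float pseudoTrilinearWeight(float lod, float band = 0.25f)
{
    float f = lod - std::floor(lod);
    if (f <= 1.0f - band)
        return 0.0f;                    // pure bilinear on the lower mip
    return (f - (1.0f - band)) / band;  // compressed blend ramp
}
```

The narrower the band, the less often two mip levels have to be sampled at once, which is exactly where the performance is saved, and the closer the result creeps back toward visible MIP banding.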

On the credit side, both new drivers deliver clearly better shader performance. On the debit side, the already known Unreal-Tournament-2003 "optimization" now applies to all games which offer no way to set the level of anisotropy themselves, and true trilinear filtering has been given up in favor of the faked trilinear filter, regardless of the filter setting forced via the control panel or the settings made by the game.

Even with this new level of "optimization", we cannot see any real performance increase apart from the shader performance. In our opinion, a few percent more performance do not justify the "optimizations" at all. They only make sense (if one can call it that) on closer inspection of the Unreal Tournament 2003 Flyby benchmark, where "optimizing" the filter quality can as much as double the performance.

And what's the benefit for the consumer? Almost nothing. The real in-game performance advantage, e.g. in Unreal Tournament 2003, is clearly below 10 percent, which amounts to no more than 2-5 fps. The Flyby numbers are and will remain theoretical measurements: they can illustrate the raw power of a graphics card quite well, but they say absolutely nothing about the real gameplay performance of Unreal Tournament 2003.

That's exactly the reason for all those "optimization" efforts by both nVidia and ATi: the unfortunately much too great and completely unjustified importance of the Unreal Tournament 2003 Flyby benchmark. Because this benchmark is used so often nowadays, it decides the perceived performance of a graphics card in a review, and that in turn is why both companies "optimize" so heavily for Unreal Tournament 2003. If the reviewers on the Internet collectively gave up this Flyby benchmark (or used a timedemo instead), the incentive for ATi and nVidia to "optimize" specifically for it would disappear.

This obviously does not excuse nVidia's behavior in "optimizing" the Detonators 52.10 and 52.14, where all of a sudden the old "optimization" for Unreal Tournament 2003 applies to all games. Considered from any angle, the performance gained is far too small to justify this "optimization", and at the same time nVidia forces the customer to use its faked filter in the majority of games. Whoever plays a game which offers no option to set the anisotropic filter itself has only one way to enable it, namely through the Control panel, where the faked trilinear filtering method is forced. With this, nVidia offers its GeForceFX customers no correct trilinear filtering in the majority of currently available games!


To justify this "optimization" of the anisotropic filter in a rather cowardly fashion (sadly, ATi uses the same trick), nVidia now offers a reasonably working Application mode, with which the "optimization" can sometimes be avoided. But keep in mind that only a few games out there offer a setting for the level of anisotropy, and only with those can the Application mode be used effectively.

Unfortunately, the Application mode is no longer what it used to be, because it is not the applications (the games) that decide over the anisotropic filter, but the nVidia driver itself, which generally applies the faked trilinear filter instead of the correct one.
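To make clear what "the application decides" would mean in practice: the following Direct3D 9 sketch shows how a game requests anisotropic filtering with a true trilinear mip transition. The API calls are the real ones; the surrounding function is our own example, and Direct3D 8 games use the equivalent SetTextureStageState calls instead. It is exactly this request which the Detonators 52.10 and 52.14 silently answer with the pseudo-trilinear filter.

```cpp
#include <d3d9.h>

// What a game does when it sets the anisotropy level itself, i.e. the
// case the Application mode is supposed to respect. Device creation is
// omitted; 'device' stands for any valid IDirect3DDevice9 pointer.
void requestAnisotropicTrilinear(IDirect3DDevice9 *device, DWORD level)
{
    // Sampler 0 corresponds to texture stage 0; a real game would set
    // every stage it actually uses.
    device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
    device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR); // = true trilinear between mips
    device->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, level);      // e.g. 8
}
```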

This represents a new stage in the history of driver "optimization", because nVidia violates a clear and fixed standard. In contrast to the anisotropic filter, for which no exact definition exists, the trilinear filter is produced according to an exact definition, which nVidia hereby clearly and obviously breaks. If these drivers, or a driver with similar filter behaviour, should be published officially, then nVidia should no longer permit itself to put trilinear filtering on the feature checklist of its GeForceFX graphics chip series.
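For reference, this is the fixed definition in question (the notation is ours, but the formula itself is standard, as laid down for example in the OpenGL specification for LINEAR_MIPMAP_LINEAR): with λ the computed level of detail and c_bi the bilinear sample of one mip level, the trilinear result is the full linear blend of the two adjacent mip levels:

```latex
c_{\mathrm{tri}}(\lambda) =
  \bigl(1 - \operatorname{frac}(\lambda)\bigr)\, c_{\mathrm{bi}}\bigl(\lfloor\lambda\rfloor\bigr)
  + \operatorname{frac}(\lambda)\, c_{\mathrm{bi}}\bigl(\lfloor\lambda\rfloor + 1\bigr)
```

A filter that replaces this full blend with a narrow one, as sketched above, simply does not meet this definition, however similar it may look on a still image.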

Of course, there is the possibility that this behaviour of the Detonators 52.10 and 52.14 is still a bug which will be fixed in a later version of the driver. The upcoming drivers will finally show how serious nVidia is about the compulsory pseudo-trilinear filter of the Detonators 52.10 and 52.14.


At this point, we are honestly anxious to see what nVidia is going to do. If a driver with the filter behavior of the Detonators 52.10 and 52.14 were actually published, the other competitors on the graphics chip market would certainly exploit it, hopefully only in the form of various polemics, and hopefully not in the form of ATi and XGI drivers appearing with the same "optimization" of the trilinear filter as in nVidia's Detonators 52.10 and 52.14. We hope that the "optimizations" of these drivers do not cause the other graphics chip producers to draw level with nVidia by making "optimizations" of their own.

However, we can't really imagine that nVidia is going to officially release a driver with such far-reaching "optimizations", ones which do not even stop at something as firmly defined as the trilinear filter. The user protest over this article, which covers only semi-internal beta drivers, should already be big enough; one does not want to imagine the reaction on the news sites and in the Internet forums if such a driver were officially available for download on nVidia's servers.

That said, we don't want to paint the devil on the wall just yet, and we assume for now that the upcoming official Detonator 50 will at least not contain the compulsory pseudo-trilinear "optimization". Anything else would strongly surprise us, because nVidia reads articles like this one too and can count on two fingers how "cordially" a driver like the 52.10 or 52.14 would be received if it were released officially.

With that, we are nearly at the end of this article. We would like to finish with the demand from our article about the Detonator 51.75 "optimizations", which still stands and which we therefore repeat here 1:1:


Forced "optimizations", which speed up games at the cost of image quality and so, that in most cases anyway sufficient fast games where nonsensically made even faster (but uglier), are definite not desirable. Therefore we demand the graphic chip manufacturers ATi and nVidia in the name of a large part of the gamer community hereby to offer an option in future drivers, which can afford a free of "optimizations" exemplary quality. These modes do not have to be necessarily the drivers default setting, but if ATi and nVidia want to continue their "optimization" race any further, they can willingly do this.

However, for those who value maximum image quality, both manufacturers should finally integrate an "optimization"-free "Super High Quality Mode" into their drivers. Within this mode, the consumer should also be able to choose anisotropic filtering and anti-aliasing without any forced "optimizations" by the driver. Technically, such a "Super High Quality Mode" would not be difficult to implement and would cost the programmers clearly less time than all those "optimized" modes. At the very least, both manufacturers should bring themselves to offer their customers the highest image quality available on the graphics chip market (perhaps as an option), something ATi and nVidia are already advertising at present anyway.




Added on October 9, 2003:

AnandTech has published an extremely extensive article about the performance and image quality of the current high-end graphics cards, the Radeon 9800XT and the GeForceFX 5950 Ultra (NV38). Besides the game benchmarks with 18 games, the image quality tests conducted with each of those games are particularly worth mentioning. AnandTech uses the Catalyst 3.7 on the ATi side and the Detonator 52.14 on the nVidia side to compare image quality. In contrast to the findings of our most recent driver comparison, AnandTech did not notice any general difference in image quality between the Detonators 52.14 and 45.23, and therefore praises the new driver almost to the skies.

This, however, does not necessarily contradict our findings. The nVidia "optimization" of the anisotropic filter on texture stages 1 through 7 in Control panel mode (only a 2x anisotropic filter is used, regardless of higher settings) can only be found if one searches for it deliberately; moreover, most of AnandTech's image quality comparisons were made without the anisotropic filter, so no differences can be found in those pictures at all. The generally forced "optimization" of the trilinear filter into a pseudo-trilinear filter in the Detonator 52.14 is, in turn, nearly impossible to see in static images of real games, because the trilinear filter exists almost solely to prevent MIP banding, which only shows in motion.
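In motion (or with prepared textures), the classic way to expose both MIP banding and such filter tricks is to give every mip level its own solid color, as the usual filter testers do. Here is a minimal OpenGL sketch of that technique; the helper function and the color table are our own, and the same idea works in Direct3D tools as well. With such a texture, true trilinear filtering shows broad, smooth color gradients, while the pseudo-trilinear filter shows mostly solid color bands with only thin transition seams.

```cpp
#include <GL/gl.h>
#include <vector>

// Upload a texture whose mip levels are solid, distinct colors, so that
// the mip transitions become directly visible in a running application.
// Assumes a bound GL_TEXTURE_2D and a square power-of-two base size.
void uploadColoredMipmaps(int baseSize)
{
    static const unsigned char colors[][3] = {
        {255, 0, 0}, {0, 255, 0}, {0, 0, 255}, {255, 255, 0},
        {255, 0, 255}, {0, 255, 255}, {255, 255, 255}, {128, 128, 128},
    };
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);  // rows are tightly packed
    int level = 0;
    for (int size = baseSize; size >= 1; size /= 2, ++level) {
        const unsigned char *c = colors[level % 8];
        std::vector<unsigned char> buf(size * size * 3);
        for (int i = 0; i < size * size; ++i) {
            buf[3 * i + 0] = c[0];
            buf[3 * i + 1] = c[1];
            buf[3 * i + 2] = c[2];
        }
        glTexImage2D(GL_TEXTURE_2D, level, GL_RGB, size, size, 0,
                     GL_RGB, GL_UNSIGNED_BYTE, buf.data());
    }
    // Request true trilinear filtering; whether the driver honors this
    // request is exactly what is under test here.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
}
```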

Thus it can be stated that the "optimizations" identified in the Detonator 52.14 will not be recognized from screenshots unless one looks for them explicitly (why AnandTech nevertheless attributes a finer filter quality to the 52.14 driver than to the 51.75 is a mystery to us, since the only difference between the two is the correctly working Application mode of the Detonator 52.14). So nVidia's "optimizations" are, as a rule, not really visible, although there are clear exceptions, such as Tron 2.0. Whether this is a reason to excuse nVidia's "optimizations" is surely debatable.

The fact is that each new "optimization" will continue to heat up the "optimization" race between ATi, nVidia and, in the future, possibly also XGI (we will see whether S3/VIA really joins in this year with the DeltaChrome): an "optimization" by manufacturer A will be followed by a new, stronger "optimization" by manufacturer B. As the AnandTech article shows, none of these "optimizations" will be visible on real screenshots compared with the directly preceding driver. But add up several "optimization" steps across several drivers, and over time a clearly perceptible difference in image quality will accumulate. If the spiral keeps turning, we will someday reach the point, with a GeForceFX III and a Radeon 20000, where we may admire all our games under a compulsory bilinear filter ;-).

Nobody actually wins with such "optimizations", least of all the consumer, no matter how little they affect the currently ascertainable image quality. The performance gains from the "optimizations" we discovered are clearly less than 10 percent. That will not speed up a slow graphics card like the GeForceFX 5200, confronted with a difficult task like Unreal Tournament 2003 with 8x anisotropic filtering (16.7 fps), into a playable range: 16.7 fps plus 10 percent is still only 18.4 fps, which changes nothing about the verdict of "unplayable". The graphics chip manufacturers' favorite argument for driver optimizations, that they speed up smaller cards, no longer holds, because for that these optimizations would have to deliver on the order of 30 percent more performance, not clearly below 10 percent.

Thus, given such slight performance gains, the only explanation for the "optimizations" found in the Detonator 52.14 (and ATi drivers contain "optimizations" too, as you know) is winning the benchmarks of the hardware testers. In a situation where the contenders in each individual market segment lie so close together, those few percent of "optimization" profit are quite sufficient to turn a defeat in the performance measurements into a victory. That the lead is only small plays a subordinate role; the only thing that matters is the psychological advantage of being the winner.

And at the latest here we return to our firm stance against all these "optimizations": for owners of low-cost graphics cards, no game will be sped up enough to lift these cards into a higher performance class (which would be the only point of it). And for owners of high-end or mainstream cards, these performance gains are usually irrelevant, because those cards are fast enough anyway. The image quality the graphics chip manufacturers give up by means of their "optimizations" serves solely their performance contest among themselves and, in the long run, brings the consumers nothing at all (except worse image quality). If the manufacturers really cared about "customer satisfaction", the declared aim of every company taking part in this, the entire "optimization" effort would be completely unnecessary.





