



High-End Chip G70: Only shimmering AF?

August 13, 2005 / by aths / Page 2 of 2


Videos demonstrating the effect of undersampling on GeForce 6/7 series graphics cards

Note: If you experience stuttering during video playback, you can lower the playback speed by pressing Ctrl+Cursor Down in Media Player Classic. The required codec can be obtained by installing the latest version of Fraps. Media Player Classic should be configured to repeat the video automatically: the first run will likely stutter, but subsequent runs should play smoothly. We also advise disabling overlay playback and using VMR instead, which ensures 1:1 rendering of the video stream without a loss of contrast or sharpness.

The videos were made by Damien Triolet, and we have his permission to publish them here; we would like to thank him for his efforts. The videos were captured in Unreal Tournament 2003. In this game, only the first texture layer receives full trilinear filtering in HQ mode (or with A.I. off), while all other texture layers get only reduced trilinear filtering. Since the videos use a demo with only a single texture, that texture is fully trilinear filtered in HQ mode; real games, however, use more texture layers. This looks like an application-specific "optimization" (meaning quality reduction). Nvidia is silent about it, since AF test tools show full trilinear filtering on every layer. Remark: Do not trust texture quality test tools, since the driver treats games differently.
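To illustrate what is meant by texture layers (stages) here, consider a minimal Direct3D 9 sketch; this is our own illustration, not UT2003's actual code. The application requests trilinear filtering on every stage, and it is then entirely up to the driver whether the hardware really delivers it:

// Minimal Direct3D 9 sketch (illustrative; not UT2003's actual code):
// the application requests full trilinear filtering on every texture
// stage, but the driver decides what the hardware really does.
#include <d3d9.h>

void RequestTrilinear(IDirect3DDevice9* device, DWORD stageCount)
{
    for (DWORD stage = 0; stage < stageCount; ++stage)
    {
        // Linear minification and magnification ...
        device->SetSamplerState(stage, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
        device->SetSamplerState(stage, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        // ... plus linear blending between mip levels = trilinear.
        // Per the observations above, the driver honors this fully only
        // on stage 0 in UT2003; stages > 0 get reduced trilinear filtering.
        device->SetSamplerState(stage, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
    }
}

A texture test tool issuing exactly these calls would be shown full trilinear filtering on every stage, which is precisely why such tools cannot be trusted here.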




For easier viewing, these images are scaled down; both pictures are available in full resolution here. The images show the difference between the treatment of texture stage 1 and texture stage 0. The primary texture stage (0) still gets full trilinear filtering in "High Quality" mode, and a texture test tool will report this for every texture stage. In UT, however, every non-primary stage receives only heavily reduced trilinear filtering. The angle dependency is also clearly visible here; try to spot it in the videos as well.

The videos were not rendered with UT2003's standard LOD bias, but with a (correct) LOD bias of 0. This means: if the texture filter works correctly, there should not be any shimmering. All videos were recorded with 8x AF enabled at an image size of 1024x768; the original speed was 20 fps (captured with a slow-motion tool), and the playback speed was set to 30 fps.
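For reference, this is how an application sets the LOD bias under Direct3D 9; a minimal sketch of the API mechanism, not the capture setup actually used. A bias of 0 is the neutral, correct value, while a negative bias (as implied above for UT2003's standard setting) selects sharper mip levels and provokes aliasing even with a correct filter:

// Minimal Direct3D 9 sketch: setting the mipmap LOD bias.
// A bias of 0.0f is neutral; negative values select sharper mip levels
// and can provoke shimmering even with a correct texture filter.
#include <d3d9.h>

void SetLodBias(IDirect3DDevice9* device, DWORD stage, float bias)
{
    // D3D9 expects the float's bit pattern passed as a DWORD.
    device->SetSamplerState(stage, D3DSAMP_MIPMAPLODBIAS,
                            *reinterpret_cast<DWORD*>(&bias));
}

// Example: SetLodBias(device, 0, 0.0f); with this neutral bias, a
// correct texture filter must not produce any shimmering.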

We advise you to download a single video first to check whether your machine can play it properly. The high video resolution and the lossless codec result in a high system load. Therefore we also offer a short description of what can be seen in each video.



"Quality" on a GeForce 6800 results in shimmering. Furthermore, one can see the only partially applied trilinear filter: "flickering bands" are followed by "blurry bands" (areas where the texture is too blurry). In our opinion, this mode shouldn't be named "Quality", but Nvidia decided to offer this "quality" as standard and advise to do all benchmarking with such textures.

 



"High Quality" on the GeForce 6800 is a borderline case: The textures look if they are just starting to flicker, while they actually just don't by a tiny margin. Like all cards of the NV40 and G70 series, the 6800 also shows angle-dependant differences in sharpness, caused by the inferior AF pattern compared to GeForce3-FX series graphic cards.

 



Nvidia's new card, the GeForce 7800, offers far greater raw texture power than the GeForce 6800, but shows remarkably worse textures as well: the annoying flickering bands are obvious. According to Nvidia's Reviewer's Guide, though, this mode delivers "the highest image quality while still delivering exceptional performance." Honestly, do you agree?



When using the GeForce 7800's "High Quality" mode, shimmering is reduced, and the result looks better than the GeForce 6800's standard mode (which, however, delivers quite poor image quality). Yet the GeForce 6800's just-barely-flicker-free HQ mode cannot be matched: the GeForce 7800 cannot be configured by the user to apply AF without shimmering textures.

 



ATI's Radeon X800, even at standard settings, appears far superior to any GeForce 6800 or 7800. There are areas that tend to flicker faintly, but altogether only the angle-dependent AF reduction in the tunnel is distracting. The GeForce 7800's "High Quality" mode is clearly surpassed.



When turning off A.I. on the X800, no notable differences from the result with A.I. enabled can be seen.

 



As the reference, a GeForce FX in "High Quality" mode was used. This shows two things: first, not all GeForce cards shimmer; the ground and wall textures are absolutely "stable." Second, the whole tunnel is textured as sharply as 8x AF demands, thanks to the superior AF implementation (which Nvidia dropped with NV40 and G70).

 

Conclusion

ATI, with its Radeon X800, shows that even with activated "optimizations" (meaning quality reductions) there need not be any shimmering textures. While full trilinear filtering is not used, this is not quickly noticed. Even though ATI's texture filtering hardware does not compute as exactly as a GeForce's, the overall image quality is better, because there are fewer questionable "optimizations." Angle dependency in AF, however, should no longer be a feature of modern graphics cards; ATI's advertising talk of "High Definition" gaming can thus be seen as an unfulfilled promise straight from the marketing department. At the very least, ATI shows that the scene in the video does not have to include texture shimmering.

Nvidia, with its current 7800 series, offers graphics cards that cannot be recommended to lovers of texture quality, even though texel performance has increased by a factor of 2.5 compared to the GeForce FX 5800 Ultra! On top of the angle dependency (inspired by ATI's R300), there is now also a tendency toward texture shimmering. The GeForce 6800 (or GeForce 6600) has to be set to "High Quality" to avoid texture shimmering as far as possible. With the 7800, even this is futile: even in "High Quality" mode, the new chip tends toward texture shimmering. The old Nvidia GeForce FX, by contrast, shows nearly perfect textures.

The quoted passages from Nvidia's Reviewer's Guide are easily disproved. That means: all benchmarks comparing standard settings, whether on a GeForce 7800 or 6800, against a Radeon are wrong, because Nvidia currently offers by far the worse AF quality. The Radeon's standard settings are better (in terms of image quality) than the 6800's standard settings, while the 7800's standard settings are worse still. Thus the so-called "performance" should not be compared either. One should also not compare 7800 standard vs. 6800 standard, or 7800 HQ vs. 6800 HQ, since the 7800's texture quality is lower. Real "performance" includes on-screen image quality. (Otherwise, why bother to enable and benchmark AF at all? Should an AF implementation merely have minimal impact on rendering speed, or should it result in greatly improved textures?)

What good is 16x AF if certain angles receive at most 2x AF and you are exposed to texture shimmering, while other cards provide flicker-free textures? All benchmarks using the standard settings of NV40 and G70 against the Radeon are invalid, because the Nvidia cards apply general undersampling, which can (and does) result in texture shimmering. We know and love the GeForce 7 for its Transparency Antialiasing and its high-performance implementation of SM3 features, but the GeForce 7 series cannot be configured to deliver flicker-free AF textures, while the GeForce 6 series and the Radeon series can (of course) render flicker-free AF quality.
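To make "undersampling results in shimmering" concrete, here is a minimal standalone C++ sketch; this is our own illustration, not Damien Triolet's capture tool. A pixel averages a high-frequency texture signal across its footprint, and with too few samples the pixel's value swings strongly as the footprint shifts by sub-texel amounts between frames, which is exactly the frame-to-frame flicker seen in the videos:

// Minimal sketch of why undersampling flickers (our own illustration).
// A pixel averages a high-frequency 1D "texture" across its footprint.
// With too few samples, the result swings strongly as the footprint
// shifts slightly between frames; with enough samples, it stays stable.
#include <algorithm>
#include <cmath>
#include <cstdio>

float Texture(float u) { return std::sin(40.0f * u); } // fine texture detail

float PixelValue(float start, float footprint, int samples)
{
    float sum = 0.0f;
    for (int i = 0; i < samples; ++i)
        sum += Texture(start + footprint * (i + 0.5f) / samples);
    return sum / samples;
}

int main()
{
    const float footprint = 1.0f;
    for (int samples : {2, 32})  // undersampled vs. sufficiently sampled
    {
        float lo = 1e9f, hi = -1e9f;
        // Slide the footprint in small steps, as happens between frames.
        for (int frame = 0; frame < 100; ++frame)
        {
            float v = PixelValue(frame * 0.01f, footprint, samples);
            lo = std::min(lo, v);
            hi = std::max(hi, v);
        }
        std::printf("%2d samples: pixel value varies over a range of %.2f\n",
                    samples, hi - lo);
    }
}

With 2 samples the pixel value sweeps over a large range as the viewpoint moves, while with 32 samples it barely changes; a card that takes fewer AF samples than the texture frequency demands cannot avoid this effect, no matter how it is configured.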

If anything changes with new driver versions, we will try to keep our readers up to date.


Addendum:

With Forceware 78.03 and higher, the texture flickering in HQ mode is gone. Compared with the old, flickering HQ mode, the impact on the frame rate is only about 2-4%. With the default driver settings, however, anisotropically filtered textures still flicker. Therefore these cards (GeForce 6 and 7) should only be benchmarked in HQ mode, because the competition (with "A.I." set to "low") does not show such flickering.





