Our previous material examined the differences between the X800 and GeForce 6800 filtering implementations in a RightMark synthetic test. As you may remember, the picture was rather ambiguous: on the whole, the 6800's trilinear filtering optimisation caused a somewhat smaller quality loss than the X800's, although the two could also be called roughly on par.
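To make clearer what such an "optimisation" amounts to, here is a minimal sketch in C, purely for illustration and not the vendors' actual hardware logic: full trilinear filtering blends two adjacent mip levels across the whole LOD range, while a reduced ("brilinear") variant confines the blend to a narrow band around each mip transition, so most pixels get plain bilinear filtering. The band width below is an assumed value; the real one is chosen by the driver or hardware and is not documented.

```c
#include <math.h>

/* Full trilinear: the blend weight between mip levels N and N+1 is
 * simply the fractional part of the computed LOD. */
float trilinear_weight(float lod)
{
    return lod - floorf(lod);
}

/* Reduced ("brilinear") trilinear: blending happens only in a narrow
 * band around the mip transition; outside that band a single mip level
 * is used.  BAND is an assumed, illustrative half-width. */
#define BAND 0.2f

float reduced_trilinear_weight(float lod)
{
    float f = lod - floorf(lod);
    if (f < 0.5f - BAND) return 0.0f;            /* lower mip only     */
    if (f > 0.5f + BAND) return 1.0f;            /* upper mip only     */
    return (f - (0.5f - BAND)) / (2.0f * BAND);  /* narrow blend zone  */
}
```

The narrower the blend zone, the fewer texture samples are actually averaged between mip levels, which is where the speed gain and the potential mip-banding come from.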
............
As for the games, they showed only slight differences between the various implementations of these filtering methods.
............
We leave it up to our readers to decide which is better, but I personally consider the approach used in the GeForce 6800 more correct: its simplified trilinear filtering is less visible. Besides, if the user forces anisotropy through the drivers in a game that can manage the function itself, the GeForce 6800 will implement it correctly, according to specification, whereas the X800 will drop trilinear filtering altogether. FarCry is a perfect example. The game only allows up to four degrees of anisotropy, which is not enough for owners of powerful cards, so anyone with an expensive accelerator will force anisotropic filtering in the drivers. As a result, the GeForce 6800 will work correctly, while the RADEON X800 will turn off trilinear filtering. ATI fans will no doubt be quick to remind me of the problems related to NV3x/4x shader calculation precision in this game. That really is a drawback, but patch 1.2 is due soon and we'll see whether it corrects it.
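As a rough illustration of where this game-versus-driver conflict comes from, here is how an application would normally request a degree of anisotropy through OpenGL's GL_EXT_texture_filter_anisotropic extension. This is not FarCry's actual code (the helper function and the capped value are assumptions made for the example), and a setting forced in the driver control panel bypasses this request entirely, which is exactly the situation described above.

```c
#include <GL/gl.h>

#ifndef GL_TEXTURE_MAX_ANISOTROPY_EXT
#define GL_TEXTURE_MAX_ANISOTROPY_EXT     0x84FE
#define GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT 0x84FF
#endif

/* Request a given degree of anisotropy for the currently bound texture.
 * 'requested' is whatever the game exposes in its own settings (in our
 * hypothetical case it never exceeds 4.0f); a driver-level "force AF"
 * override is applied outside the application and ignores this value. */
void set_anisotropy(GLfloat requested)
{
    GLfloat max_aniso = 1.0f;
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);

    if (requested > max_aniso)
        requested = max_aniso;

    glTexParameterf(GL_TEXTURE_2D,
                    GL_TEXTURE_MAX_ANISOTROPY_EXT,
                    requested);
}
```

The point is simply that once the user forces a higher degree in the control panel, the quality of the result depends entirely on how the driver chooses to implement the override, not on the game.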
............
Summing up, I would like to say the following. Despite the numerous accusations that ATI or NVIDIA is deceiving users, the main problems lie not in the implementation of trilinear filtering or even anisotropy, but in driver and game bugs that produce real artefacts, not the trifles dug up by some meticulous admirer of the rival company. Open our 3DiGest section devoted to quality in games and you'll see complaints about many other things for which careless programmers are really to blame (including game developers, whose mistakes account for the lion's share of all problems).
Therefore, no matter how indignant fans of "pure graphics" may be (in fact, they are usually fans of either ATI or NVIDIA), the overwhelming majority of users will see NO DIFFERENCE in most games from all these optimisations.
............
So, is this good or bad? I think everyone should be able to choose. Even if a user never sees the difference and never touches the optimisation setting, he still deserves to have the choice. If a washing machine quietly lowers its spin speed and leaves the clothes under-wrung, most householders will hardly notice, but inevitably some people will make a huge row about it, and the manufacturer will have to apologise for this unintended effect (as was the case with ATI). But if that washing machine had a mode selector, everyone could choose the spin speed themselves and would know what they were doing. Then there would be no claims, and NVIDIA would have no reason to issue humiliating presentations about ATI products. In the end, the Californian company turned out to be the fairer of the two, although it used to be the main target of reproach for optimisations.