Original Link: https://www.anandtech.com/show/600



NVIDIA's insistence on a 6-month graphics card product cycle definitely pushes the industry to improve at a very fast pace, but at the same time it causes quite a bit of frustration for those who spent $300 on a card six months ago, only to see it replaced by a much faster product at the exact same price.

Upgrading computer hardware can be just as addictive as improving your stereo system or tweaking your car, and unfortunately it can be just as expensive as either of those hobbies.

Luckily, every now and then there is the occasional driver update from NVIDIA that gives us a 1 or 2% increase in performance, which somehow tides us over for a little while longer before the next upgrade.

Today, alongside the GeForce2 Ultra, NVIDIA is releasing a brand new set of drivers that not only makes the GeForce2 Ultra perform to the best of its ability, but also claims to improve performance on virtually all other NVIDIA cards as well.



NVIDIA's Unified Driver Architecture

We've all heard of NVIDIA's "Unified Driver Architecture" (UDA), and we've all experienced what it enables us to do as end users. For those of you who aren't familiar with it, what the above picture demonstrates is that every chip NVIDIA releases contains a small portion of silicon, indicated by the green Unified Driver Hardware Abstraction Layer (HAL) on each GPU (GeForce, GeForce2, GeForce2 Ultra). At the same time, each driver release contains code that interfaces directly with that Unified Driver HAL, so even before the GeForce2 Ultra was out, NVIDIA had drivers that could run the chip.

According to NVIDIA, this allows them to stick to their 6-month product cycles since very little time has to be spent developing drivers. Of course, individual optimizations are made later on to get the most performance out of each chip, but as far as getting reliable, fully functional drivers for a brand new graphics chipset is concerned, NVIDIA's UDA allows them to do so with very little effort.
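As a rough illustration of the pattern (entirely our own sketch; the structure and function names below are hypothetical, not NVIDIA's actual driver interfaces), a unified driver can keep one chip-agnostic core and route hardware-specific work through a per-chip table of functions:

```c
/*
 * A minimal sketch of a unified-driver HAL dispatch pattern.
 * All structure, function, and table names here are hypothetical
 * illustrations, not NVIDIA's actual driver interfaces.
 */
#include <stdio.h>

/* Chip-agnostic operations the driver core relies on. */
typedef struct {
    const char *chip;
    void (*init)(void);
    void (*draw_triangles)(int count);
} nv_hal;

/* One small HAL implementation per chip generation. */
static void geforce256_init(void)  { puts("GeForce 256: init"); }
static void geforce256_draw(int n) { printf("GeForce 256: drawing %d triangles\n", n); }
static void geforce2_init(void)    { puts("GeForce2 GTS: init"); }
static void geforce2_draw(int n)   { printf("GeForce2 GTS: drawing %d triangles\n", n); }

static const nv_hal hal_table[] = {
    { "GeForce 256",  geforce256_init, geforce256_draw },
    { "GeForce2 GTS", geforce2_init,   geforce2_draw   },
};

/* The driver core is written once against the HAL; supporting a new
 * chip mostly means adding another row to the table above. */
int main(void) {
    for (size_t i = 0; i < sizeof hal_table / sizeof hal_table[0]; i++) {
        printf("-- %s --\n", hal_table[i].chip);
        hal_table[i].init();
        hal_table[i].draw_triangles(1000);
    }
    return 0;
}
```

Supporting a new chip then amounts to filling in another row of that table rather than rewriting the driver core, which is essentially the claim NVIDIA makes for its UDA.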

At the same time, you have to remember that the past three graphics chipset releases from NVIDIA (including the GeForce2 Ultra) haven't been all that different from one another, so adapting the drivers to each product release shouldn't be that difficult. The NV20 should pose an interesting problem for NVIDIA, as it may include new rendering techniques that obviously aren't supported in the current Detonator drivers; but if they stick to their UDA, they should, at least according to them, have working drivers for the NV20 even before the chip is taped out.



Digital Vibrance Control

If you recall from our GeForce2 MX (NV11) Review, one of the features unique to the GeForce2 MX (and the Quadro2 MX) is its support for something NVIDIA likes to call "Digital Vibrance Control" (DVC). We referred to DVC as nothing more than "glorified gamma control" in our MX review, and with the new Detonator3 drivers we finally got a chance to play around with DVC (the older drivers could not enable it).

Below is the sample picture NVIDIA provided us with that illustrates DVC's capabilities. The left side of the picture does not have DVC enabled but the right side of it does:

Our experience with DVC wasn't nearly as dramatic as NVIDIA's picture above illustrates. What we did notice was that adjusting the DVC settings (there are three levels of DVC that can be enabled, or it can be shut off completely) seemed to make the screen brighter, as if we had tweaked the individual gamma levels perfectly.


Finally, Digital Vibrance Control support

NVIDIA will argue that DVC doesn't work the same way a gamma slider does, because DVC supposedly adjusts the image data after it leaves the frame buffer but before it reaches the RAMDAC, thus adjusting the actual data rather than the image being output. While this may be true, the fact remains that DVC yields essentially the same end result as tweaking your gamma settings; it's simply easier.
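To make the distinction concrete, here is a minimal sketch in C (purely our own illustration, not NVIDIA's actual DVC algorithm) contrasting a gamma curve with a simple saturation-style boost applied to pixel data before it is sent out:

```c
/*
 * Illustrative sketch only: a gamma curve versus a crude saturation
 * boost of the kind described above. This is our own example, not
 * NVIDIA's actual Digital Vibrance Control algorithm.
 */
#include <math.h>
#include <stdio.h>

/* Classic gamma correction: remap each channel through a power curve. */
static unsigned char apply_gamma(unsigned char c, double gamma) {
    double v = pow(c / 255.0, 1.0 / gamma);
    return (unsigned char)(v * 255.0 + 0.5);
}

/* A simple "vibrance"-style tweak: push each channel away from the
 * pixel's average, raising color saturation rather than brightness. */
static unsigned char boost_saturation(unsigned char c, double avg, double amount) {
    double v = avg + ((double)c - avg) * amount;
    if (v < 0.0)   v = 0.0;
    if (v > 255.0) v = 255.0;
    return (unsigned char)(v + 0.5);
}

int main(void) {
    unsigned char r = 120, g = 80, b = 60;   /* a dull brownish pixel */
    double avg = (r + g + b) / 3.0;

    printf("gamma 1.3     : %u %u %u\n",
           apply_gamma(r, 1.3), apply_gamma(g, 1.3), apply_gamma(b, 1.3));
    printf("saturation 1.3: %u %u %u\n",
           boost_saturation(r, avg, 1.3),
           boost_saturation(g, avg, 1.3),
           boost_saturation(b, avg, 1.3));
    return 0;
}
```

Both adjustments intensify the image in ways that are hard to tell apart on a typical desktop, which matches what we saw when toggling DVC.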



TwinView

NVIDIA's Detonator3 drivers also feature a TwinView control UI. Once again, TwinView is another feature exclusive to the GeForce2 MX and the Quadro2 MX. Below is a description of TwinView from our GeForce2 MX (NV11) Review:

It was only a matter of time before another company caught on to Matrox’s success with their DualHead technology.  The GeForce2 MX’s second unique trait is that it supports a feature NVIDIA calls TwinView, basically their version of Matrox’s DualHead. 

Like Matrox’s DualHead, the MX’s TwinView technology enables support for two displays with a single chip.  The setup is a bit more flexible than DualHead because the GeForce2 MX supposedly has two integrated TMDS channels allowing for two digital flat panels to be driven off of a single GeForce2 MX chip.  The rest of the configurations are as follows:

  • Two CRT monitors with the use of a second external RAMDAC
  • Two analog flat panels
  • One digital flat panel and one analog flat panel
  • One digital flat panel and one RGB monitor
  • One digital flat panel and one TV
  • One RGB monitor and one TV
  • One RGB monitor and one analog flat panel (with second RAMDAC)
  • One analog flat panel and one TV

Again, just like DualHead, TwinView allows for various “modes” of operation with two monitors. 

The spanning mode allows for your desktop area to be spread across the two displays. 

And Clone mode displays the same image on both screens:

We were disappointed to see no support for application exclusive mode (which lets a single application be assigned to a specific display, including DVD playback on a TV as your second display) or application zoom mode (which allows the second display to act as a zoomed-in portion of the primary display). These two features were very popular among Matrox G400 users with DualHead, but they were left out of the Detonator3 drivers, making the new TwinView UI basically the same as Windows 9x/2000's default multi-monitor control panel, only with support for the single-card/multi-monitor solution that is the GeForce2 MX.



Much Improved FSAA Control Panels

NVIDIA has finally improved the FSAA control panels for Direct3D and OpenGL; you can see below exactly what we're talking about:

In Direct3D, NVIDIA actually lists the antialiasing method they're using at each slider point, unlike the older drivers which didn't list anything at all. Since even the middle setting is way too high for most users' preferences, it's a very good thing that NVIDIA lists the mode at each slider point.

Under OpenGL, you can finally adjust the FSAA method without having to edit registry keys.

For more information on the various FSAA settings, check out our FSAA & Image Quality Comparison.



The Test

For the testing, we used the same systems as were used for the GeForce2 GTS review, with updated drivers. In the case of the Radeon, we tested with the shipping drivers, with V-sync disabled and "Convert 32-bit textures to 16-bit" turned off. We only tested on one platform because we are comparing the performance of the drivers, not the video cards themselves.

We used a 1GHz Thunderbird in order to eliminate any potential CPU bottlenecks. There was no point in testing any of the 64MB cards since the performance difference between those cards and their 32MB counterparts is negligible as we've already proven in our 64MB GeForce2 GTS & GeForce 256 reviews. We didn't include GeForce2 Ultra scores because there is no reason to run the GeForce2 Ultra with older drivers.

We left out the 800 x 600 scores since they didn't really show anything with these cards other than a small drop in performance when compared to 640 x 480 frame rates.

Oh, and for those of you that are wondering, no, the new drivers don't fix the horrible looking sky we noticed with S3TC enabled.

Windows 98 SE Test System

Hardware

CPU(s): AMD Athlon (Thunderbird) 1GHz
Motherboard(s): ABIT KT7-RAID
Memory: 128MB PC133 Corsair SDRAM (Micron -7E chips)
Hard Drive: IBM Deskstar DPTA-372050 20.5GB 7200 RPM Ultra ATA 66
CDROM: Philips 48X
Video Card(s):
  • NVIDIA GeForce2 MX 32MB SDR (default clock - 175/166)
  • NVIDIA GeForce2 GTS 32MB DDR (default clock - 200/166 DDR)
  • NVIDIA GeForce 256 32MB DDR (default clock - 120/150 DDR)
  • NVIDIA GeForce 256 32MB SDR (default clock - 120/166)
  • NVIDIA Riva TNT2 Ultra 32MB (default clock - 150/183)
Ethernet: Linksys LNE100TX 100Mbit PCI Ethernet Adapter

Software

Operating System: Windows 98 SE
Video Drivers:
  • NVIDIA GeForce2 MX 32MB SDR - Detonator 5.22 / Detonator3 6.17
  • NVIDIA GeForce2 GTS 32MB DDR - Detonator 5.22 / Detonator3 6.17
  • NVIDIA GeForce 256 32MB DDR - Detonator 5.22 / Detonator3 6.17
  • NVIDIA GeForce 256 32MB SDR - Detonator 5.22 / Detonator3 6.17
  • NVIDIA Riva TNT2 Ultra 32MB - Detonator 5.22 / Detonator3 6.17

Benchmarking Applications

Gaming:
  • GT Interactive Unreal Tournament 4.04 AnandTech.dem
  • id Software Quake III Arena demo001.dm3



With the GeForce2 GTS, the lower resolutions perform noticeably slower using the new Detonator3 drivers, while the higher resolutions see a definite improvement. This is most likely due to more efficient memory bandwidth management in the new drivers, since the biggest difference comes at the higher resolutions in 32-bit color.
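As a rough back-of-the-envelope check (our own estimate, ignoring texture traffic and assuming one color write plus one Z read and one Z write per pixel), the raw framebuffer traffic at 1024 x 768 x 32 works out to roughly

\[
1024 \times 768 \times (4 + 4 + 4)\ \text{bytes} \approx 9.4\ \text{MB per frame.}
\]

At the 100+ fps these cards push in Quake III Arena, and once overdraw and texture fetches multiply that figure several times over, the demand quickly approaches the GeForce2 GTS's roughly 5.3GB/s of peak memory bandwidth, so a driver that schedules memory accesses more intelligently will show its biggest gains at high resolutions in 32-bit color.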

All GeForce2 GTS users should definitely upgrade to the newer drivers, unless of course you happen to play in lower resolutions and in 16-bit color.

Under Direct3D games the effect is similar: at lower resolutions there is a performance drop, but as you move up the resolution ladder, and especially in 32-bit color, you begin to see a very noticeable and welcome improvement in performance. Since our only Direct3D benchmark, Unreal Tournament, isn't memory bandwidth limited, it doesn't show the actual improvement you'll see in Direct3D games, which is why we didn't bother to include any UT graphs.

The performance improvement you see here in Quake III Arena is duplicated in most Direct3D games as well, showing that performance is improved across the board.



The GeForce DDR exhibits results similar to the GeForce2 GTS: at lower resolutions the Detonator3 drivers take a performance hit compared to the older Detonator2s, but at higher resolutions there is a much more tangible performance boost. The improvement is most noticeable at 1024 x 768 x 32, which also happens to be the sweet spot for the GeForce DDR.

Under Direct3D there is also a similar performance improvement.



Because the new Detonator3 drivers apparently manage memory better than the older Detonator2 drivers, the GeForce SDR starts receiving a performance boost as early as 640 x 480 x 32. This makes the drivers a definite must-have for GeForce SDR owners, and it helps breathe new life into the cards that first bore the GeForce name.

Running at 1024 x 768 x 32 is now a much more feasible option for GeForce SDR owners; however, at all resolutions above that in 32-bit color, even faster drivers can't make up for the fact that the GeForce SDR only has 2.7GB/s of peak memory bandwidth.
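For reference, that 2.7GB/s figure follows directly from the card's 128-bit memory bus and 166MHz SDR SDRAM:

\[
128\ \text{bits} \times 166\ \text{MHz} = 16\ \text{bytes} \times 166\ \text{million transfers/s} \approx 2.7\ \text{GB/s}
\]

The GeForce DDR's 150MHz DDR memory moves data on both clock edges of the same 128-bit bus, which is how it gets to roughly 4.8GB/s.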



The GeForce2 MX is closest to the GeForce SDR in terms of overall performance because both cards have the same amount of memory bandwidth (2.7GB/s); this also means that the effect of the Detonator3 drivers on the MX is just as impressive as it was on the GeForce SDR.



While NVIDIA only said the Detonator3 drivers would improve performance on GeForce-based cards, we decided to try them on a TNT2 Ultra. Performance dropped across the board, except for a very small boost at 1024 x 768 x 32.

If you don't own a card with the word GeForce in its name, don't use the new Detonator3 drivers.



Final Words

While we're not too happy that NVIDIA's NV20 won't be coming out until next spring, the release of the Detonator3 drivers has lessened the blow.

GeForce2 GTS owners should be happy with the new drivers because the card now outperforms the Radeon in almost all situations, even with the Radeon's updated drivers.

GeForce SDR owners should be extremely pleased with the Detonator3 drivers, as their memory bandwidth-crippled cards that once cost $300 now get a healthy boost in frame rate just by installing new drivers. Those of you who were lucky enough to hold off until DDR GeForce cards became available get a performance boost as well, so there should be no complaints from that end either.

Even the low-cost GeForce2 MX enjoys a healthy increase in performance since it is also very memory bandwidth limited.

The only users who aren't affected in a positive manner are the TNT/TNT2/TNT2 Ultra owners; sorry guys, but you're out of luck this time around. For everyone else, enjoy the free performance boost.
