The GeForce FX 5900, or NV35 as we have come to know it until now, is nVidia's latest hardware offering to the masses. Boasting new features such as CineFX 2.0, Intellisample HCT and UltraShadow technology, nVidia is hoping to blow our socks off with stellar performance and stellar graphics. Can they do it? Right now many of us are unsure. The ATi Radeon 9700 launched back in September of 2002 and was simply unrivaled in both visual quality and raw horsepower. nVidia countered a few months later with the GeForce FX 5800 series, but it soon became evident that NV30 was little if any better than the Radeon 9700; worse yet, it was very expensive and almost impossible to obtain. As we now know, the 5800 series never really made it to full production, and by the time ATi introduced the Radeon 9800 (R350), the 5800 already had one foot in the grave and the other would soon follow.
nVidia eventually made the demise of NV30 official and started dropping hints that production of NV35 would be stepped up, with the part available much sooner than anticipated. That brings us to where we are today, with a review of the BFG GeForce FX 5900 Ultra 256MB Asylum video card. Let's take a quick look at some of the new features you can expect to see with this card.
This feature is one that nVidia has implemented with the intention of keeping pace with the latest Hollywood movie visual effects. To bring a richer and more realistic experience to your PC, nVidia introduces the CineFX 2.0 Engine. Here are a few highlights:
"Advanced CineFX features include the support of 1024 instructions in a single rendering pass, allowing for complex effects that aren’t practical in any other architectures. For example, volumetric effects such as smoke, fur, fire and grass add significant depth and realism to a scene, but require multiple instructions to achieve. What CineFX achieves in one rendering pass takes competing products many more. Procedural texture support obviates the need for spending video memory on large texture maps, and allows for subtle, realistic differences across surfaces. Complex lighting can dramatically improve the realism of images, but traditionally adds to rendering time. With the GeForce FX GPUs all of these gorgeous enhancements are possible without sacrificing performance.
In addition, shaders can now handle multiple textures in one pass for optimized execution, making layered or mixed effects such as paint peeling off a metallic surface possible. The CineFX engine allows fetching from up to 16 unique texture maps in a single pixel shader program. These textures can be anything that defines surface or subsurface properties such as bump maps, displacement maps, gloss/specular maps, environment maps, shadow maps and albedo maps."
Sounds good, huh? Hopefully we'll soon start seeing games that can take advantage of the CineFX 2.0 Engine and give us gamers something to shout about. Half-Life 2, anyone?
If you've seen the demo movie, this could very well be one of the most anticipated games of the year, and the screen stills do not even do the game justice.
Intellisample HCT is nVidia's new compression technique, which claims to deliver up to a 50% increase in efficiency when compressing color, texture and z-data, powering unprecedented visual quality at resolutions up to 1600 x 1200. We have been right on the cusp of true 1600 x 1200 gaming for a while now, but will the new cards be able to deliver the goods on a game such as HL2 or D3? I'll reserve judgment until I see it in action. Here's some verbiage from the nVidia website regarding Intellisample HCT.
"What's more, the innovative architecture of GeForce FX includes an advanced and completely transparent form of antialiasing: lossless depth z-buffer and color compression technology. The result? Essentially all modes of antialiasing are available at all resolutions without any performance hit. Greatly improved image quality, with no drop in frame rate!
Intellisample also incorporates the most advanced anisotropic filtering available. When a textured surface is close to edge-on with your viewpoint, the detail and accuracy of that texture drops drastically. NVIDIA's proprietary anisotropic filtering eliminates this distortion adaptively by determining how extensive distortion is likely to be, and applying its filtering muscle proportionally, so you get every last drop of quality and performance possible."
Again, quite a big boast from the nVidia crew. Can they deliver? We shall see. Finally, here are some comments on nVidia's UltraShadow technology.
"UltraShadow gives programmers the ability to calculate shadows much more quickly by eliminating unnecessary areas from consideration. With UltraShadow, programmers can define a bounded portion of the scene (often called depth bounds) that limits calculations of lighting source effects to objects within a specified area. By limiting calculations to the area most affected by a light source, the overall shadow generation process can be greatly accelerated. Programmers can fine-tune shadows within critical regions, create incredible visualizations that effectively mimic reality, and still achieve awesome performance for fast-action games. The accelerated shadow generation can also free up time that can be allocated to other sophisticated but time-consuming effects.
With the power of advanced technologies like NVIDIA UltraShadow, developers can more easily and efficiently translate their artistic visions into compelling scenes that border on reality. Complex shadow effects that employ multiple light sources are now possible at high frame rates, without bogging down gameplay. For unparalleled, cinematic-style environments in your gaming, and stunning, heart-pounding experiences, equip yourself with a GeForce FX 5900 GPU and the power of UltraShadow."
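The depth-bounds idea described in the quote above is simple to illustrate in plain code: if a light source can only affect geometry within a known depth range, any pixel whose stored depth falls outside that range can be rejected before any shadow math runs for it. Here is a minimal software sketch of that concept; the function name and numbers are mine and purely illustrative, not actual driver or GPU code.

```python
# Illustrative sketch of a depth-bounds test (hypothetical, not real driver code).
# depth_buffer holds one depth value per pixel in [0, 1]. A light's shadow
# volume is known to affect only depths within [z_min, z_max], so every other
# pixel can be skipped before any shadow computation is performed for it.

def pixels_needing_shadow_work(depth_buffer, z_min, z_max):
    """Return indices of pixels whose depth lies inside the light's depth bounds."""
    return [i for i, z in enumerate(depth_buffer) if z_min <= z <= z_max]

depths = [0.10, 0.45, 0.50, 0.80, 0.95]        # hypothetical depth buffer
print(pixels_needing_shadow_work(depths, 0.40, 0.60))   # [1, 2]
```

Only two of the five pixels fall inside the bounds, so shadow work for the other three is skipped entirely, which is exactly the savings nVidia is claiming here.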
All in all, nVidia has quite a few new goodies up its sleeve with the GF FX 5900, but these are the big three. Let's check out some hardware-based improvements we get with this NV35 part.
One of the biggest advantages ATi cards have held over nVidia-based cards is the sheer amount of raw memory bandwidth they have been able to produce. Sporting a 256-bit memory bus, the Radeon 9700 and 9800 have been untouchable for the most part, and even the GF FX 5800 was lacking when it came to memory bandwidth. Let's take a quick look at the memory bandwidth specs of some of the more popular cards out there.
nVidia GF4 MX440 - 6.4GB/sec
nVidia GF4 Ti4200 - 8.0GB/sec
SiS Xabre 600 - 10.08GB/sec
nVidia GF4 Ti4600 - 10.4GB/sec
nVidia GF FX 5600 Ultra - 12GB/sec
nVidia GF FX 5800 Ultra - 16GB/sec
ATi Radeon 9700 Pro (stock) - 19.8GB/sec
ATi Radeon 9800 Pro (stock) - 21.8GB/sec
nVidia GeForce FX 5900 Ultra - 27.2GB/sec
So we see nVidia has finally come around and implemented the 256-bit memory bus. This gives the 5900 Ultra more bandwidth and an overall performance increase.
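The figures in the list above fall straight out of simple arithmetic: peak theoretical bandwidth is the bus width in bytes multiplied by the effective data rate. A quick sketch below reproduces a few of the numbers; the memory clocks used are the commonly cited stock clocks for these cards, which I'm supplying myself rather than taking from the list.

```python
# Peak memory bandwidth = (bus width in bytes) x (effective transfers per second).
# DDR and DDR II both transfer twice per clock, hence pumps=2.

def bandwidth_gb_s(bus_width_bits, mem_clock_mhz, pumps=2):
    """Theoretical peak memory bandwidth in GB/s."""
    bytes_per_transfer = bus_width_bits / 8
    transfers_per_sec = mem_clock_mhz * 1e6 * pumps
    return bytes_per_transfer * transfers_per_sec / 1e9

# GF FX 5800 Ultra: 128-bit bus, 500 MHz DDR II (assumed stock clock)
print(round(bandwidth_gb_s(128, 500), 1))   # 16.0
# Radeon 9700 Pro: 256-bit bus, 310 MHz DDR (assumed stock clock)
print(round(bandwidth_gb_s(256, 310), 1))   # 19.8
# GeForce FX 5900 Ultra: 256-bit bus, 425 MHz DDR (assumed stock clock)
print(round(bandwidth_gb_s(256, 425), 1))   # 27.2
```

Notice that the 5800 Ultra's faster DDR II couldn't overcome its narrow 128-bit bus: doubling the bus width buys more bandwidth than the clock bump did, which is exactly why the move to 256-bit matters so much here.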
Additionally, the NV35 GPU is manufactured on a 0.13-micron process, is DX9 native, and supports up to 256MB of DDR memory. Curiously gone is the DDR II we saw on the GF FX 5800 Ultra series cards. Rumors of expensive production and low yields may have sent nVidia back to the drawing board, though that apparently hasn't affected ATi, which recently introduced a Radeon 9800 Pro version with 256MB of DDR II.
OK, now that we have most of the technical aspects of the card covered, let's move on.