GeForceFX Preview
By : Wayne Brooker, 18th November 2002

Introduction :

It's been a long time since NVIDIA played second fiddle to a competitor when it comes to raw graphics power, and I dare say there was no shortage of anxious meetings and sweaty palms at their Santa Clara headquarters as ATi's Radeon 9700 Pro pulled the rug from under their feet in one fell swoop. This anxiety was probably easier to stomach as they witnessed the coming together of their much anticipated NV30 GPU/VPU, and although they'd have been happier seeing it reach final silicon much sooner than it actually did, there must now be a certain satisfaction in knowing they've weathered a storm their rivals have yet to tackle: the move to a 0.13 micron process.

NVIDIA have today officially taken the wraps off their NV30, which we should perhaps now think of as NV3X as it represents just one in a series of discrete products aimed at once again regaining the graphics crown. And the name?... GeForceFX!

Anybody else disappointed with the name? There's merit in the idea of sticking with a winning formula, but the GeForce name has been with us for four generations now, and with NVIDIA's claims that this new GPU was to be a ground-up core redesign and a radically different product, it seems to me the time was right to ditch the aging GeForce moniker. Apparently the "FX" part of the name is a tribute to the 3dfx engineers who were involved in the product, and an acknowledgement that this is the first GPU to feature an element of 3dfx intellectual property. If you're wondering just what that intellectual property is then, like me, you'll have to guess (FSAA?), because NVIDIA's lips were clamped tighter than a bugle player's reaching for a high note!

The GeForceFX GPU is a radical change from what went before it in many ways, all of which I hope to at least touch on in this preview, but undoubtedly the angle NVIDIA are taking is that GeForceFX is a massive step towards real-time cinematic-quality 3D rendering, both in terms of its power and its precision. Forget Moore's Law when it comes to 3D hardware evolution: we're talking Moore's Law cubed! Of course, with no hardware available to test yet I have to confess to being little more than a mouthpiece for NVIDIA at this stage, and the claims and assumptions I'll make are based on nothing more than technical papers and fairly limited discussions. The proof of the pudding comes in the eating, but alas, today we're simply reading the recipe. Time to see what GeForceFX has to offer the next generation of game and gamer :

The Convergence of Film and Real-time Rendering :

Remember Virtua Fighter on the NV1? I do, and it blew me away! The card I ran at that time (1995-6) was the NV1-powered Diamond Edge 3D, which actually shipped with Virtua Fighter ported directly from the Sega Saturn and provided me with hours of cutting-edge entertainment, and believe me, those edges were sharp enough to cut you!

Looking back at it now, however, you really do see the limitations of its 50 thousand polygons/second and 1 million pixel ops/second architecture.

That was then, and I hardly need to tell you that things have moved on considerably since those first faltering steps into the world of accelerated 3D. Several companies played their part in the development of 3D graphics from a novelty niche market into a massive multi-billion dollar industry. First there was 3DFX (later 3dfx), who very much kickstarted mainstream interest in 3D for the home PC. By offering powerful graphics cards at sensible prices targeted directly at gamers, and with industry-wide support for their proprietary Glide API, Voodoo became the name on every respectable gamer's lips. Although several innovations followed Voodoo, the next real milestone in 3D graphics came with the introduction of hardware T&L (Transform and Lighting), as introduced by NVIDIA with their first GeForce cards.

The eventual demise of 3dfx left only three major names in the graphics industry, and one of these, Matrox, had decided to concentrate their efforts primarily on the business sector with what was undoubtedly some of the best 2D quality available anywhere. The 3D side of the market was left to NVIDIA and ATi to squabble over, and while both were churning out some incredible products it was always NVIDIA who had their nose ahead when it came to performance. I'm not going to get into the whole image quality issue here as it's not in my brief; the fact is that in terms of framerates, NVIDIA was just about untouchable.

R300 changed all that, and since July of this year ATi have held all the cards (excuse the pun). Meanwhile NVIDIA were struggling with one of the biggest technical challenges they've faced to date: the move from a 0.15 micron process to a 0.13 micron process using copper interconnects. This endeavor was apparently the only reason behind NV3X's delay to market, and although it left them in the unfamiliar position of underdog they felt the time was right to tackle the move head-on and get it behind them. The deed now complete, they have emerged faster, leaner and ready to ramp up those speeds.

But surely the struggle to adapt to the new 0.13 micron process and the lessons learned from it will benefit ATi too? I asked NVIDIA's Adam Foat just how much ATi stand to gain from NVIDIA's trials and tribulations, and though he admitted that some of the lessons learned by TSMC would serve to ease the competition's transition, the vast majority of the technical difficulties were related to porting the architecture over to the new process and so won't directly benefit others facing the same proposition.

The Evolution of Real-Time 3D Rendering

The NV3X GPU

At the heart of the GeForceFX is the NV3X GPU, and what a monster it is. Bedecked with some 125 million transistors, it compares to just 55 million for the Northwood-core P4 or 37.2 million for AMD's Thoroughbred!

To further streamline the GPU, NVIDIA have also decided to use a flip-chip package design, making them currently the only graphics company to do so. This seemingly minor step is actually a massive undertaking involving some serious design challenges, such as wire-length precision, I/O logic placement and I/O bus balancing, tasks that often have to be done manually at the design stage due to the inadequacies of most commercially available automated design tools.

Even at this stage we don't know a whole lot about NV3X. Final core speeds either weren't confirmed or weren't being shared at the time of this preview, but we can certainly expect speeds of over 400MHz and probably nearer 500MHz.

A Look At The Features

AGP 8X :

At this stage of the game, probably the most expected and least exciting of NV3X's features is the AGP 8X interface. I say "at this stage" because we've not really had a chance to see it in action running software designed to take advantage of it yet. Its inclusion in the GeForceFX specs is pretty much compulsory if it's to take advantage of future software titles, particularly games, as there is already at least some support for 8X at the professional end of the software market.

As we've discussed before, the 8X interface doubles the previous 1.1GB/sec bandwidth to 2.1GB/sec, primarily by doubling the effective transfer rate. Not exactly a deal-breaker today, but a theoretically big feature to have for tomorrow!
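
For the curious, here's a rough back-of-the-envelope sketch of where those headline figures come from, assuming the standard 32-bit AGP data bus and 66MHz base clock. The little helper function below is purely illustrative, not anything from NVIDIA's drivers or the AGP spec itself:

/* Rough theoretical AGP bandwidth: bus width x base clock x transfers per clock.
   Assumes a 32-bit (4 byte) data bus and a 66.67MHz base clock. */
#include <stdio.h>

static double agp_bandwidth_gbs(double base_clock_mhz, int transfers_per_clock)
{
    const double bus_width_bytes = 4.0;   /* 32-bit AGP data bus */
    double mb_per_sec = base_clock_mhz * transfers_per_clock * bus_width_bytes;
    return mb_per_sec / 1000.0;           /* MB/sec -> GB/sec */
}

int main(void)
{
    printf("AGP 4X: %.2f GB/sec\n", agp_bandwidth_gbs(66.67, 4));  /* ~1.07 GB/sec */
    printf("AGP 8X: %.2f GB/sec\n", agp_bandwidth_gbs(66.67, 8));  /* ~2.13 GB/sec */
    return 0;
}

Run it and you get roughly 1.07GB/sec for 4X and 2.13GB/sec for 8X, which is where the commonly quoted 1.1GB/sec and 2.1GB/sec figures come from.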

Before we move on, here's a table that directly compares the GeForceFX to the GeForce4 Ti 4x00 series (NV25) GPU.

<<< Page 2 - Features Continued >>>