nForce2 Preview : page 2
What's new in the Northbridge?
Like the original nForce, the nForce2 northbridge will come in two versions, the IGP and the SPP. The two are identical in every respect except that the IGP has integrated graphics while the SPP does not. Unlike the original nForce, there is no low-end nForce2 northbridge model with a single 64bit memory controller - it's a full 128bits throughout. However, NVIDIA still plans to sell the original nForce IGP, which does have the option of a single 64bit memory controller. The nForce1/2 SPP/IGP chipsets are pin-compatible with the nForce2 southbridge, which lets manufacturers mix chipsets and features for market segments top-to-bottom, from the enthusiast segment down to the value segment.
The memory architecture underlying the nForce northbridge (IGP) was revolutionary in how it improved memory performance. The nForce borrowed the crossbar memory architecture NVIDIA introduced with the GeForce3 - itself similar to designs developed by SGI - and branded it TwinBank. It works by using two independent 64bit memory controllers (128bits total) that can read and write to memory simultaneously. NVIDIA also introduced a dynamic adaptive speculative pre-processor (DASP), which acts like a hardware prefetcher to reduce memory latency. These features doubled memory bandwidth - at least in theory. For a real-world analogue, look at Intel's i850 chipset, which runs RDRAM in dual-channel mode. With 133MHz DDR memory in dual-channel mode, the nForce has a total of 4.2GB/s of memory bandwidth available to the northbridge - the fastest rated memory subsystem of any Athlon platform at the time. In practice, though, VIA's KT266A somehow came out ahead in memory benchmarks according to many online reviews.

The nForce2 carries a second-generation version of TwinBank, which NVIDIA has given the more palatable name DualDDR 400. According to NVIDIA, DualDDR is more compatible and stable with the many types of DDR memory on the market today than the original nForce was. It now officially supports DDR333 (166MHz bus) alongside DDR200 and DDR266, plus unofficial DDR400 (200MHz bus). DDR400 is a DDR-1-based signalling method and, consequently, is not officially sanctioned by JEDEC; it is frowned upon because the standard is expected to be deprecated in favor of DDR-II signalling in the near future. But as usual, the market demands better and faster memory even if the industry isn't quite ready for it. To prepare for this, NVIDIA is working closely with memory manufacturers such as Samsung, Crucial, and Micron to make sure the nForce2 is compatible with DDR400 memory modules at release time.
When in dual-channel mode (two DIMM modules populated), the effective bandwidth is a staggering 6.4GB/s! If you remember how long it took memory to step up from PC66 to PC100 and then to PC133, NVIDIA is pushing memory performance at a Moore's-law pace. Finally we'll see some excitement in this space against Rambus's RDRAM, since memory has long been one of the PC's standing bottlenecks. Hopefully NVIDIA holds true to their promise to make compatibility and performance a non-issue.
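The peak figures quoted above all come from the same simple arithmetic: bus clock, times two transfers per clock (that's the "double" in DDR), times the bus width in bytes, times the number of channels. A quick sketch (the function name and 1 GB = 1000 MB marketing convention are my own assumptions):

```python
def ddr_bandwidth_gbs(bus_mhz, channels=2, bus_width_bits=64):
    """Theoretical peak DDR bandwidth in GB/s:
    bus clock x 2 transfers/clock x bytes/transfer x channels.
    Uses the marketing convention of 1 GB/s = 1000 MB/s."""
    bytes_per_transfer = bus_width_bits // 8  # 64bit bus = 8 bytes
    return bus_mhz * 2 * bytes_per_transfer * channels / 1000

# Dual-channel DDR266 (133MHz bus), as on the original nForce:
print(ddr_bandwidth_gbs(133))  # -> 4.256, the ~4.2GB/s figure
# Dual-channel DDR400 (200MHz bus), DualDDR 400 on the nForce2:
print(ddr_bandwidth_gbs(200))  # -> 6.4
```

Note these are theoretical ceilings; as the KT266A comparison above shows, real-world throughput depends heavily on controller efficiency.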
NVIDIA's unified driver architecture, top-to-bottom integration, and stable graphics drivers are a perfect fit for integrated graphics. The nForce owned integrated graphics and still does to this day. NVIDIA bought itself some more leeway with the nForce2 by integrating its GeForce4 MX graphics core into the northbridge (IGP) where there was once a GeForce2 MX. The "4" in the name GeForce4 MX is misleading: it is more closely related to the GeForce2 architecture than to the GeForce4. The key missing feature is pixel shader support. Pixel shaders are fundamental to graphics in the immediate future, and their absence alarmingly contradicts NVIDIA's furious adoption of pixel shaders elsewhere. NVIDIA told me that a switch to the GeForce4 (non-MX) architecture was not cost-effective or timely enough to justify in this cycle of the nForce2. You can bet on integrated graphics comparable to or exceeding GeForce4 levels in next year's nForce.
The GeForce4 MX was built for mainstream users. The nForce2 can be paired with a DFP/DVI port, a secondary VGA out, or S-Video/Composite out for more flexible viewing options. Paired with TwinView, one can create multi-monitor setups entirely on-board, without a secondary video card. A real-world example would be using the nForce2 as a DVD player or a DivX video player. For software DVD users, the GeForce4 MX's video processing engine (VPE) includes full hardware-based iDCT and motion compensation acceleration for MPEG video; in comparison, a regular GeForce4 doesn't have iDCT acceleration. The VPE, from what I'm told, is a hold-over from ex-3dfx Voodoo3/5 video technology. If you want to view your desktop on a television, you won't have to purchase any additional video hardware. Final output configurations will depend on what third-party board manufacturers deem worth shipping.
(Warning: tangential rant follows.) I have mixed feelings about S-Video/Composite out on NVIDIA boards. Video underscans too much (big black borders around the viewing area), and the video driver has no option to set overscan. I understand it's better not to cut off viewing area if you're surfing the web on a television, but when you want to watch a movie there should be a smart toggle to switch to overscan. This makes viewing DVDs in software like PowerDVD very annoying. My television in particular pincushions inward because of the huge black borders. ATI RADEON AIW boards have much smaller underscan borders. Until NVIDIA fixes this, I would recommend a hardware DVD decoder board like Sigma Designs' Hollywood Magic or Xcard for DVD playback, which is superior in overscan handling, contrast, and color to NVIDIA's composite out. Hopefully NVIDIA's video team is reading this. ;)
Also, NVIDIA's Personal Cinema product still can't hook directly into an nForce or nForce2 motherboard; you have to get a GeForce2/4 board equipped with the correct connector. This is frustrating, and it makes me wonder why NVIDIA doesn't integrate the Personal Cinema into the nForce2, which they claim is their digital gateway.
With the upcoming AMD Hammer processor, NVIDIA's IGP and SPP will have a short life-span. I predict they will be totally phased out by Q2-Q3 2003, since AMD is building northbridge functionality into the Hammer processor itself. That reduces complexity on the motherboard but potentially takes revenue away from NVIDIA. Expect NVIDIA to have an integrated graphics chipset ready for Hammer by then, with its focus shifting toward the southbridge and an emphasis on multimedia.