Review: Elsa Geforce 2 Ultra

Elsa have a long tradition of making high-performance graphics cards. I reviewed some of their other GeForce2 offerings a while back; here I'm looking at the GeForce2 Ultra, the fastest GeForce2-series card made. It has 64MB of 4ns DDR memory running at 460MHz effective, backed up by the GeForce2 core running at 250MHz, all packaged together with some fairly large green heatsinks. Unfortunately there is no TV output as standard on the Gladiac Ultra, although this is available as an optional TV-out module. There are plenty of articles on the GeForce2 core architecture and T&L already out there on the net, so here I'll just be looking at how well it all performs.

Test Rig

Soltek SL-75KAV (VIA KT133A chipset)
Crucial PC133 RAM, CAS 2
AMD Athlon Thunderbird
IBM Deskstar 75GXP, 30GB
SoundBlaster Live!
Adaptec SCSI card
Plextor 40-speed SCSI CD-ROM
19" CTX monitor
Microsoft Intellimouse Explorer Optical
Microsoft Natural Keyboard

Running Win98 SE with the VIA 4-in-1 service pack, AGP 4x enabled and the AGP aperture set, using the Nvidia Detonator drivers.

The Hardware

Well, that's enough of the greenery; let's see how well it performs.

The Specs

Chip: NVIDIA GeForce2 Ultra, processor speed 250MHz
Memory: 64MB DDR SDRAM, 4ns access time, 460MHz effective memory clock
RAMDAC/pixel clock: 350MHz
BIOS: VESA BIOS support
Bus system: AGP 2x/4x, Fast Writes
Features
Graphics standards: Direct3D, OpenGL, DirectX 7, DirectDraw, DirectShow
2D features: 2D acceleration optimized for 32, 24, 16, 15 and 8-bit colour depths, hardware cursor in TrueColor, multi-buffering (2x, 3x and 4x for flowing movement and video playback)
3D features: engine with HyperTexel architecture, optimized Direct3D and OpenGL acceleration, full DirectX 7 support, Z and stencil buffer, single-pass multi-texturing, anti-aliasing, high-quality texture filtering including anisotropic, advanced per-pixel texturing for perspective correction, fog and depth cueing, texture compression
HDTV and DVD playback: extended motion compensation for full-screen video playback in all DVD and HDTV resolutions, video acceleration for MPEG-1, MPEG-2 and Indeo
TV-out (optional): connection for S-Video and Composite/FBAS (with included adapter); PAL and NTSC
Standards: DPMS, DDC2B, Plug&Play
Connectors: Monitor: VGA D-shell (15-pin); TV-out (optional): mini-DIN (4-pin), Cinch using included adapter
Dimensions: ATX format, not including mounting bracket

Installation

As always, I installed the graphics card alongside a fresh install of Windows. The GeForce cards all use the same drivers; having just been using the Detonator drivers with a GeForce3, I chose to use the same set for this review. Installation was simply a matter of putting the card in and then browsing to the correct driver directory. DirectX 8 was also installed to run the DirectX 8 benchmarks.

Control Panels

Benchmarks

As this is the first GeForce2 card that I've reviewed in a while, it is somewhat tricky to decide what I should use as a baseline. I decided to compare it with a few different cards: the GeForce2 MX, which I reviewed a while ago, plus the Kyro II and GeForce3 for some of the benchmarks. This may not be a particularly fair comparison, but it does show how the Ultra stacks up against currently available hardware.

Vulpine GL Mark

Vulpine GLMark is a free download from http://www.vulpine.de/; it's around 39MB. The benchmark takes quite a while to run, but it displays some pretty nifty graphics. It also has options specifically for the GeForce3, but these are greyed out when a GeForce2 is installed, so I ran the default benchmark and chose the full-screen setting.

Settings: 32-bit, full screen, standard OpenGL, texture compression on, high detail

[Chart: Vulpine GLMark results for the GeForce2 Ultra - minimum 19 FPS, plus maximum and average framerates]

GLMark obviously stresses the GeForce2 Ultra quite a bit; the minimum framerate drops to 19FPS. The Ultra is the fastest card around in this particular benchmark apart from the GeForce3, which is quite a bit quicker.

Quake 3

Here are the obligatory Quake 3 benchmarks, this time showing performance against some widely available cards, namely the Kyro II, GeForce2 MX, GeForce2 Ultra and the latest card, the GeForce3.

[Chart: Quake 3 Demo 1 (EHQ settings) - Kyro II, GeForce2 MX, GeForce2 Ultra, GeForce3]

Well, there you have it: the performance of a few currently available cards. As you can see, the GeForce2 Ultra (in green on the graph) is only just behind the GeForce3, while the GeForce2 MX trails at the higher resolutions. The Kyro II puts up a good fight against the other cards, but ultimately the GeForce3 wins.

3DMark

This is the benchmark everyone's talking about this year; although again it has some GeForce3-only features, it gives any graphics card a good workout. The Lobby scene in 3DMark uses the same game engine and graphics as Max Payne, a game that's due out later this year. So anyway, without further ado, here are the results.

[Chart: 3DMark scores - GeForce3, GeForce2 Ultra, Kyro II]

All three graphics cards were benchmarked on the same system. The Nature test was disabled for the GeForce3, as it counts towards the final score; the Kyro II and GeForce2 Ultra cannot run the Nature test, so it is skipped by default. As you can see, the GeForce3 pulls ahead, and the Kyro II gets left in the dust by both GeForces.

Unreal Tournament

I ran Unreal Tournament and benchmarked with the Thunder demo; all detail settings were on high, in 32-bit colour, running in D3D mode.

[Chart: Unreal Tournament, Thunder demo - GeForce2 Ultra]

Basically, with the GeForce2 Ultra you can play UT at any resolution you want; it works fine with all the settings maxed out on this PC.

2D Performance

I had no problems with 2D performance, including DVD and DivX playback. 2D performance, as far as I'm concerned, is very good on all current Nvidia GeForce-based cards, and will allow you to run any application you wish.

Video Mark

[Chart: Video Mark results for the GeForce2 Ultra - quality, performance and feature scores]

Overclocking

The GeForce2 Ultra is already clocked at 250MHz core and 460MHz memory, so it's running pretty darn fast at default settings. There is always the option to tweak the performance a bit more with the "coolbits" registry tweak, which lets you adjust the speed of the core and memory. I'm sure everyone reading this is familiar with coolbits by now, so I won't bore you with a screenshot. I managed to push the Elsa a fair way up on both core and memory; at the highest settings Quake 3 would run but there were quite a few graphics glitches on the screen, and when I tried 3DMark it just locked up straight away. I backed the card down and found that it ran stably at a more modest overclock.
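
For anyone curious how the coolbits tweak is applied, here's a minimal sketch of doing it programmatically; the registry path and value are assumptions based on the commonly circulated coolbits .reg file for Detonator-era drivers, so treat it as illustrative rather than definitive:

```python
# Hedged sketch: enable the hidden Nvidia clock sliders ("coolbits") on a
# Detonator-era Windows install. Path and value are assumed from the usual
# coolbits .reg file; normally you would just merge that .reg file instead.
import winreg

KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"  # assumed location

def enable_coolbits(value: int = 3) -> None:
    """Create/open the NVTweak key and set the CoolBits DWORD."""
    key = winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH)
    try:
        winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, value)
    finally:
        winreg.CloseKey(key)

if __name__ == "__main__":
    enable_coolbits()  # the clock controls show up in the driver panel after a reboot
```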

I ran 3DMark again with the card overclocked to show the gains that can be made; although they're not much, the extra performance is a free upgrade, which can't be bad (O:

[Chart: 3DMark - GeForce2 Ultra at default clocks vs overclocked]

Conclusion

The Elsa Gladiac GeForce2 Ultra is a very good gaming card; it runs pretty much every current game with the detail settings maxed out. Obviously it lacks the features of its new big brother, the Gladiac GeForce3, but the Ultra sells at a much more reasonable price, and as such it's definitely a good buy. Since it was released the GeForce2 Ultra has dropped considerably in price, and as Elsa cards are already priced very competitively, now is probably a very good time to upgrade from that old Voodoo3 that just isn't up to the job these days. The GeForce2 Ultra leaves the MX-based cards standing in terms of high-resolution gaming, and as the 32MB GeForce2 GTS is becoming increasingly rare, the Ultra is the card in the middle, sandwiched between the MX and the GeForce3. Right now there are very few games that use the GeForce3's features; in a year's time the story will be very different, but right now the Gladiac Ultra will play just about all the games and run all the applications that you could possibly need.

If you crave the ultimate in speed then look at the Elsa Gladiac GeForce3, as it is one of the best-priced GeForce3s around; if just plain very fast will do, then make sure you look at this card, the Gladiac Ultra. With street prices putting the Gladiac Ultra around £80 less than the Gladiac GeForce3, the decision is yours.

Source: https://m.hexus.net/tech/reviews/graphics/elsa-geforceultra/

NVIDIA Strikes Back - The GeForce2 Ultra 3D Monster

The GeForce2 Ultra Card

This funky piece of expensive hardware represents NVIDIA's new GeForce2 Ultra reference design. Please note that it's obviously derived from the Quadro2 design, although it's a tiny bit bigger, mainly due to the larger power supply circuitry of the GeForce2 Ultra.

The card comes with the following specs:

  • Core clock: 250 MHz
  • (the GeForce2 GTS has a core clock of 200 MHz)
  • Memory clock: 230 MHz (460 MHz DDR)
  • (the GeForce2 GTS runs its memory at only 166 MHz, i.e. 333 MHz DDR)

This is good for a theoretical 1 GPixel/s pixel fill rate and 2 GTexel/s texel fill rate. The memory bandwidth is a whopping 7.36 GB/s, which marks an increase of 38% over the GeForce2 GTS. The other features are, as mentioned above, identical to the GeForce2 GTS.
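
As a quick sanity check on those headline figures, here's the back-of-envelope arithmetic, assuming the reference 250 MHz core, four pixel pipelines with two TMUs each, and a 128-bit DDR bus at 230 MHz (all figures as quoted above):

```python
# Back-of-envelope GeForce2 Ultra throughput, using the reference clocks quoted above.
core_hz = 250e6         # core clock
pipelines = 4           # pixel pipelines
tmus_per_pipe = 2       # texture units per pipeline
mem_hz = 230e6          # physical memory clock (460 MHz effective DDR)
bus_bytes = 128 // 8    # 128-bit memory bus

pixel_fill = core_hz * pipelines              # ~1.0 Gpixels/s
texel_fill = pixel_fill * tmus_per_pipe       # ~2.0 Gtexels/s
bandwidth = mem_hz * 2 * bus_bytes            # DDR transfers twice per clock -> ~7.36 GB/s

print(f"{pixel_fill/1e9:.2f} Gpixel/s, {texel_fill/1e9:.2f} Gtexel/s, {bandwidth/1e9:.2f} GB/s")
# Against the GTS (166 MHz memory -> ~5.3 GB/s) that is indeed roughly a 38% increase.
```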

The shaggadelicly nice green heat sinks cover the most important feature of GeForce2 Ultra, the memory. We removed one of the heat sinks to find the following chip underneath:

A four-nanosecond (4ns) rated DDR SDRAM chip from ESMT. This chip is therefore rated for no less than 250 MHz, or '500 MHz DDR'. This is obviously a bit surprising, since it means that this memory is a bit 'too good' for only 230 MHz operation. Our overclocking checks proved that point, as the card's memory did indeed go up to a 250 MHz clock without problems, offering some 8 GB/s of memory bandwidth at that speed. The GeForce2 chip itself was also not too impressed with the 250 MHz it's meant to run at by default; we got it a fair bit higher, which unfortunately still doesn't mean that much, because even a memory bandwidth of 8 GB/s is not quite good enough to feed the big data hunger of a GeForce2 chip running beyond 250 MHz.

NVIDIA told us that it wasn't easy at all to get all the memory together for a reasonable launch of the GeForce2 Ultra. Therefore some cards might come with memory of a slower speed rating, and the memory manufacturers will most likely vary as well. I guess that this is the reason why NVIDIA preferred to stay on the safe side and clocked the memory at 'only' 230 MHz (460 MHz DDR).

The New Detonator 3 Drivers

So much for the hardware behind the new GeForce2 Ultra. There is something else that is particularly important for the excellent performance of this new 3D solution: NVIDIA's driver team under Dwight "I never sleep and my home is my office" Diercks is today the best 3D driver team in existence. Period!

After NVIDIA found out that the air is getting a bit thin with ATi up there in the high-res/true-color heights, the efforts were increased; Dwight and his team completely disregarded such a wasteful thing as sleep (that's my opinion too), and the Detonator 3 driver set was born.

Silvino is doing a dedicated article on this topic, so that I won't tell you too much now. What I can say however is that this new driver set improves the performance of all supported chips (from TNT right up to GeForce2 Ultra) in a tremendous fashion.

If there is one complaint then it is the missing Linux Detonator 3 driver. Dwight promised me this driver for the end of the month.

Test Setup

Graphics Cards and Drivers: Radeon DDR 64 MB; GeForce2 Ultra; GeForce2 GTS 64 MB; GeForce2 GTS; GeForce2 MX; GeForce DDR 32 MB; GeForce SDR; RIVA TNT2 Ultra; RIVA TNT2; RIVA TNT; Voodoo5

Platform Information
CPU: Intel Pentium III 1 GHz
Motherboard: Asus CUSL2 (beta 02 BIOS)
Memory: Wichmann WorkX PC133, CAS2
Network: Netgear FA310TX

Environment Settings
OS version: Windows 98 SE A
Quake 3 Arena: retail version; command line = +set cd_nocd 1 +set s_initsound 0; OpenGL FSAA set to 2x SuperSampling or FSAAQuality 1
Expendable: downloadable demo version; command line = -timedemo; D3D FSAA set to 4x SuperSampling
Evolva: rolling demo; standard command line = -benchmark; bump-mapped command line = -benchmark -dotbump
MDK2: downloadable demo version; T&L = On, trilinear, high texture detail
Source: https://www.tomshardware.com/reviews/nvidia-strikes-back,html

GeForce 2 series

For GeForce cards with a model number of 2X0, see GeForce 200 series. For GeForce cards with a model number of 20X0, see GeForce 20 series.

Series of GPUs by Nvidia


Top: Logo of the GeForce 2 series
Bottom: Nvidia GeForce 2 GTS with its cooler removed, showing the NV15 die

Release date: mid-May 2000[1]
Codename: NV11, NV15, NV16
Architecture: Celsius
Models:
  • GeForce2 MX series
  • GeForce2 GTS series
  • GeForce2 Pro series
  • GeForce2 Ti series
  • GeForce2 Ultra series
Entry-level: MX
Mid-range: GTS, Pro
High-end: Ti, Ultra
Direct3D: Direct3D 7.0
OpenGL: OpenGL 1.2 (T&L)
Predecessor: GeForce 256
Successor: GeForce 3 series

The GeForce 2 series (NV15) is the second generation of Nvidia's GeForce line of graphics processing units (GPUs). Introduced in 2000, it is the successor to the GeForce 256.

The GeForce 2 family comprised a number of models: GeForce 2 GTS, GeForce 2 Pro, GeForce 2 Ultra, GeForce 2 Ti, GeForce 2 Go and the GeForce 2 MX series. In addition, the GeForce 2 architecture is used for the Quadro series on the Quadro 2 Pro, 2 MXR, and 2 EX cards with special drivers meant to accelerate computer-aided design applications.

Architecture

Die shot of a Geforce 2 GPU

The GeForce 2 architecture is similar to the previous GeForce line but with various improvements. Compared to the 220 nm GeForce 256, the GeForce 2 is built on a 180 nm manufacturing process, making the silicon denser and allowing for more transistors and a higher clock speed. The most significant change for 3D acceleration is the addition of a second texture mapping unit to each of the four pixel pipelines. Some say[who?] the second TMU was there in the original GeForce NSR (NVIDIA Shading Rasterizer) but dual-texturing was disabled due to a hardware bug; NSR's unique ability to do single-cycle trilinear texture filtering supports this suggestion. This doubles the texture fillrate per clock compared to the previous generation and is the reasoning behind the GeForce 2 GTS's naming suffix: GigaTexel Shader (GTS). The GeForce 2 also formally introduces the NSR (Nvidia Shading Rasterizer), a primitive type of programmable pixel pipeline that is somewhat similar to later pixel shaders. This functionality is also present in the GeForce 256 but was unpublicized. Another hardware enhancement is an upgraded video processing pipeline, called HDVP (high definition video processor). HDVP supports motion video playback at HDTV resolutions (MP@HL).[2]

In 3D benchmarks and gaming applications, the GeForce 2 GTS outperforms its predecessor by up to 40%.[3] In OpenGL games (such as Quake III), the card outperforms the ATI Radeon DDR and 3dfx Voodoo 5 cards in both 16 bpp and 32 bpp display modes. However, in Direct3D games running 32 bpp, the Radeon DDR is sometimes able to take the lead.[4]

The GeForce 2 architecture is quite memory bandwidth constrained.[5] The GPU wastes memory bandwidth and pixel fillrate due to unoptimized z-buffer usage, drawing of hidden surfaces, and a relatively inefficient RAM controller. The main competition for GeForce 2, the ATI Radeon DDR, has hardware functions (called HyperZ) that address these issues.[6] Because of the inefficient nature of the GeForce 2 GPUs, they could not approach their theoretical performance potential and the Radeon, even with its significantly less powerful 3D architecture, offered strong competition. The later NV17 revision of the NV11 design, used for the GeForce 4 MX, was more efficient.
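
To see why the architecture runs out of bandwidth, a rough, purely illustrative estimate helps (GTS reference clocks assumed; real traffic depends on texture caching, overdraw and the exact pixel format):

```python
# Illustrative GeForce 2 GTS bandwidth shortfall, not a precise model.
pixel_fill = 200e6 * 4            # 200 MHz core x 4 pipelines = 800 Mpixels/s peak
bytes_per_pixel = 4 + 4 + 4       # colour write + Z read + Z write (32-bit each), textures ignored
required = pixel_fill * bytes_per_pixel      # ~9.6 GB/s just for framebuffer traffic
available = 166e6 * 2 * 16                   # 128-bit DDR at 166 MHz -> ~5.3 GB/s

print(f"needed ~{required/1e9:.1f} GB/s vs ~{available/1e9:.1f} GB/s available")
```

Even this crude figure shows the peak fill rate demanding far more bandwidth than the memory can supply, which is why HyperZ-style optimizations mattered.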

Releases

The first models to arrive after the original GeForce 2 GTS were the GeForce 2 Ultra and GeForce 2 MX, launched on September 7, 2000.[7] On September 29, 2000 Nvidia started shipping graphics cards which had 16 and 32 MB of video memory size.

Architecturally identical to the GTS, the Ultra simply has higher core and memory clock rates. The Ultra model actually outperforms the first GeForce 3 products in some cases, due to initial GeForce 3 cards having significantly lower fillrate. However, the Ultra loses its lead when anti-aliasing is enabled, because of the GeForce 3's new memory bandwidth/fillrate efficiency mechanisms; plus the GeForce 3 has a superior next-generation feature set with programmable vertex and pixel shaders for DirectX games.

The GeForce 2 Pro, introduced shortly after the Ultra, was an alternative to the expensive top-line Ultra and is faster than the GTS.

In October 2001, the GeForce 2 Ti was positioned as a cheaper and less advanced alternative to the GeForce 3. Faster than the GTS and Pro but slower than the Ultra, the GeForce 2 Ti performed competitively against the Radeon 7500, although the 7500 had the advantage of dual-display support. This mid-range GeForce 2 release was replaced by the GeForce4 MX series as the budget/performance choice in January 2002.

On their product web page, Nvidia initially placed the Ultra as a separate offering from the rest of the GeForce 2 lineup (GTS, Pro, Ti); however, by late 2002, with the GeForce 2 considered a discontinued product line, the Ultra was included alongside the GTS, Pro, and Ti on the GeForce 2 information page.

GeForce 2 MX

Die shot of the MX GPU

Since the previous GeForce line shipped without a budget variant, the RIVA TNT2 series was left to fill the "low-end" role—albeit with a comparably obsolete feature set. In order to create a better low-end option, NVIDIA created the GeForce 2 MX series, which offered a set of standard features, specific to the entire GeForce 2 generation, limited only by categorical tier. The GeForce 2 MX cards had two 3D pixel pipelines removed and reduced available memory bandwidth. The cards utilized either SDR SDRAM or DDR SDRAM with memory bus widths ranging from 32-bit to 128-bit, allowing circuit board cost to be varied. The MX series also provided dual-display support, something not found in the regular GeForce 256 and GeForce 2.

The prime competitors to the GeForce 2 MX series were ATI's Radeon VE / 7000 and Radeon SDR (which, along with the other R100 cards, was later renamed as part of the 7200 series). The Radeon VE had the advantage of somewhat better dual-monitor display software, but it did not offer hardware T&L, an emerging 3D rendering feature of the day that was the major attraction of Direct3D 7. Further, the Radeon VE featured only a single rendering pipeline, causing it to produce a substantially lower fillrate than the GeForce 2 MX. The Radeon SDR, equipped with SDR SDRAM instead of the DDR SDRAM found in its more expensive brethren, was released some time later, and exhibited faster 32-bit 3D rendering than the GeForce 2 MX.[8] However, the Radeon SDR lacked multi-monitor support and debuted at a considerably higher price point than the GeForce 2 MX. 3dfx's Voodoo4 arrived too late, as well as being too expensive and too slow, to compete with the GeForce 2 MX.

Members of the series include GeForce 2 MX, MX100, MX200, and MX400. The GPU was also used as an integrated graphics processor in the nForce chipset line and as a mobile graphics chip for notebooks called GeForce 2 Go.

Successor

The successor to the GeForce 2 (non-MX) line is the GeForce 3. The non-MX GeForce 2 line was reduced in price and saw the addition of the GeForce 2 Ti, in order to offer a mid-range alternative to the high-end GeForce 3 product.

Later, the entire GeForce 2 line was replaced with the GeForce 4 MX.

Models

Main article: List of Nvidia graphics processing units

Support

Nvidia has ceased driver support for the GeForce 2 series, ending with the GTS, Pro, Ti and Ultra models in 2005 and then with the MX models in 2007.

GeForce 2 GTS, GeForce 2 Pro, GeForce 2 Ti and GeForce 2 Ultra:

  • Windows 9x & Windows Me: driver 71.84, released on March 11, 2005
  • Windows 2000 & 32-bit Windows XP: driver 71.89, released on April 14, 2005
  • Linux 32-bit: legacy 71.86.xx driver series


GeForce 2 MX & MX x00 Series:

  • Windows 9x & Windows Me: driver 81.98, released on December 21, 2005
  • Windows 2000, 32-bit Windows XP & Media Center Edition: driver 93.71, released on November 2, 2006

Windows 95/98/Me Driver Archive
Windows XP/2000 Driver Archive

  • Driver version 81.98 for Windows 9x/Me was the last driver version ever released by Nvidia for these systems; no new official releases were made for them afterwards.
  • For Windows 2000, 32-bit Windows XP & Media Center Edition a beta driver is also available, released on November 28, 2006: ForceWare Release 90, version 93.81 BETA.
  • Linux 32-bit: legacy 96.43.xx driver series

References

  1. ^ Ross, Alex (April 26, 2000). "NVIDIA GeForce2 GTS Guide". SharkyExtreme.
  2. ^ Lal Shimpi, Anand (April 26, 2000). "NVIDIA GeForce 2 GTS". AnandTech. p. 2.
  3. ^ Lal Shimpi, Anand (April 26, 2000). "NVIDIA GeForce 2 GTS". AnandTech.
  4. ^ Witheiler, Matthew (July 17, 2000). "ATI Radeon 64MB DDR". AnandTech.
  5. ^ Lal Shimpi, Anand (August 14, 2000). "NVIDIA GeForce 2 Ultra". AnandTech.
  6. ^ Lal Shimpi, Anand (April 25, 2000). "ATI Radeon 256 Preview (HyperZ)". AnandTech. p. 5.
  7. ^ "Press Release - NVIDIA". www.nvidia.com.
  8. ^ FastSite (December 27, 2000). "ATI RADEON 32MB SDR Review". X-bit labs. Archived from the original.

Source: https://en.wikipedia.org/wiki/GeForce_2_series

By Andrey Vorobiev

Looking at this revolutionary product, I want to go back to the autumn of 1999 and recall the original NVIDIA GeForce 256 video card, which still had SDR memory. Those who rushed to buy it were disappointed a little later, when similar video cards appeared with DDR memory, which were much faster, especially in 32-bit color mode. And about a year ago, NVIDIA GeForce cards with DDR memory went on sale. Although those cards used the same graphics processor, the limited bandwidth of SDR was so obvious that graphics subsystems based on DDR performed much better.

The release of the NVIDIA GeForce 256 GPU was a sign that NVIDIA had started to apply new technologies, namely hardware T&L and DDR memory. This year the performance leader among GPUs is the GeForce2 GTS series and its versions: GeForce2 GTS, GeForce2 Pro and GeForce2 Ultra. Yet even these revolutionary products have a bottleneck - the limited bandwidth of DDR memory. To take the next step, it's necessary to choose the right strategy, and the better way is to change the rendering architecture so that the demands on memory bandwidth are relaxed. There are plenty of algorithms that address the bandwidth problem while still using a polygon rendering architecture.

Besides, there is another way to solve the problem: multiprocessor graphics solutions, such as the Voodoo5 5500 or Voodoo5 6000 from 3dfx. However, in this case you might face many other problems. For example, it's quite difficult to coordinate the work of two or more graphics processors, and there can be compatibility problems with operating systems and mainboards. Remember the Rage Fury MAXX from ATI? It doesn't work under Windows 2000.

Sometimes NVIDIA is reproached for the fact that its new products don't differ much from their elder siblings, and that's true. Take the GeForce2 series: it was released half a year after the GeForce 256, and showed that a partial renewal of a product benefits nobody, so it's much more profitable to release truly revolutionary products more rarely. The example of the GeForce2 and GeForce2 MX shows that NVIDIA has offered the newest graphics accelerators with hardware T&L support in every sector from low-end to high-end, plus professional OpenGL solutions (Quadro, Quadro2, MXR), and soon it will start to occupy the market for mobile computers and integrated mainboard chipsets (the Crush series). In general, NVIDIA remains the market leader, and mass sales of GeForce2 Ultra based cards prove it. When the competing ATI Radeon started to beat the GeForce2 GTS, NVIDIA announced a market first - the GeForce2 Ultra - which left the Radeon behind. Later, when analyzing the test results, we'll talk about this in more depth.

Today we are going to consider the Creative 3D Blaster Annihilator2 Ultra, based on the GeForce2 Ultra GPU, which belongs to the High-End class and is intended for serious gamers; the price of the card corresponds to its class. Today most vendors don't develop their own PCB designs, they just use ready-made designs from NVIDIA. Besides, NVIDIA delivers its graphics processors to vendors together with the memory, so you can no longer tell the difference between NVIDIA based cards from different manufacturers.

Specification

Below you can see the features of the new NVIDIA GeForce2 Ultra chipset:

  • 0.18-micron technology
  • 250 MHz graphics core frequency
  • 4 rendering pipelines, with 2 texturing units each
  • 230 (460) MHz memory bus frequency
  • Supported memory types: DDR SDRAM
  • 64MB local graphics memory
  • 7.36 GB/sec memory bus bandwidth
  • Pixel fillrate: 1000 million pixels per second
  • Texel fillrate:
    • 1 texel per pixel: 1000 million texels per second
    • 2 texels per pixel: 2000 million texels per second
  • RAMDAC: 350 MHz
  • Maximum display resolution: 2048x1536
  • Integrated TMDS transmitter allows connecting digital displays
  • External bus interface: full AGP 4x/2x (including Fast Writes, SBA, DME) and PCI (including bus mastering).

3D Graphics:

  • GeForce2 GTS geometry engine (coordinate transformation, lighting, clipping)
  • HW T&L capacity (peak): 30 million textured triangles per second
  • 8 hardware light sources for the whole scene
  • Full OpenGL and DX7 support - transform and lighting, cube environment mapping, projective textures and texture compression
  • NSR engine supports pixel shading effects
  • Support for full-scene hardware anti-aliasing (HW FSAA)
  • Support for Motion Blur and Depth of Field via D3D8
  • Full support for DXTC and S3TC, via DirectX and OpenGL respectively
  • Hardware support for vertex blending using 2 skinning matrices
  • Rendering in 16-bit and 32-bit color modes
  • Hardware support for embossing and Dot Product3 bump mapping
  • Supports textures up to 2048x2048 @ 32 bit
  • Programmable texture blending modes
  • 8-bit stencil buffer
  • 16/24/32-bit Z-buffer
  • Full OpenGL support under Linux

Video:

  • Hardware decoding of all HDTV formats
  • Supports all ATSC resolutions, including 1080i
  • Supports 8-bit alpha blending
  • Supports a VIP port (8-bit, 75 MHz), allowing connection of external MPEG2 decoders/encoders
  • The GeForce2 chip has an integrated digital interface for TV encoders.

One of the major features of the GeForce2 GTS/Pro/Ultra is its 4 rendering pipelines, each with 2 texturing units. This allows a performance gain in games with multitexturing support.

Besides, the Ultra offers a higher fillrate than previous GPUs of the GeForce series. Where the NVIDIA GeForce 256 delivers 480 million pixels/s and the NVIDIA GeForce2 GTS 800 million, the GeForce2 Ultra gives you 1 Gpixel/s. That's why it should be much faster.

Moreover, it supports real trilinear filtering, and the quality of anisotropic filtering is improved thanks to the use of 16 texture samples.

Well, let's consider some examples.

Below you can see a screen shot and the part of it with bilinear filtering in use:

And this is a trilinear filtering:

And now anisotropic filtering:

Yes, anisotropic filtering has proved to be the best. But it can't replace a trilinear one. In theory, both filterings can be used simultaneously.

Note that NVIDIA's drivers are written such that in OpenGL only one type of filtering can be used at a time, while in Direct3D all three can be active simultaneously, their activation depending on the application. For example, in the Unreal Tournament settings for Direct3D mode there is the following option: UseTrilinear = False/True.

Changing this parameter, we'll get either bilinear:

or trilinear filtering:

Unfortunately, Unreal Tournament doesn't "know" anisotropic filtering.

NVIDIA Shading Rasterizer

A welcome addition in this chipset is the NVIDIA Shading Rasterizer (NSR). The GeForce2 GPU can implement multitexturing, that is, it can apply more than one texture to a pixel per clock while performing texture blending operations. In principle, the GeForce 256 can do this too, but the GeForce2 has a wider set of such operations. To make more advanced multitexturing effects possible, control of the texturing units will be exposed via the Pixel Shader API, which will be part of DirectX 8. Below you can see the effects of per-pixel lighting and shading.

Pixel shaders and the NSR change the concept of multitexturing, which is why developers who want to use these technologies have to write new code to take advantage of them.

Cube texturing

As you can see from the characteristics, NVIDIA GeForce2 Ultra supports Cube environment mapping:

This method makes it possible to realistically reflect everything around, water surfaces first of all. The screenshot shows waves spreading out in real time as you touch the surface with the mouse cursor.

There are already some engines on the market which use all these 3D graphics innovations, such as Crytek's X-Isle:

Note that it's not only demos: games of a new generation have already appeared, e.g. Real MYST, using many modern effects such as hardware T&L.

Now let's turn to the card itself based on GeForce2 Ultra.

The board

Note that the 3D Blaster Annihilator2 Ultra is a production card from Creative. The card has an AGP 2x/4x interface and 64 MB of DDR SDRAM in 8 chips on the front side of the PCB. The chips sit under heatsinks, which is why we couldn't identify the manufacturer of the modules ourselves; on the net we found out that they are 4ns parts from EliteMT. That means the board has very fast memory which can work at a regular frequency of 250 MHz (a resulting frequency of 500 MHz), though the working frequency on this card is set to 230 (460) MHz.

NVIDIA GeForce2 GTS based video cards have a memory frequency of 166 (333) MHz, and in that case the memory bandwidth is insufficient to keep the graphics core fully fed. If we try to put the imbalance into figures (the ratio of effective video memory frequency to graphics core frequency), we get: for the GeForce2 GTS it equals 333/200 = 1.67. The GeForce 256 DDR has the best ratio: 300/120 = 2.5. For the GeForce2 Ultra we have 460/250 = 1.84. Yes, it's not very much. As we can judge, the GeForce2 Ultra's potential will still be limited by memory bandwidth, despite the fact that no faster memory exists. Interestingly, the cheaper GeForce2 Pro based cards have a ratio of 400/200 = 2.
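
The same imbalance figures, reproduced as a quick calculation over the reference clocks assumed above:

```python
# Effective-memory-clock to core-clock ratio for several GeForce boards
# (reference clocks in MHz; a higher ratio means relatively more bandwidth per core cycle).
cards = {
    "GeForce 256 DDR": (120, 300),
    "GeForce2 GTS":    (200, 333),
    "GeForce2 Pro":    (200, 400),
    "GeForce2 Ultra":  (250, 460),
}

for name, (core, mem_effective) in cards.items():
    print(f"{name}: {mem_effective}/{core} = {mem_effective / core:.2f}")
```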

Overclocking

The sample we have is quite promising. We managed to get stable operation of the graphics core well above the default clock (one step higher it was no longer fully stable), while the memory only overclocked a little beyond its default frequency.

Other features

The card has a header for a daughter card with TV-out. This universal connector allows different daughter boards with TV-out, TV-in or TV-in/out to be used with the card, so by changing the design the manufacturer can produce various modifications, even a full multimedia combo. In the present case the connector isn't used; Creative doesn't plan to use it, since a GeForce2 Ultra based card is intended for gamers who play at high resolutions and don't need a TV-out.

The graphics processor is hidden under a heatsink with an active cooler of somewhat unusual shape; the cooler is a little bigger than we're used to seeing. The chip doesn't heat up as much as the GeForce 256, but runs hotter than the GeForce2 GTS.

Installation and drivers

The drivers used for testing were Creative's FastTrax drivers, based on NVIDIA's reference drivers. These drivers cannot be installed on a "clean" OS that is still running the Standard VGA adapter driver. Let's look at the basic settings. Note that none of the options differ from those of the reference NVIDIA drivers.

The basic page is a manager for all the settings:

Here you can:

- set up how the card works in Direct3D;

- set up how the card works in OpenGL;

- adjust colors and set the AGP operating mode;

- control the graphics core and memory frequencies;

- view system information.

Note that S3TC texture compression is supported out of the box; DXTC is used when the application supports it.

Performance

The testbed was as follows:

  • Intel Pentium III 1000 MHz;
  • Chaintech 6OJV (i815) mainboard;
  • PC133 RAM;
  • IBM DPTA 20 GB HDD;
  • Windows 98 SE;
  • ViewSonic (21'') monitors.

The tests were carried out with VSync switched off. For comparative analysis we also tested a GeForce2 GTS 64MB based video card - the Inno3D Tornado GeForce2 GTS/64 - and an ATI RADEON 64MB DDR (retail variant).

Let me start with 2D graphics. The GeForce2 Ultra doesn't differ much from the GeForce2 GTS in terms of speed. The test was carried out with the WinBench99 Graphics Marks; below you can see the 2D results:

[Table: Business Graphics Winmark and High-End Graphics Winmark scores - NVIDIA GeForce 256 DDR vs NVIDIA GeForce2 Ultra]

In principle, we expected this result; it doesn't make much sense to push 2D speed further on a gaming card. As for 2D image quality, in general it is very good. And what is the reason for such 2D quality? First, this NVIDIA chip has a completely new RAMDAC; second, the TV-out has moved from the main PCB to a daughter board.

And now we're going to move toward the card's performance in 3D.

While testing we used the following programs:

  • id Software Quake3 - a game test demonstrating the card's operation in OpenGL using a standard demo benchmark (the same demo was used for the FSAA results)
  • Rage Expendable (timedemo) - a game test showing the card's operation in Direct3D with multitexturing
  • Crytek X-Isle - a game test showing hardware T&L performance in OpenGL, using all of NVIDIA's latest 3D developments and quite complex graphics

Quake3 Arena

The test was carried out in two modes: Fast (shows the card's performance in 16-bit color) and High Quality (32-bit color).

Well, the results are quite predictable: a large speed increase compared with the other contestants. It justifies the "Ultra" suffix and the card's High-End positioning, since the card remains very playable even at the highest resolutions. Note that memory overclocking has practically no influence on the speed here.

At lower resolutions FSAA gives good playability in all modes, so you have a choice between a higher plain resolution and a lower one with 2x2 FSAA. Playability in 16-bit color with 2x2 FSAA is also good, but 32-bit color with 2x2 FSAA is of little use. Also note how small the speed decrease is with anisotropic filtering in use.

Expendable

This game will help us to define the card's speed in Direct3D.

In 16-bit color the performance depends on the CPU frequency in all modes, and even the power of a 1 GHz processor isn't enough to realize the GeForce2 Ultra's full potential.

Crytek X-Isle

This test, based on a new engine from Crytek, shows the use of all of NVIDIA's latest 3D developments. It's not a synthetic test but a real game situation.

Here you can see how these cards really cope with complex graphics.

In conclusion I'd like to summarize what was said about NVIDIA GeForce2 Ultra performance:

  1. In general, the speed increase over the NVIDIA GeForce2 GTS is good; in 32-bit color, high resolutions suit almost all games;
  2. The limited bandwidth of the 460 MHz DDR memory still affects the speed in 32-bit color at the highest resolutions.

Some more about 3D quality

The quality of the 3D graphics is marvelous! Full trilinear filtering, anisotropic filtering and S3TC support make it possible to enjoy scenes with highly detailed objects. There is, however, a problem with the implementation of texture compression in 32-bit color when S3TC works in auto-compression mode: NVIDIA has errors in the decompression algorithm, so 32-bit textures come back as 16-bit texels after decompression.
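
To illustrate why that hurts image quality, here's a tiny sketch (illustrative only, not NVIDIA's actual decompression path) of what happens when a 32-bit texel is squeezed through a 16-bit RGB565 intermediate - smooth gradients collapse onto far fewer distinct steps, which is what produces the banding in compressed skies:

```python
# Illustrative only: quantize an 8:8:8 texel to RGB565 and expand it back,
# demonstrating the precision loss blamed for banding in auto-compressed textures.
def through_rgb565(r: int, g: int, b: int) -> tuple:
    r5, g6, b5 = r >> 3, g >> 2, b >> 3          # 8:8:8 -> 5:6:5
    return (r5 << 3 | r5 >> 2,                   # expand back to 8 bits per channel
            g6 << 2 | g6 >> 4,
            b5 << 3 | b5 >> 2)

print(through_rgb565(30, 40, 200))   # (30, 40, 200) comes back as (24, 40, 206)
```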

Additional functions

Since there are a lot of DVDs on the market today, many users are interested in how new cards handle DVD video.

I have already mentioned that NVIDIA GeForce/GeForce2 chips support motion compensation when decoding an MPEG2 stream. The most popular software player that fully supports this chip is InterVideo WinDVD. I recommend downloading the DVD Genie utility for fine-tuning DVD players; it supports the NVIDIA GeForce/GeForce2. With all the functions optimized for the NVIDIA GeForce2, the image on the GeForce2 Ultra was just perfect, and the CPU load stayed remarkably low!

Conclusion

NVIDIA has released its most powerful accelerator yet, and Creative Labs was the first to build a card based on it. It is a true High-End product intended for serious gamers; undoubtedly, the 3D speed is unforgettable when playing modern games. The card carries a correspondingly high price, though I think it will fall considerably by the coming January.

Highs:

  • The highest performance in 3D-graphics;
  • Perfect 2D quality at high resolutions;
  • Good overclockability of the graphics core;
  • Supports many modern technologies;
  • Good drivers, support for all major functions (including FSAA).

Lows:

  • Considerable speed decrease with FSAA in use;
  • High price;
  • Few games yet available that can use the full power of the GeForce2 Ultra.



Source: http://ixbtlabs.com/articles/gf2u/index.html


NVIDIA GeForce2 Ultra

New NVIDIA GPU Breaks One Billion Pixels Per Second Barrier

GeForce2 Ultra Delivers Industry's Best Performance and Visual Quality

SANTA CLARA, CA — August 14, 2000 — NVIDIA® Corporation (Nasdaq: NVDA) announced today the most powerful 3D graphics processing unit (GPU) ever produced, the GeForce2 Ultra™. Once again, NVIDIA sets new standards for performance and quality by combining the award-winning GeForce2 architecture with its fully compatible driver architecture and its cutting-edge memory subsystem to deliver the industry's fastest GPU. The previous leader was NVIDIA's GeForce2 GTS™.

"NVIDIA's core strategy is to deliver a breakthrough product every six months, doubling performance with each generation at a rate of essentially Moore's Law cubed," said Jen-Hsun Huang, president and CEO of NVIDIA. "Our unique ability to innovate at this staggering pace is one of the key elements of our competitive advantage. Our consistency has made it possible for our partners to rely on our architecture and roadmap. GeForce2 Ultra exemplifies our commitment to help our partners build innovative and winning products."

"We are pleased to be joining NVIDIA as the launch partner for GeForce2 Ultra," said Steven Mosher, vice president of Creative Labs. "Creative expects to be the first to provide this groundbreaking technology to users worldwide."

The GeForce2 Ultra has been in volume production for over 30 days. The first GeForce2 Ultra-based add-in cards will be available from Creative Labs in September. Other companies shipping GeForce2 Ultra-based boards include: ELSA, Guillemot, and Hercules, with more add-in-card manufacturers and tier one OEM announcements to come soon.

Raw Graphics Horsepower
The GeForce2 Ultra provides the stunning graphics consumers have come to expect from NVIDIA. Its transform and lighting engines provide over 31 million sustained triangles per second. Its advanced rendering subsystem provides an unprecedented fill rate of up to one billion pixels per second, and two billion texels per second – two to three times the pixel processing power of any graphics processor at any price. The GeForce2 Ultra renders razor-sharp, crystal-clear 2D graphics, even at taxing resolutions as high as 2048 x 1536, in 32-bit color.

GeForce2 Ultra is the world's first product to ship with 230 MHz (460 MHz effective) DDR memories, producing an astounding 7.36 GB per second of bandwidth - further proof of NVIDIA's ongoing leadership of the high-speed memory systems critical for high-performance graphics applications.

GeForce2 Ultra takes full advantage of NVIDIA's Detonator 3 (D3) unified driver architecture (UDA). The D3 driver is backward and forward compatible with past, present, and future NVIDIA GPUs, as well as top-to-bottom compatible with all currently manufactured versions of NVIDIA's graphics processors. This greatly simplifies system administration. For example, you could remove an NVIDIA TNT2™ from a system and replace it with a GeForce2 Ultra without changing the graphics software driver. Only NVIDIA offers this level of total compatibility.

Significant 3D features of the GeForce2 Ultra include:

  • Second generation transform and lighting engines
  • NVIDIA Shading Rasterizer (NSR) multi-operation per-pixel rendering engine
  • 64MB of ultra-high-speed (460 MHz effective) double data rate frame buffer memory
  • AGP 4X with Fast Writes support
  • Full support for Microsoft® DirectX® and SGI OpenGL®
  • Optimal performance for current and future CPUs from Intel® and AMD™

The GeForce2 Ultra includes the NVIDIA Shading Rasterizer (NSR), an unprecedented technology that enables advanced per-pixel shading capabilities. The NSR allows per-pixel control of color, shadow, light, reflectivity, emissivity, specularity, gloss, dirt, and other visual and material components used to create amazingly realistic objects and environments. Another key feature of GeForce2 Ultra is its High-Definition Video Processor (HDVP), which enables a variety of crystal-clear HDTV solutions when combined with a mainstream CPU and a low-cost DTV receiver. The HDVP allows mainstream high-performance processors to support all 18 Advanced Television Standards Committee (ATSC) formats with a simple, cost-effective DTV receiver card.

Several highly anticipated titles that are available now or within the next 30 to 45 days take advantage of the breakthrough features and performance of GeForce2 Ultra, including:

  • Real Myst
  • PacMan 3D
  • Sacrifice
  • Black & White
  • Colin McRae Rally 2
  • Sydney
  • Giants
  • Rune
  • Q3A - Team Arena
  • Far Gate
  • Warbirds III
  • MDK2

All of these titles take full advantage of the transform and lighting engines found in all NVIDIA GPUs &#; from the original GeForce through the GeForce2 Ultra.

NVIDIA Products
The first 3D graphics semiconductor company to deliver a complete top-to-bottom family of 3D solutions, NVIDIA gives developers the advantage of a unified driver architecture across all implementations. NVIDIA's single-chip GPUs enable high-frame-rate 3D graphics, benchmark-winning 2D graphics, and high-definition digital video processing for unequaled visual realism and real-time interactivity. Optimized for both Microsoft Direct3D and OpenGL APIs, the NVIDIA product line currently includes:

  • NVIDIA Quadro2 Pro™ and Quadro2 MXR™ for the professional workstation, digital content creation, and MCAD markets. The Quadro2 Pro is the fastest workstation graphics solution available for the engineering professional. The Quadro2 MXR provides cost-effective, advanced workstation capabilities and offers the revolutionary TwinView™ display architecture for the mainstream professional. Both include features such as second generation transform and lighting and per-pixel shading.
  • NVIDIA GeForce2 Ultra™, GeForce2 GTS™, and GeForce 256™ graphics processing units (GPUs) for the PC enthusiast market. The GeForce2 Ultra is the world's fastest GPU, providing enthusiasts the ultimate gaming platform with unprecedented geometry processing power and per-pixel shading features. Its predecessor, GeForce2 GTS, is a second generation GPU with radical per-pixel shading features and incredible performance. The GeForce 256 processor is the product that started the GPU revolution.
  • NVIDIA GeForce2 MX™ GPU for the corporate desktop market. The GeForce2 MX offers Digital Vibrance Control™, TwinView display options, and a patented unified driver &#; making it the ultimate graphics solution for the corporate user. It leverages the GeForce2 GTS architecture, including features such as second generation transform and lighting and per-pixel shading, yet is priced at one-third of the cost of the original GeForce GPU. The rich feature set, combined with its low cost, makes GeForce2 MX truly a GPU for the masses.
  • NVIDIA TNT2™ Pro for the performance PC market. The TNT2 Pro features NVIDIA's award-winning, fourth-generation, 128-bit 3D architecture that has garnered numerous industry awards.
  • NVIDIA Vanta™, Vanta LT, and TNT2 M64 for the commercial and corporate mainstream PC markets. Also based on the original award-winning NVIDIA TNT2 architecture, these processors offer low-cost, high-performance choices for business and entertainment applications.
  • Aladdin TNT2™ for the sub-$ PC market. The first truly integrated graphics chipset; Aladdin TNT2 redefines the possibilities for the lowest cost platforms.
Source: https://videocardz.net/nvidia-geforce2-ultra

NVIDIA GeForce2 Ultra

NVIDIA has been executing perfectly ever since the release of their TNT back in 1998. It was October 1998 that NVIDIA's first TNT based cards began shipping from Diamond. Just about one year later, NVIDIA successfully executed the launch of their TNT2 and TNT2 Ultra based products, which eventually overshadowed the Voodoo3 that preceded them.

The TNT2 Ultra release was the last product NVIDIA brought to market before they switched to a 6-month product cycle.  This new cycle truly strained their competitors since 3dfx was unable to produce the Voodoo4/5 in time to compete with NVIDIA’s next product, the GeForce, which was released 6 months after the TNT2 Ultra. 

The GeForce was an instant hit, there was nothing available that could possibly compete with it and as more mature drivers were released for the card, its performance did nothing but improve.  3dfx, ATI, Matrox and the now defunct graphics division of S3 had no way of competing with the GeForce and the later versions of the card that featured Double Data Rate SDRAM (DDR).  If they couldn’t compete with the GeForce, there was no way they would be able to catch up in time for the launch of the GeForce2 GTS just 6-months later. 

However, the GeForce2 GTS was met with some competition as 3dfx’s Voodoo5 was launched at around the same time and speculation began to form about ATI’s Radeon chip, but even then, we all knew that 6 months after the GTS’ release, NVIDIA would have yet another product that would help them to distance themselves from the competition yet again.

This brings us up to the present day.  While we were patiently waiting for the elusive ‘NV20’ from NVIDIA, NVIDIA has been shipping 32MB and 64MB GeForce2 GTS cards to make their current customer base happy.  With the release of the GeForce2 MX as well as the new Quadro2, it is clear that NVIDIA has really got their act together, and that it has also built up the expectations we had for NV20, the code name of their next product.

We expected NV20 to literally blow everything away; it would mark a departure from the standard GeForce2 GTS core and present us with NVIDIA's equivalent of ATI's HyperZ technology that allows for very efficient memory bandwidth usage. Rumors began hitting the message boards and newsgroups, which speculated on the NV20's incredible specifications. Everyone expected the NV20 to have an extremely high core clock, incredibly fast DDR memory, and an insane amount of memory bandwidth which would be courtesy of its 'borrowing' some techniques from tile-based rendering architectures.

Using our trusty calendar skills, and NVIDIA's promise to stick to a 6-month product cycle, this put the release of the NV20 in September 2000, under one month away. With ATI's Radeon only able to beat a GeForce2 GTS by 10 – 20%, the NV20 would only have to be that much faster in order to beat ATI's latest creation, and NVIDIA's closest competitor. We already assumed the NV20 would be much faster than the Radeon right off the bat.

The specifications we were all expecting were amazing, but guess what guys 'n gals?  The wonderful NV20 won't be here until next year.  That's right, Spring 2001 is when you can expect to see the NV20, but NVIDIA won't be departing from their 6-month product cycle schedule, they are simply departing from what they define a "product" as.

Originally, NVIDIA’s plans were to release a new chip every Fall and they would have another version of the product every Spring, a sort of “Spring Refresh,” as they liked to call it.  Now, the GeForce has already gotten its “Spring Refresh,” the GeForce2 GTS, but now, apparently the GeForce2 GTS isn’t feeling very “fresh” and NVIDIA has decided to give it another refresh, this time in the Fall.

Update 8/17/2000: There have been reports that the NV20 won't be delayed and that it will be released on time, contrary to what we've published here. We met with NVIDIA in person and asked them their stance on the issue; according to NVIDIA, the NV20 will be out 4 to 8 months from the release of the GeForce2 Ultra (September). This places the release of the NV20 about 6 months from when the Ultra hits the streets, which can be as early as January or as late as May. If you take the average of that range, you get a March release, which does fall in line with our statement of a Spring launch.

So what is this ultra-fresh GeForce2 going to be called?  None other than the GeForce2 Ultra of course.

We’ll let the shock set in before moving on to the specs of this chip…

Source: https://www.anandtech.com/show/


By Andrey Vorobiev

Today we will show you GeForce2 Ultra based video cards under unusual conditions. One day we took our test system outside in order to carry out speed testing of an NVIDIA GeForce2 Ultra based video card at an ambient temperature of around -2°C. But let's start with our testbed.

  • processor: Intel Pentium III;
  • mainboard: Chaintech 6OJV (i815);
  • RAM: PC133;
  • HDD: IBM DPTA, 20 GBytes;
  • OS: Windows 98 SE;
  • monitor: NOKIA 447Xav (17").

The contestants in our marathon race can be seen in the photo above; nevertheless, here they are (top to bottom in the photo):

  • Leadtek WinFast GeForce2 Ultra;
  • ASUS AGP-V Ultra;
  • Creative 3D Blaster Annihilator2 Ultra;

But this list is incomplete. Together with those listed we also tested the Leadtek WinFast GeForce2 Pro, which is kitted out more heavily than a knight in the Middle Ages:

The set of tools used for this overclocking is shown in the left photo: a Golden Orb cooler for Slot A processors, clamps from a Blue Orb cooler (you can make these yourself), and thermal paste or glue. Alternatively, you can use insulating tape when mounting a sawn-off part of the Golden Orb on the top row of memory chips. The photos clearly show how the cooler modification was made. The only thing I want to note is that it's necessary to machine down the lower part of the cooler under the fan in order to remove a protruding section, since otherwise the centre of the cooler sits much higher than the centre of the chip.

Considering that the card turned out to be very heavy, a video card equipped this way should be handled carefully. And no doubt about it, we had to sacrifice two PCI slots for this kind of overclocking.

1. Overclocking in usual room conditions

This test covered all the aforementioned video cards. The room temperature was around +22°C, and the boards were additionally blown on by a fan from a power supply unit mounted in immediate proximity to the cards. The following overclocking results were attained:

  • Leadtek WinFast GeForce2 Ultra - / () MHz;
  • ASUS AGP-V Ultra - / () MHz;
  • Creative 3D Blaster Annihilator2 Ultra - / () MHz;
  • Leadtek WinFast GeForce2 Pro with Golden Orb cooler - / () MHz;

2. Overclocking in extreme conditions

This time we set the testbed up on the balcony, at an ambient temperature of around -2°C:

In room conditions it was the Leadtek WinFast GeForce2 Ultra that led in overclocking, which is why we chose this particular board for testing in severe conditions. We also enabled an extra fan to intensify the cooling. The air temperature, not being too low (unlike, say, in a freezer), allowed us to test the video card for quite a long time without danger to the HDD and other parts of the system with moving elements.

The results really stunned us: we managed to lift the card's core and memory frequencies considerably further still! At the same time we noticed no artifacts, and the board operated stably.

Test results

We measured speed with id Software's Quake3 (OpenGL, multitexturing).

The test results from the room and outdoor conditions are brought together below:

Converting the performance increase from overclocking into percentages, we get the following:

Despite the not-so-huge overclocking of the Leadtek GeForce2 Pro (compared to the super-card Leadtek WinFast GeForce2 Ultra), installing such a big cooler bears fruit. Remember that the stock speed of the GeForce2 Pro chipset is 200 MHz, and of its memory 200 (400) MHz. With such unorthodox cooling we easily get a video card that outperforms the NVIDIA GeForce2 Ultra while costing much less.

As for the Leadtek WinFast GeForce2 Ultra under extreme overclocking - it is simply a wonder. Look at the graphs of the performance gain, especially in 32-bit color. Incidentally, we can roughly estimate the speed of future NV20-based cards, since according to rumors the NV20's speed increase relative to the GeForce2 Ultra will be around 50%; from that we can imagine the FPS the NV20 will deliver in today's typical Quake3-class games.

Now we should assess how overclocking the GeForce2 Ultra's chip and its memory in turn affect the performance gain. I carried out two tests: in the first, with the memory at its rated 230 (460) MHz, the GPU speed was raised; in the second, with the GPU at its rated 250 MHz, the memory speed was raised (these tests were done indoors):

Lifting the memory frequency makes a greater contribution than overclocking the chip. Note the 32-bit color results when overclocking the memory: beyond a certain point the performance increase drops off, since even the raised memory frequency is not enough.

Conclusion

Well, you can see that the Leadtek WinFast GeForce2 Ultra video cards have the best overclockability in their class, sometimes working wonders (even without taking the extreme conditions into account, the clocks reached in room conditions are more than just good). Interestingly, raising the memory frequency yields fantastic gains in performance. I don't think lifting the GPU speed alone has a very significant effect, given how strongly such chipsets depend on memory bandwidth; our tests prove it indirectly.

If you remember our review of the Leadtek WinFast GeForce2 Ultra, you may recall the Crytek X-Isle test, which loaded the card so heavily that we had to reduce the overclock of that sample. I should note that today's sample managed to hold its higher clocks in this test.

Overclocking the Leadtek WinFast GeForce2 Pro showed us once again that such cards are the optimal purchase among high-end gaming video cards.

Finally, I should note that this review was not meant to advertise any particular overclocking methods, extreme ones included. In this article I just wanted to reveal the potential of the GeForce2 Ultra card for the sport of it. As far as overclocking is concerned, neither I nor iXBT accept responsibility for the consequences of users' hasty actions: these are out-of-spec modes, and anyone attempting them should understand all the possible consequences.



Source: http://ixbtlabs.com/articles/gf2ultraoverclock/index.html

