GeForce

From Wikipedia, the free encyclopedia

GeForce graphics processor (logo used since 2007). Invented by Nvidia.

GeForce is a brand of graphics processing units (GPUs) for PCs designed by Nvidia. The first GeForce products were designed and marketed for the high-margin PC gaming market, but later releases expanded the product line to cover all tiers of the graphics market, from low-end to high-end. As of 2008, there have been ten iterations of the design. Nvidia designs only the graphics chipset; consumer-ready video cards are assembled by third parties. While several companies (notably Intel) design low-end GPUs, only Nvidia's GeForce and ATI's Radeon series compete in the high-end GPU market, although Intel has expressed a desire to enter the high-end market with its recent development of Larrabee. The GeForce series is now involved in a rivalry with Intel and ATI to develop GPGPUs (general-purpose graphics processing units), which accelerate not only gaming but also a broad range of other applications, from graphics design and multimedia playback to scientific data analysis and simulation.


Name origin

The "GeForce" name originated[not in citation given] from a contest held by Nvidia in early 1999. Called "Name That Chip", the contest called out to the public to name the successor to the RIVA TNT2 line of graphics boards. There were over 12,000 entries received and 7 winners received a RIVA TNT2 Ultra graphics card as a reward.[1][2]

Generations

GeForce 256 
Launched on August 31, 1999, the GeForce 256 (NV10) was the first PC graphics chip with hardware transform, lighting, and shading, although 3D games utilizing these features did not appear until later. Initial GeForce 256 boards shipped with SDR SDRAM memory; later boards shipped with faster DDR SDRAM memory. For illustration, the per-vertex work that "hardware transform and lighting" offloaded from the CPU resembles the generic fixed-function computation sketched below (added for clarity; it is not Nvidia's actual hardware pipeline, and the function name and sample values are invented for the example).
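```python
import numpy as np

# A generic fixed-function T&L step: transform a vertex by a 4x4 matrix,
# then compute simple Lambertian (diffuse) lighting -- the kind of per-vertex
# work the GeForce 256 moved from the CPU into dedicated hardware.
# (Illustrative sketch only; not Nvidia's implementation.)

def transform_and_light(vertex, normal, mvp, light_dir):
    v = mvp @ np.append(vertex, 1.0)   # model-view-projection transform
    v = v[:3] / v[3]                   # perspective divide
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    diffuse = max(0.0, float(n @ l))   # Lambert's cosine law
    return v, diffuse

pos, intensity = transform_and_light(
    np.array([1.0, 2.0, 3.0]), np.array([0.0, 1.0, 0.0]),
    np.eye(4), np.array([0.0, 1.0, 0.0]))
print(pos, intensity)  # [1. 2. 3.] 1.0
```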
GeForce2 
Launched in April 2000, the first GeForce2 (NV15) was another high-performance graphics chip. Nvidia moved to a twin texture processor per pipeline (4x2) design, doubling texture fillrate per clock compared to GeForce 256. Later, Nvidia released the GeForce2 MX (NV11), which offered performance similar to the GeForce 256 but at a fraction of the cost. The MX was a compelling value in the low/mid-range market segments and was popular with OEM PC manufacturers and users alike.
GeForce3 
Launched in February 2001, the GeForce3 (NV20) introduced DirectX 8.0 programmable pixel shaders to the GeForce family. It had good overall performance and shader support, making it popular with enthusiasts, although it never hit the midrange price point. A derivative of the GeForce3, the NV2A, was developed for the Microsoft Xbox game console.
GeForce4 
Launched in February 2002, the high-end GeForce4 Ti (NV25) was mostly a refinement to the GeForce3. The biggest advancements included enhancements to anti-aliasing capabilities, an improved memory controller, a second vertex shader, and a manufacturing process size reduction to increase clock speeds. Another "family member," the budget GeForce4 MX, was based on the GeForce2, with a few additions from the new GeForce4 Ti line. It targeted the value segment of the market and lacked pixel shaders.
GeForce FX 
Officially launched in November 2002, the GeForce FX (NV30) was a major change in architecture compared to its predecessors. The GPU was designed not only to support the new Shader Model 2 specification but also to perform well on older DirectX 7 and 8 titles. However, initial models suffered from weak floating-point shader performance and excessive heat, which required two-slot cooling solutions. Products in this series carry the 5000 model number, as it is the fifth generation of the GeForce, though Nvidia marketed the cards as GeForce FX instead of GeForce 5 to show off "the dawn of cinematic rendering".
GeForce 6 
Launched in April 2004, the GeForce 6 (NV40) added Shader Model 3.0 support to the GeForce family, while correcting the weak floating point shader performance of its predecessor. It also implemented high dynamic range imaging and introduced SLI (Scalable Link Interface) and PureVideo capability.
GeForce 7 
The 7th generation GeForce (G70/NV47) was launched in June 2005. The design was a refined version of GeForce 6, with the major improvements being a widened pipeline and an increase in clock speed. The GeForce 7 also offers new transparency supersampling and transparency multisampling anti-aliasing modes (TSAA and TMAA). These new anti-aliasing modes were later enabled for the GeForce 6 series as well.
A modified version of the GeForce 7800GTX, called the RSX "Reality Synthesizer", is used as the main GPU in Sony's PlayStation 3.
GeForce 8 
Released on November 8, 2006, the eighth-generation GeForce (originally the G80) was the first GPU to fully support DirectX 10. Manufactured on an 80 nm process and built around a fully unified shader architecture, the line originally consisted of just the 8800GTX; the 8800GTS arrived months into the product line's life, and it took nearly six months for mid-range and OEM/mainstream cards to be integrated into the 8 series. A die shrink to 65 nm and a revision of the G80 design, codenamed G92, were implemented into the 8 series with the 8800GS, 8800GT, and 8800GTS-512, first released on October 29, 2007, almost a full year after the initial G80 release.
GeForce 9 
The first product was released on February 21, 2008.[3] Coming less than four months after the initial G92 release, all 9-series designs, both released and speculated, are simply revisions of existing late 8-series products. The 9800GX2 uses two G92 GPUs, as found in later 8800 cards, in a dual-PCB configuration while still requiring only a single PCI Express 16x slot. The 9800GX2 utilizes two separate 256-bit memory buses, one for each GPU and its respective 512 MB of memory, for a total of 1 GB of memory on the card (although the SLI configuration of the chips requires mirroring the frame buffer between the two chips, giving the card the effective memory performance of a 256-bit/512 MB configuration). The later 9800GTX features a single G92 GPU, a 256-bit data bus, and 512 MB of GDDR3 memory.[4] Prior to the release, little concrete information was known, apart from official claims that the next-generation products would offer close to 1 TFLOPS of performance with GPU cores still manufactured on the 65 nm process, and reports that Nvidia was downplaying the significance of DirectX 10.1.[5]
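The memory arithmetic above can be summarized in a short sketch (an illustration added for clarity, not drawn from Nvidia documentation; the function name and parameters are invented for the example):

```python
# Nominal vs. effective memory for a dual-GPU, single-slot card such as the
# 9800GX2: each GPU has its own 256-bit bus and 512 MB of memory, but SLI
# mirrors the frame buffer, so the same data is stored once per GPU.

def dual_gpu_memory(per_gpu_mb=512, per_gpu_bus_bits=256, gpu_count=2):
    nominal_mb = per_gpu_mb * gpu_count    # advertised total: 1024 MB
    effective_mb = per_gpu_mb              # mirrored data counts only once
    effective_bus_bits = per_gpu_bus_bits  # each GPU reads only its own copy
    return nominal_mb, effective_mb, effective_bus_bits

print(dual_gpu_memory())  # -> (1024, 512, 256)
```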
GeForce 200 Series 
Based on the GT200 graphics processor, which consists of 1.4 billion transistors, the 200 series launched at 06:30 PDT on June 16, 2008.[6] This generation takes the card-naming scheme in a new direction, replacing the series number (such as 8800 for 8-series cards) with a GTX or GTS prefix (which used to go at the end of card names, denoting their "rank" among other similar models), followed by model numbers such as 260 and 280. The series features the new GT200 core manufactured on a 65 nm process.[7] The first products are the GeForce GTX 260 and the more expensive GeForce GTX 280.[8]
GeForce 300 Series 
According to reports, the GT300 series will make its debut around the end of 2009 on a 40 nm fabrication process and will incorporate DirectX 11 and Shader Model 5.0.[9][10]

Mobile GPUs

Since the GeForce2, Nvidia has produced a number of graphics chipsets for notebook computers under the GeForce Go branding. Most of the features present in the desktop counterparts are also present in the mobile ones, although these GPUs do not perform as well as their desktop counterparts. Nvidia later rebranded its GeForce 8-based mobile GPUs as the GeForce 8M series. In the first quarter of 2009, the 285 and 295 were released, based on the 200 core with improved performance.

Product naming scheme

With the release of the GTX 200 series of cards, Nvidia cards now use a prefix to designate their category. So far, only the GTX (Enthusiast) prefix has been announced. The first digit in the name of a card represents its generation, while the second and third digits represent the performance of the card relative to others in the family.

The company followed a naming scheme similar to that shown below until the release of the GTX 2xx series cards.

Number range (steps of 50) | Category | Suffixes¹ | Price range² (USD) | Shader amount³ | Memory type | Memory width (bit) | Memory size (MiB) | Outputs | Example products
000–450 | Mainstream | LE, GS, GT | ≤$100 | ≤25% | DDR2 | 25%–50% | ~25% | VGA/DVI | GeForce 7300GS, GeForce 6200
500–750 | Performance | LE, GS, GSO, GT, GTS | $100–$300 | 25%–50% | DDR2, GDDR3 | 50%–75% | 50%–75% | VGA/DVI, two DVI | GeForce 7600GT, GeForce 8600GT/GTS
800–950 | Enthusiast | GS, GT, GTS, GTO, GTX, Ultra, GX2 | ≥$200 | 50%–100% | GDDR3 | 75%–100% | 50%–100% | VGA/DVI, two DVI, DVI/HDMI | GeForce 8800 GTX/Ultra, GeForce 9800 GT/GTX/GX2
  • This scheme is only applicable to GeForce FX and later series video cards; GeForce4 and earlier cards follow a similar pattern.
  • 1: Suffixes indicate the performance layer; those listed are in order from weakest to most powerful.
  • 2: Price range applies only to the most recent generation and is a generalization based on pricing patterns.
  • 3: Shader amount compares the number of shader pipelines or units in that particular model range to the highest model possible in the generation.
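
As a rough illustration of the scheme described above, a pre-GTX-200 model name can be decoded as follows (an editorial sketch, not an official Nvidia tool; the function name and regular expression are assumptions for illustration, while the category boundaries come from the table above):

```python
import re

def decode_model(name):
    """Split a pre-GTX-200 model such as 'GeForce 8600GT' into its parts."""
    m = re.search(r"(\d)(\d{3})\s*([A-Za-z0-9]*)", name)
    if not m:
        raise ValueError("unrecognized model name: " + name)
    generation = int(m.group(1))   # first digit: generation
    number = int(m.group(2))       # remaining digits: relative performance
    suffix = m.group(3) or None    # performance layer within the range
    # Category boundaries follow the table's 000-450 / 500-750 / 800-950 steps.
    if number <= 450:
        category = "Mainstream"
    elif number <= 750:
        category = "Performance"
    else:
        category = "Enthusiast"
    return generation, number, suffix, category

print(decode_model("GeForce 8600GT"))  # (8, 600, 'GT', 'Performance')
print(decode_model("GeForce 6200"))    # (6, 200, None, 'Mainstream')
```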

References

  1. ^ "Winners of the Nvidia Naming Contest". Nvidia. 1999. Archived from the original on 2000-06-08. http://web.archive.org/web/20000608011648/http://www.nvidia.com/namingcontest. Retrieved on 2007-05-28. 
  2. ^ Taken, Femme (1999-04-17). "Nvidia "Name that chip" contest". Tweakers.net. http://tweakers.net/nieuws/1967/nVidia-Name-that-chip-contest.html. Retrieved on 2007-05-28. 
  3. ^ Brian Caulfield (2008-01-07). "Shoot to Kill". Forbes.com. http://www.forbes.com/home/technology/forbes/2008/0107/092.html. Retrieved on 2007-12-26. 
  4. ^ "NVIDIA GeForce 9800 GTX". http://www.nvidia.com/object/geforce_9800gtx.html. Retrieved on 2008-05-31. 
  5. ^ "Crytek, Microsoft and Nvidia downplay DirectX 10.1". DailyTech. http://www.dailytech.com/Crytek+Microsoft+Nvidia+Downplay+DirectX+101/article9656.htm. Retrieved on 2007-12-04. 
  6. ^ "NVIDIA GeForce GTX 280 Video Card Review". Benchmark Reviews. 2008-06-16. http://benchmarkreviews.com/index.php?option=com_content&task=view&id=179&Itemid=1. Retrieved on 2008-06-16. 
  7. ^ "Geforce GTX 280 to launch on June 18th". Fudzilla.com. http://www.fudzilla.com/index.php?option=com_content&task=view&id=7364&Itemid=1. Retrieved on 2008-05-18. 
  8. ^ "Detailed Geforce GTX 280 Pictures". VR-Zone. 2008-06-03. http://www.vr-zone.com/articles/Detailed_Geforce_GTX_280_Pictures/5826.html. Retrieved on 2008-06-03. 
  9. ^ "Nvidia to launch new GT300 GPU in 1Q09; will showcase GT200-based cards at CES". Digitimes. 2008-12-16. http://www.digitimes.com/news/a20081216PD202.html. Retrieved on 2008-12-25. 
  10. ^ "NVIDIA: GT206, GT212, GT216 and GT300 come next year, 40nm". Hardspell. 2008-10-27. http://en.hardspell.com/doc/showcont.asp?news_id=4307. Retrieved on 2009-2-5. 
