ATi or Nvidia?

Discussion in 'Techforge' started by Order2Chaos, Mar 13, 2008.

  1. Powaqqatsi

    Powaqqatsi Haters gonna hate.

    Joined:
    Apr 15, 2004
    Messages:
    8,388
    Ratings:
    +1,341
    They adopted a new numbering style (and the "HD" buzzword title) once they got bought by AMD.
  2. Order2Chaos

    Order2Chaos Ultimate... Immortal Administrator

    Joined:
    Apr 2, 2004
    Messages:
    25,217
    Location:
    here there be dragons
    Ratings:
    +21,464
    Is it significantly different? There was still a 2600 SE, Pro, and XT, wasn't there?
  3. Powaqqatsi

    Powaqqatsi Haters gonna hate.

    Well, actually they've dropped extensions like SE and XT, starting with the 3XXX generation.

    First digit (most significant) denotes "generation". Next digit denotes "family". Final two digits denote "variant": --70 denotes what the --00 XT used to denote, and --50 replaces "Pro". Not sure what took the place of SE. Even though XT is traditionally the top end, I guess they decided to leave some headroom with the new numbers.

    Of course, there is a new suffix: X2, which denotes dual GPU cards.

    So for example:
    The HD 3870 would have been the 3800 XT under the old naming system. Or the X3800 XT under the older naming system. Or the 13800 XT under the old, old naming system.
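    For the curious, the decoding described above can be sketched in a few lines of Python. The variant-to-old-suffix mapping (70 = XT, 50 = Pro) is just the guess laid out in this post, not an official AMD table, and `decode_hd_model` is a made-up helper name:

```python
# Hypothetical decoder for the post-"HD" AMD/ATI model numbers:
# first digit = generation, second = family, last two = variant.
# The variant names are the mapping guessed at in this thread,
# not anything official from AMD.
VARIANT_TO_OLD_SUFFIX = {70: "XT", 50: "Pro"}

def decode_hd_model(number: str) -> dict:
    """Split a four-digit HD-era model number into its parts."""
    generation = int(number[0])
    family = int(number[1])
    variant = int(number[2:4])
    return {
        "generation": generation,
        "family": family,
        "variant": variant,
        # What the card would have been called under the old scheme
        "old_style_name": f"{generation}{family}00 "
                          f"{VARIANT_TO_OLD_SUFFIX.get(variant, '?')}",
    }

print(decode_hd_model("3870"))  # HD 3870 -> roughly "3800 XT" in the old scheme
```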

    edit: As nVidia is up to 9XXX, I expect them to do a naming shake-up as well. Let's hope they don't use the opportunity to pull a bait-and-switch with the GT suffix.
  4. Order2Chaos

    Order2Chaos Ultimate... Immortal Administrator

    Bah. Couldn't leave well enough alone, could they.
  5. Powaqqatsi

    Powaqqatsi Haters gonna hate.

    Nope. I kinda like the new numbers, especially since various manufacturers like to add their own goofy (and sometimes purposely misleading) prefixes and suffixes. I remember certain cards being suffixed "XTX", even though they were NOT XT model cards. This way, it's tougher for them to screw you on the "variant" part since the info is all in the number.

    Then again, most people don't know what all the names mean anyway.

    And of course, every time they change it, it causes a bunch of confusion. Hopefully this convention will last a while.
  6. Order2Chaos

    Order2Chaos Ultimate... Immortal Administrator

    BTW, can you explain nVidia's suffix scheme? I can tell that Ultra and GTX are better than the rest (and of course it doesn't help that if you've got an Ultra card, you probably don't want an nForce Ultra motherboard), and GT is somewhere in the middle, but nothing beyond that, or even which of the first two is better.
  7. Powaqqatsi

    Powaqqatsi Haters gonna hate.

    Well, I'll do my best.

    It seems like nVidia labels its cards more by the development / fab cycle.

    The first card of a new architecture (and new leading digit) will generally be a GTX. (Note: this pattern was somewhat broken by the 9600GT just recently.)

    The GTX represents the first "go" at a new architecture. Basically, they still have little fab issues, but they want to flex the performance of the new line.

    So with a GTX, you get a big, hot, power-hungry beast of a card that is top of the benchmarks.

    Shortly thereafter, the GTS version will come out. The GTS is basically the way to provide cheaper models / move chips that didn't meet the GTX specs. A GTS is pretty much an underclocked GTX. They come from the cores that didn't "make the cut" to meet the demanding GTX requirements.

    The GTS models are usually also 2-slotters, and are relatively power hungry, with pretty high heat generation (although not as bad as a GTX). Also, these generally have less memory than the GTX versions.

    Then, after a little while LONGER, they will perfect the process a bit, get more testing time in, etc. They will select the really top notch chips and put them in insanely-expensive "Ultra" models that will edge out the GTX (but not by a huge margin generally). These are the most demanding in terms of power and thermals.

    They seem to use the Ultra release to rain on AMD's parade, making "another nVidia benchmark record" articles pop up whenever AMD announces new stuff. Maybe this part is just my imagination, but maybe it isn't. They definitely haven't been releasing them to re-take the performance crown, since the GTX models haven't been contested by ATI for quite some time.

    The preceding three variants all have the same features. They are pretty much different only in terms of clock / memory frequencies and possibly memory size. The performance gaps are definitely non-trivial, although not earth-shaking.


    Then, a few months down the line, the fab process will be in order, the capabilities of the chip will be better recognized, and the GT model will come out. The GT model will be FASTER than the GTS (and cheaper, too!). It's a single-slot design with significantly lower heat production and power requirements.

    Due to the time gap, the GT might have new features that the other models don't, which makes for an interesting paradox. The GT may have the best feature list, but it's not the fastest. It doesn't measure up to the GTX or the Ultra in terms of clock speeds, but these are really the best-value cards nVidia offers.

    Not sure if they still do the GS suffix, but it's generally an indication that "this card sucks", and you probably won't find it on a _800 model. This is most likely a card that OEMs provide to uninformed buyers. It has a model # that begins with the same # as the latest revision, and it has a decent memory size, but it's really underpowered.

    As for the mobile prefixes, I have no idea.
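    As a rough sketch, the suffix hierarchy described above (for the 8800 era) can be encoded as data. Both the release order and the performance ranking below come straight from this post's description, not from anything official from nVidia:

```python
# Suffix hierarchy as described in this thread (8800-era nVidia).
# Release order and performance rank are this post's claims, not
# official nVidia documentation.
SUFFIXES = {
    # suffix: (release order within a generation, performance rank; 1 = fastest)
    "GTX":   (1, 2),   # first out: big, hot, tops the benchmarks
    "GTS":   (2, 4),   # binned-down GTX cores, cheaper
    "Ultra": (3, 1),   # cherry-picked chips, edges out the GTX
    "GT":    (4, 3),   # later, refined process: beats the GTS, undercuts it on price
    "GS":    (5, 5),   # OEM bargain-bin tier
}

def faster(a: str, b: str) -> str:
    """Return whichever of two suffixes the thread ranks as faster."""
    return a if SUFFIXES[a][1] < SUFFIXES[b][1] else b

print(faster("GT", "GTS"))    # the later GT outruns the earlier GTS
print(faster("Ultra", "GTX"))
```

Note the quirk the post describes: sorting by release order and sorting by performance give different answers, which is exactly why the GT vs. GTS pricing looks odd.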
  8. Order2Chaos

    Order2Chaos Ultimate... Immortal Administrator

    So, once the GT is out, why do the GTS models still cost so much more?

    And this explains why we've got the 9600GT first - it's NOT a new architecture, just a refinement of the 8800GT. I guess with it being much better than the 8800GS but not quite as good as the 8800GT, they ran out of numbers and suffixes, and they couldn't exactly release it as the 8800 GT 2.0 (Now with Higher Efficiency!) or something.

    They probably could have gotten away with calling it an 8600 Ultra or something like that though.
  9. Powaqqatsi

    Powaqqatsi Haters gonna hate.

    This might yield some insight into your earlier question about memory, O2C.

    Despite having slower core, shader, and memory clocks, the Ultra maintains its lead over the 8800GTS 512. Where does it have a leg up? A 384-bit memory bus vs. 256-bit.
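    The bus-width advantage is easy to quantify: peak memory bandwidth is the bus width (in bytes) times the effective memory clock. The clock figures below are approximate launch specs quoted from memory, so treat the exact numbers as assumptions; the point is how much the 384-bit bus buys the Ultra:

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective memory clock.
# Clock figures are approximate launch specs, quoted from memory --
# the point is the bus-width effect, not the exact numbers.
def bandwidth_gb_s(bus_bits: int, effective_mhz: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9  # bytes/s -> GB/s

ultra = bandwidth_gb_s(384, 2160)   # 8800 Ultra: 384-bit bus, ~2160 MHz effective
gts512 = bandwidth_gb_s(256, 1940)  # 8800 GTS 512: 256-bit bus, ~1940 MHz effective

print(f"Ultra:   {ultra:.1f} GB/s")
print(f"GTS 512: {gts512:.1f} GB/s")
```

    Even with a slower memory clock, the wider bus gives the Ultra roughly two thirds more raw bandwidth.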


    Edit: shit, I hit edit instead of quote without realizing it... old post = lost.

    To restate it, they remade the 8800GTS with the 65nm process used for the 8800GT. You can tell which one is the new one by memory size. 512 is the new one, 640 or 320 is the old.

    More info: http://www.anandtech.com/video/showdoc.aspx?i=3175
    Last edited: Mar 19, 2008
  10. Sean the Puritan

    Sean the Puritan Endut! Hoch Hech!

    Joined:
    Mar 29, 2004
    Messages:
    25,788
    Location:
    Phoenix, AZ
    Ratings:
    +15,703
    Well, my head a'splode!
  11. Order2Chaos

    Order2Chaos Ultimate... Immortal Administrator

    Which explains everything except why the GTS 640 is still more expensive than the GT, or the GTS 320 is only $10-30 cheaper. Chalk it up to irrationality or high RAM prices, I guess.
  12. Powaqqatsi

    Powaqqatsi Haters gonna hate.

    Biggest factor may be uninformed consumers. A lot of folks look at RAM size as the only measure for video card performance. Can't tell you how many times I've seen this on forums:
    "Why is this stupid game so slow on my system?! I have a new processor and enough RAM"

    "What kind of video card do you have?"

    "256 MB"

    ":doh:"
  13. Sokar

    Sokar Yippiekiyay, motherfucker. Deceased Member

    Joined:
    Oct 10, 2004
    Messages:
    14,494
    Location:
    Third stone from the sun
    Ratings:
    +8,351
    I just built a new rig after 5 years and I wish I would have seen this earlier.

    First off, ATI has fallen behind. My last 3 video cards have all been ATI, but they've lost their edge. That's not to say they won't catch up, especially with AMD owning them now and having lots of bread to burn, but for right now, it's Nvidia's game.

    The absolute best bang for the buck right now is an Nvidia 8800GT. It SMOKES.

    Specifically, I got the MSI NX8800GT, which is overclocked from the factory with a core clock of 660 MHz (as opposed to the default 600 MHz). Using RivaTuner, I've jacked it up to a core clock of 720 MHz.

    I benchmarked the system at the native resolution of 1680x1050 and got almost 12,500 in 3DMark06. You would be hard pressed to get that score on most cards without running SLI or Crossfire.

    Keep in mind, to get that score I had to overclock the Q6600 quad core to 3.1 GHz (from 2.4 GHz) and loosen the memory timings to 5-5-5-12.

    The only downside is they don't support DX10.1 yet, but then again, no games really do either.