The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    ATi preparing for Tessellation Benchmark war

    Discussion in 'Gaming (Software and Graphics Cards)' started by ziddy123, May 15, 2010.

  1. ziddy123

    ziddy123 Notebook Virtuoso

    Reputations:
    954
    Messages:
    2,805
    Likes Received:
    1
    Trophy Points:
    0
    10.5 will rock!!!! - Guru3D.com Forums

    ATi is now working to assist its dedicated tessellation unit with its stream processors.

    We have already seen that in some instances ATi's stream processors destroy CUDA processors.

    Nvidia did a great job of enabling tessellation, but what they do is emulation: they emulate tessellation on their CUDA cores.

    Taking this cue, just think of ATi... dedicated tessellation unit + emulated tessellation on stream processors.

    This should be interesting.

    I don't expect huge gains for the HD5870M, but for the HD5970, which should be the card compared against the GTX480, expect some gains. I would also expect to see this idea implemented on the next generation, the 6xxx series. So this is exciting news for more powerful cards like the HD5870 with 1,600 stream processors and the HD5970, but more so for the future.
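    For context, here is a rough illustrative sketch (not ATI's or Nvidia's actual hardware or driver code) of the core job a tessellator performs, whether in a fixed-function unit or emulated on stream/CUDA processors: subdividing geometry by a tessellation factor.

```python
# Illustrative sketch only: uniform edge subdivision, the basic operation
# a tessellator performs (in dedicated hardware or emulated on shader cores).
def tessellate_edge(v0, v1, factor):
    """Split the edge v0 -> v1 into `factor` segments and return the
    interpolated vertices, including both endpoints."""
    return [tuple(a + (b - a) * i / factor for a, b in zip(v0, v1))
            for i in range(factor + 1)]

# A tessellation factor of 4 turns one edge into 4 segments (5 vertices);
# real tessellators do this across whole patches, generating far more
# triangles than the application submitted.
points = tessellate_edge((0.0, 0.0), (1.0, 0.0), 4)
```

    The point of the thread is simply where this work runs: a dedicated unit does it for free, while emulation spends general-purpose shader cycles on it.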
     
  2. LaptopNut

    LaptopNut Notebook Virtuoso

    Reputations:
    1,610
    Messages:
    3,745
    Likes Received:
    92
    Trophy Points:
    116
    Very interesting; the more competition the better, too. Hopefully it won't just be useful in benchmarks.
     
  3. wishmaster.dj

    wishmaster.dj Notebook Evangelist

    Reputations:
    54
    Messages:
    515
    Likes Received:
    0
    Trophy Points:
    30
    Hopefully this kicks Nvidia in the rear and makes them work harder.
     
  4. PurpleSkyz

    PurpleSkyz Notebook Evangelist

    Reputations:
    103
    Messages:
    308
    Likes Received:
    0
    Trophy Points:
    30
    And while they do that, my 2 x 5870 still can't enable VSync in Eyefinity, making all Eyefinity games unplayable for anyone using a $1,000 crossfire setup.

    But oh goody, I get tessellation performance...
     
  5. Pitabred

    Pitabred Linux geek con rat flail!

    Reputations:
    3,300
    Messages:
    7,115
    Likes Received:
    3
    Trophy Points:
    206
    No vsync makes a game unplayable? Since when? Perhaps there's a technical reason, like the fact that all the monitors aren't syncing at the same time as it is, so enabling vsync might actually increase flicker?
     
  6. tianxia

    tianxia kitty!!!

    Reputations:
    1,212
    Messages:
    2,612
    Likes Received:
    0
    Trophy Points:
    55
    i want a price war rather than this.
     
  7. PurpleSkyz

    PurpleSkyz Notebook Evangelist

    Reputations:
    103
    Messages:
    308
    Likes Received:
    0
    Trophy Points:
    30
    Google this:

    Crossfire eyefinity V-sync

    Yes, it makes games unplayable. Not having vsync in crossfire increases stutter to an unplayable level; an 80 fps game can feel like 20 fps.

    I paid massive money for my setup, and you can be sure I spent countless days trying to fix this kind of crap, so I am pretty well informed.

    No, it is not a technical reason like you say. In fact, V-Sync DOES actually work, it is just capped at double the monitor's refresh rate (120). It is an ATI bug that most people owning crossfire + Eyefinity want fixed.

    Google it, read, enjoy.

    EDIT: V-Sync has a much more positive effect than just removing tearing. It makes the whole gaming experience smoother: it evens out the rate at which your card spits out images, and thus if your card is capable of playing at a constant 60+ fps (or whatever your monitor's refresh rate is), it removes any kind of microstutter (even MORE SO with crossfire!)

    I have friends who play games at 25 fps without vsync and find the game perfectly acceptable smoothness-wise, and then there are people like me who cannot stand the slightest microstutter. That is the reason people like me pay top cash for top hardware.

    V-Sync has always been an issue for ATI even on single monitor.

    I also found that crossfire and a single monitor on the top DVI port = 120 fps vsync cap, but plug it into the bottom DVI port and you get a 60 fps vsync cap. Yes, it's a bug.
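    The frame-pacing point above can be made concrete with a small sketch (illustrative numbers only, not measurements from any real setup): two runs with the same average fps, where the unevenly paced one is dominated by its slow frames.

```python
# Illustrative sketch: why an "80 fps average" can feel far slower.
# Frame times in milliseconds for 8 frames; same average, different pacing.
even_frames   = [12.5] * 8                     # perfectly paced
uneven_frames = [2, 23, 2, 23, 2, 23, 2, 23]   # microstutter-style pattern

def avg_fps(frame_times_ms):
    # What a benchmark counter reports.
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def perceived_fps(frame_times_ms):
    # Perceived smoothness tracks the worst frames, not the average.
    return 1000.0 / max(frame_times_ms)

# Both runs average 80 fps, but the uneven run's 23 ms frames
# put its effective smoothness in the ~43 fps range.
```

    Vsync helps here because it forces frames onto the display's fixed refresh grid, evening out exactly this kind of delivery.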
     
  8. DEagleson

    DEagleson Gamer extraordinaire

    Reputations:
    2,529
    Messages:
    3,107
    Likes Received:
    30
    Trophy Points:
    116
    @PurpleSkyz

    Love the Eyefinity setup you have. :)

    And I hope AMD gets the driver s**t together and finally makes working V-Sync and scaling options.
     
  9. HTWingNut

    HTWingNut Potato

    Reputations:
    21,580
    Messages:
    35,370
    Likes Received:
    9,878
    Trophy Points:
    931
    Nice setup there PurpleSkyz. Wish I had the time to organize my crap that well. :) I've gotten over my OCD since I had kids though. Forget about organization, just trying to survive! LOL.
     
  10. ViciousXUSMC

    ViciousXUSMC Master Viking NBR Reviewer

    Reputations:
    11,461
    Messages:
    16,824
    Likes Received:
    76
    Trophy Points:
    466
    I think that is the standard microstutter you're talking about, not v-sync, and it exists in all dual-card setups, including Nvidia's.

    I recall using vsync just fine with my dual 4850's before I got a single 5870 to replace them.
     
  11. nobodyshero

    nobodyshero Notebook Speculator

    Reputations:
    511
    Messages:
    879
    Likes Received:
    0
    Trophy Points:
    30
    +1 Vicious....

    @tianxia what do you want in the price war? The 5870's are looking to be $200-300 cheaper on a single-card basis compared to the coming competitor, the GTX 480, and $400-500 cheaper at the dual-card level.
     
  12. tianxia

    tianxia kitty!!!

    Reputations:
    1,212
    Messages:
    2,612
    Likes Received:
    0
    Trophy Points:
    55
    i want a price war like the last gen.
     
  13. ViciousXUSMC

    ViciousXUSMC Master Viking NBR Reviewer

    Reputations:
    11,461
    Messages:
    16,824
    Likes Received:
    76
    Trophy Points:
    466
    ATI is spanking Nvidia right now: the cards are cheaper, run cooler, and use less power, while staying very close in performance.

    Plus ATI has Eyefinity! :D

    This is the desktop version; I don't know how they are going to put the new Fermi into a mobile product... Well, I know how, but the results won't be pretty.

    It would be big, heavy, loud, have horrible battery life, and cost a lot. The benefit is maybe you can cook on it.

    That is, of course, unless they do again what they did with the last card set and just name it a 480 when it's actually an overclocked 8800 or something.
     
  14. PurpleSkyz

    PurpleSkyz Notebook Evangelist

    Reputations:
    103
    Messages:
    308
    Likes Received:
    0
    Trophy Points:
    30
    Noes, I have no micro "studder" :p (I think you didn't grasp that it is a bug with EYEFINITY ENABLED; also, microstutter is not present like it used to be on my 6800gt SLI.) (And IF IT IS microstutter, and it IS FIXED by enabling v-sync, then shouldn't ATI HURRY THE EF UP and give us an eyefinity + crossfire v-sync fix?)

    Crossfire or not, eyefinity or not, with the exception of Quake Live (competitive FPS and vsync = mouse lag = ewww), I will not ever play without vsync because:

    a) How can you spend 6+ grand on a PC and put up with a tearing line right in the middle of your screen?
    b) As I said, I don't care about having 200000 FPS if the card spits them out unevenly (single card or dual card, it has always been like that since 1998). But like I said, some people just don't seem to see certain types of stutter, and some don't even care. I envy those people.

    I actually haven't noticed any microstutter on my 5870 x 2.

    Eyefinity + crossfire = no v-sync; it's a bug and I want it fixed NOWz! :p

    V-sync works with eyefinity on a single card though, so it isn't a monitor technical problem. I just think it's a bug of two cards together doubling the vsync fps cap or something silly like that.
     
  15. PurpleSkyz

    PurpleSkyz Notebook Evangelist

    Reputations:
    103
    Messages:
    308
    Likes Received:
    0
    Trophy Points:
    30
    I vote for ATI expanding their two-man development team and buying out NVIDIA's programming department.
     
  16. PurpleSkyz

    PurpleSkyz Notebook Evangelist

    Reputations:
    103
    Messages:
    308
    Likes Received:
    0
    Trophy Points:
    30
    OCD, me? How do you know? Who told you?... So the cure is kids? Let me get right on that! :p
     
  17. Botsu

    Botsu Notebook Evangelist

    Reputations:
    105
    Messages:
    624
    Likes Received:
    0
    Trophy Points:
    30
    ATI 5970M?? Wait, why are so many people talking about a fantasy GPU that has neither been announced nor hinted at?
     
  18. PurpleSkyz

    PurpleSkyz Notebook Evangelist

    Reputations:
    103
    Messages:
    308
    Likes Received:
    0
    Trophy Points:
    30
    Crossfire 5870 desktop. I guess this is a laptop forum; maybe I shouldn't talk about anything desktop :/
     
  19. Botsu

    Botsu Notebook Evangelist

    Reputations:
    105
    Messages:
    624
    Likes Received:
    0
    Trophy Points:
    30
    Oh, my bad, I thought we were talking about a possible Mobility 5970, because I've already seen so many people convinced that one will be released in the future when there seems to be no reason to think so.
     
  20. catacylsm

    catacylsm Notebook Prophet

    Reputations:
    423
    Messages:
    4,135
    Likes Received:
    1
    Trophy Points:
    106
    Edit: Does this mean stream-compatible cards will get tessellation support to some minor extent?

    I mean, I know it's to aid the tessellation cores/units,

    but if Nvidia is emulating it over CUDA, what's to stop ATI from doing the same? :)
     
  21. wHo0p3r

    wHo0p3r Notebook Consultant

    Reputations:
    74
    Messages:
    123
    Likes Received:
    0
    Trophy Points:
    30
    I think you're too late to reply to the original topic; they've already moved on to another topic, something about a desktop card having some issues (lol).

    Great way to stay on topic, two thumbs up!!! :p


    No offence to anyone, I am just being sarcastic :D
     
  22. Pitabred

    Pitabred Linux geek con rat flail!

    Reputations:
    3,300
    Messages:
    7,115
    Likes Received:
    3
    Trophy Points:
    206
    Sarcasm is a bannable offense ;) It's hard to say if the older series will get tessellation support. It's possible, but they're also running a lot fewer cores in general than their 5000-series counterparts, so it may not be worthwhile.
     
  23. mobius1aic

    mobius1aic Notebook Deity NBR Reviewer

    Reputations:
    240
    Messages:
    957
    Likes Received:
    0
    Trophy Points:
    30
    I think tessellation support first existed in DX11, so unless specific games running in DX10 or DX10.1 were built to take advantage of the tessellation capabilities of the 2xxx/3xxx/4xxx cards (of which there were none), it would be pretty much moot to try. It's a standard DX11 feature. I don't think even the DX superset present in the Xbox 360 takes the tessellation hardware in the Xenos GPU into account. Game developers have to custom-code for it, though it has seen some use in 360 games like Halo Wars. I think it was present in Forza 3, and Halo 3 used tessellation for its water system, which looked very good even by today's standards.
     
  24. Pitabred

    Pitabred Linux geek con rat flail!

    Reputations:
    3,300
    Messages:
    7,115
    Likes Received:
    3
    Trophy Points:
    206
    Official DirectX tessellation is DX11-only, but you are correct that ATI has had tessellation support in its chips for a while. The problem is that the DX11 implementation is slightly different, so you can't really use the older cards for DX11 currently. It would be awesome if new drivers changed that, though, and made the older cards DX11-compatible.
     
  25. funky monk

    funky monk Notebook Deity

    Reputations:
    233
    Messages:
    1,485
    Likes Received:
    1
    Trophy Points:
    55
    Running DX11 on a DX10 card would be awesome. However, what it seems like to me is that ATI and nVidia are just competing for benchmarks and optimising their cards for them, rather than doing what they should and focusing on gaming.

    Also, I take it this new tech is exclusive to the 4xxx series and later?
     
  26. unknown555525

    unknown555525 rawr

    Reputations:
    451
    Messages:
    1,630
    Likes Received:
    0
    Trophy Points:
    55
    ATi has had tech demos for tessellation on cards dating back to the HD 2xxx era.

    But then again, nVidia also has tech demos for ray tracing on their GTX480 cards, yet ray tracing probably won't make its way into DirectX for a few more years, and it will be a DX12 hardware requirement or something.
     
  27. kondor999

    kondor999 Notebook Consultant

    Reputations:
    75
    Messages:
    129
    Likes Received:
    2
    Trophy Points:
    31
    You are 110% right. This is a game-destroying bug that the review sites never seem to goddamn notice. If you're not just benchmarking but actually want to *play*, then vsync is an absolute must. I get 50-90 fps in BFBC2 at 5760x1200 with 2x 5870's, but it often feels like less than 30 because it's so jerky.

    ATI, fix this very important and basic issue. Leave the meaningless features like PhysX, 3D, and CUDA blah blah to Nvidia.

     
  28. lackofcheese

    lackofcheese Notebook Virtuoso

    Reputations:
    464
    Messages:
    2,897
    Likes Received:
    0
    Trophy Points:
    55
    The main problem is that there's much more to DX11 than just tessellation.

    In any case, I'm looking forward to seeing the kind of difference this makes. I suspect that this will eat up quite a lot of Nvidia's lead in some DirectX 11 benchmarks and games, and bring the performance difference much more in line with what you see in DirectX 10 games.