The Notebook Review forums were hosted by TechTarget, which shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information posted on the forums would be preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    Get it together NVidia (DX10 rant)

    Discussion in 'Hardware Components and Aftermarket Upgrades' started by RaiseR RoofeR, Oct 6, 2007.

  1. RaiseR RoofeR

    RaiseR RoofeR Notebook Consultant

    Reputations:
    11
    Messages:
    171
    Likes Received:
    0
    Trophy Points:
    30
    I don't understand why NVidia's (or ATi's, for that matter) latest top-of-the-line video cards for laptops are unable to play today's games at native resolutions and higher quality settings in DX10. I understand that laptop hardware is still playing "catch-up", but that should only mean that companies like NVidia need to pay more attention to their portable graphics development. Why the hell should a 7-series card be better than all the 8-series cards in terms of performance? There really is no excuse for this.

    I have an 8700M, and I can't play the Crysis beta at native resolution at more than Medium quality settings.

    What I really can't get my head around is why the 8700M doesn't have the memory bus that every card from the 7800GTX up has. It's like they got ****y and said "oh yeah, our new series of mobile cards is so revolutionary we don't even need the extra memory". Why don't they just put it in there anyway? It wouldn't hurt, y'know, especially considering that some 8700M's are being sold for LESS than 7950GTX cards by some vendors. I would not mind paying a little bit more than a 7950GTX costs for a card with a better memory bus.

    Furthermore, they should have gotten the idea that the memory bus really does play a part. 8700M cards in SLI nearly DOUBLE in performance. Bottleneck, much?
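    (A back-of-envelope check of what the narrower bus costs. The clock figures below are approximate published specs for these cards, so treat them as assumptions rather than official numbers.)

        # Peak memory bandwidth: bus width (bits) / 8 * effective clock (MT/s) / 1000, in GB/s.
        # Clock figures are approximate specs of the era, not official numbers.
        def bandwidth_gb_s(bus_bits, effective_mts):
            return bus_bits / 8 * effective_mts / 1000

        print(bandwidth_gb_s(128, 1600))  # 8700M GT, 128-bit @ ~1.6 GHz effective -> 25.6 GB/s
        print(bandwidth_gb_s(256, 1200))  # Go 7950GTX, 256-bit @ ~1.2 GHz effective -> 38.4 GB/s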

    I feel like I should have spent that extra $1500 and gone with the D900C. That 8800M better be worth it, and f-ing soon.
     
  2. powerpack

    powerpack Notebook Prophet

    Reputations:
    7,101
    Messages:
    5,757
    Likes Received:
    0
    Trophy Points:
    0
    I feel your pain! I strongly believe that Nvidia is making a calculated decision to maximize profits. With ATI offering no competition (high end), they have no incentive. Introduce sub-par DX10 and keep the best card, the 7950GTX, as DX9. So you can't have both; you have to choose. Later, when they feel the well is getting dry, they will come out with the high-end DX10 card, and to have it all everyone will have to buy new. "Profits"! It is a very advantageous position for them to not be selling a card that does both DX10 and high end; no accident there!

    Edit: Notebook MFGs are just as responsible as Nvidia for the failure to push for better DX10 cards; it does not hurt them if you have to buy a new computer in a year. Not like desktops, where you can just swap out.
     
  3. RaiseR RoofeR

    RaiseR RoofeR Notebook Consultant

    Reputations:
    11
    Messages:
    171
    Likes Received:
    0
    Trophy Points:
    30
    I might sell my Clevo M570RU and go for the XPS 1730... I heard those dual 8700s have some seriously big muscle, and with the X7900 able to overclock to 3.4GHz... just wow.
     
  4. BenArcher

    BenArcher Notebook Consultant

    Reputations:
    20
    Messages:
    189
    Likes Received:
    0
    Trophy Points:
    30
    Ever think there may be issues trying to get a graphics card that can draw nearly 200 watts of power into a laptop?

    Seriously, the only graphics cards that give good DX10 performance at the moment are the desktop 8800GTS, GTX & Ultra. OK, now I'm not sure about the GTS, but I have an 8800GTX in my PC; it weighs about 900 grams and is as long as my laptop is wide. Then take into account its ability to draw nearly 200 watts of power and hit 80+ degrees in a well-cooled PC with a huge heat sink, and you have something that is never going to work in a laptop.

    No matter what they do, you're not going to get that kind of performance in a laptop for quite a while to come.

    You could also think about it like this. Even if you double the stream processors and memory bandwidth in the 8700M, you end up with a card that is still a lot slower than a desktop 8800GTS and, in raw processing power, close to half the speed of the 8800GTX. Only problem is you get twice the power draw and twice the heat, and in a laptop that doesn't work really well.
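    (That halving estimate roughly checks out against the commonly cited stream processor counts. A quick sanity check only; it ignores clocks, bandwidth and architecture differences.)

        # Rough raw-throughput comparison by stream processor count alone.
        sp = {"8700M GT": 32, "8800GTS (desktop)": 96, "8800GTX (desktop)": 128}

        doubled = sp["8700M GT"] * 2              # hypothetical 8700M with twice the units
        print(doubled / sp["8800GTX (desktop)"])  # 0.5   -> about half an 8800GTX
        print(doubled / sp["8800GTS (desktop)"])  # ~0.67 -> still well short of a GTS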
     
  5. powerpack

    powerpack Notebook Prophet

    Reputations:
    7,101
    Messages:
    5,757
    Likes Received:
    0
    Trophy Points:
    0
    You could do that, but a fellow on this site was looking at one and I priced it at "xotic.com" $500 cheaper. It is 8700GT SLI, and where Dell is 2x256MB, this is 2x512MB. It is a Sager; I think Dell & Alienware are overpriced!
     
  6. powerpack

    powerpack Notebook Prophet

    Reputations:
    7,101
    Messages:
    5,757
    Likes Received:
    0
    Trophy Points:
    0
    Ben, why do you have to always bring logic into the equation?
    That said, I see your point, but don't two 8700s start to draw a lot of power? Also heat?
     
  7. BenArcher

    BenArcher Notebook Consultant

    Reputations:
    20
    Messages:
    189
    Likes Received:
    0
    Trophy Points:
    30
    Dunno, logic just seems sensible to me. I might have worded it badly, but part of my point is that two 8700s do draw a lot of power and heat and still don't perform anywhere near high-end PC standards.
     
  8. Lithus

    Lithus NBR Janitor

    Reputations:
    5,504
    Messages:
    9,788
    Likes Received:
    0
    Trophy Points:
    205
    You can't really compare desktops and laptops. A desktop will outperform a laptop any day.

    Anyways, to address the OP: DX10 just started, and it's going to take some time to catch on. Right now, the 8000 series nVidia cards are just better-performing DX9 cards with "DX10 capability" thrown in just for kicks. So far, I haven't seen anything much better from DX10 than from DX9 in-game. Most of the special effects are very subtle, and I'd rather have a smoothly running game than turn on that little bit of eye candy.

    The 8000 series has lived up to expectations so far. Remember that the true high-end 8800s, and maybe 8900s, haven't come out yet.
     
  9. K-TRON

    K-TRON Hi, I'm Jimmy Diesel ^_^

    Reputations:
    4,412
    Messages:
    8,077
    Likes Received:
    2
    Trophy Points:
    205
    Not true, my laptop can and does outperform about 95% of all desktops. My laptop is a portable server replacement, and I know through 3DMark that my Quadro FX2500M outperforms my brother's XFX 8800GTS. Don't tell me that laptops are underpowered. My Voodoo consumes nearly 400 watts of power, and it is all put to good use. I know that my Voodoo will chew up and destroy anything that is thrown at it. My two Hitachi HDDs in RAID give me an average speed of 98.7 MB/s, which is higher than most desktop hard drives.
    The point is that directx 10 sucks and 9 is better.
    My laptop kicks ass, but then again it is 17lbs.

    Your reasoning makes sense for all but the D900T, K, and C Clevos.

    Every other laptop is underpowered.

    K-TRON
     
  10. Lithus

    Lithus NBR Janitor

    Reputations:
    5,504
    Messages:
    9,788
    Likes Received:
    0
    Trophy Points:
    205
    Your laptop cost $4000. Give me $4000 and I'll make a better-performing desktop, and rent a room to put it in. You have to compare with equal $$$.
     
  11. K-TRON

    K-TRON Hi, I'm Jimmy Diesel ^_^

    Reputations:
    4,412
    Messages:
    8,077
    Likes Received:
    2
    Trophy Points:
    205
    I wish my laptop cost $4000; it was nearly double that.

    Most of the money is spent not on the computer, but on buying the Voodoo name, and membership into their community.

    K-TRON
     
  12. Jalf

    Jalf Comrade Santa

    Reputations:
    2,883
    Messages:
    3,468
    Likes Received:
    0
    Trophy Points:
    105
    Or rather, games try to make the best possible graphics. The faster the GPU gets, the more demanding graphics developers will come up with. So yes, the GPU will always be too slow to run everything at max settings.

    Or perhaps it should only mean that if you want an extreme high end system, you have to get a desktop, where, you know, extreme high-end performance is actually possible.
    How would you like a mobile GPU that eats 150W? That alone could drain most laptop batteries in 30 minutes.
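    (The arithmetic is straightforward. A sketch assuming a large ~80 Wh notebook battery of the era and a rough 40 W for the rest of the system; both figures are illustrative assumptions.)

        # Runtime in minutes = battery capacity (Wh) / total draw (W) * 60.
        battery_wh = 80      # assumed large notebook battery
        gpu_w = 150          # the hypothetical desktop-class mobile GPU
        rest_w = 40          # rough guess for CPU, screen, drives, etc.
        print(battery_wh / (gpu_w + rest_w) * 60)  # ~25 minutes, ignoring conversion losses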

    But of course you've developed so many GPUs that you know all about how it could be done, because otherwise you wouldn't be in a position to criticise NVidia or ATI.

    Because you're comparing the high end of the 7 series with the mid-range of the 8? It took a while before the high-end 7s came out in mobile versions too. They weren't there from day one. And with the 8 series, it's exactly the same. They release midrange cards first, and at some point they're going to add something like the 8800s. Because they have to, you know, work with the limitations laptops have. (Such as the fact that 150W power consumption is unacceptable.)

    Which native resolution, specifically? And does NVidia have an obligation to make everything run smoothly at *your* native resolution?
    Perhaps, if you want things to run smoothly, you should run them at a lower resolution. Just like desktop users do, just like other laptop users do. If you want to run everything smoothly at native resolution, then you need a screen without too high a native resolution. Big surprise...

    Other than the bigger, more complex and harder to manufacture chip, the expensive, power-hungry extra memory and the more complex circuit board, you're absolutely right.

    Apart from the fact that the GPU would run hotter, use more memory and be far more expensive, you're right, it wouldn't hurt at all. In the same way that jumping from the roof of a building doesn't hurt except for the landing bit.

    I feel like you should 1) spend the next 5 years designing GPUs, since obviously NVidia could learn a lot from you. And 2) I feel like you should buy a desktop if you want high-end gaming. No matter how many billions NVidia pumps into mobile GPUs, and no matter how much geniuses like you are able to tell them what they're doing wrong, they still have to follow some fundamental rules when building mobile hardware.
    All laptops have lousy cooling compared to a desktop. Cram a desktop card in there, even if we ignore battery life, and it'd more or less melt your laptop.
    All laptops have very limited battery life (and limited power supplies that can't pump out 400W even on AC power).

    They also don't have much physical room. Have you seen the size of an 8800GTX card? It's far too big to fit in a laptop.

    No ****! You mean they want to make money? Gosh, I'd never have thought that. What kind of company actually wants to make money?
    But even though that shocking revelation is no doubt going to revolutionize the world, it doesn't mean that NVidia "could make an 8800 mobile if they wanted to". They still have to work within the limitations of notebooks.
     
  13. Lithus

    Lithus NBR Janitor

    Reputations:
    5,504
    Messages:
    9,788
    Likes Received:
    0
    Trophy Points:
    205
    It was a rant; you didn't need to go in there and refute every point. Last time I checked, rants don't need to be accurate. Heck, they don't even need to be coherent.
     
  14. RaiseR RoofeR

    RaiseR RoofeR Notebook Consultant

    Reputations:
    11
    Messages:
    171
    Likes Received:
    0
    Trophy Points:
    30
    That was the intention. Just a rant. *sigh* whatever...

    Jalf, don't be a prick. I don't need your smartass rebuttals polluting this thread.

    Yes, if you crammed an 8800GTX into a laptop it would melt, hog the battery, etc. I don't give a ****. It still sucks.
     
  15. jessi3k3

    jessi3k3 Notebook Evangelist

    Reputations:
    211
    Messages:
    520
    Likes Received:
    0
    Trophy Points:
    30
    Actually, Jalf had every right to do that. Always expect a rebuttal when you rant. It always happens.
     
  16. B2TheEYo

    B2TheEYo Notebook Deity

    Reputations:
    141
    Messages:
    939
    Likes Received:
    0
    Trophy Points:
    0
    Nice rant, I'll agree, but the truth sucks, and Jalf just dumped a pile of crap on our dreams for a better graphics solution.

    If nVidia had competition, I can bet you they'd be tossing out better and better solutions every week like they do on desktops, each one being 10MHz faster than the next. Typical.

    Laptops are coming closer and closer to being able to swap out cards, but it's still a far off dream.

    If you want great native resolution pleasure, get a frigging 1280x800 res screen; like it freaking matters if you're at one res or another in a game, it all looks the same at any native resolution.

    Bottom line - enjoy the damn computer and the gameplay not the stupid eye candy, it's just a game...

    P.S. Where the hell did you get the Crysis beta? Been waiting for the demo, but of course, it's delayed.
     
  17. powerpack

    powerpack Notebook Prophet

    Reputations:
    7,101
    Messages:
    5,757
    Likes Received:
    0
    Trophy Points:
    0
    Jalf, you mean little Elf! I said "maximize" profits; you conveniently leave that out of your response. You also quoted me out of context, as you dropped much of what I said (not the first time), specifically the lack of competition from ATI in the high-end market. I stand by my first post in its entirety; your hen-pecking is just that! Your implication that I am against profits is a fantasy created in your own mind; in the future, try to read more than one sentence before drawing conclusions. I said what I believe, I clearly stated it is what I believed, and it is valid even if incorrect! Which at this point I will not concede!

    This is not the place or time to argue larger issues that we might fall on different sides of. But let it be clear: I argued for competition and you argued profits. Go figure! :eek:
     
  18. JCMS

    JCMS Notebook Prophet

    Reputations:
    455
    Messages:
    4,674
    Likes Received:
    0
    Trophy Points:
    105
    It's still a beta, dammit.

    If your native resolution is 1680x1050 or 1920x1200, you sure won't max it. Even Oblivion, which is almost 2 years old, needs an 8800GTX to be maxed at 1920x1200.

    The 7800GTX has a 256-bit bus, but the 256MB version is only at 1GHz, so memory bandwidth is the same as the 8600GTS.

    The 7800GTX 256MB has more bandwidth than the 8600GT, but the 8600GT takes AA & resolution with less of a drop than the 7800 due to its increased shader count.
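    (Taking the post's figures at face value, the arithmetic does come out even. A quick check with the same peak-bandwidth formula; the effective clocks here are the post's numbers, assumed rather than verified.)

        # Peak bandwidth = bus width (bits) / 8 * effective clock (MT/s) / 1000, in GB/s.
        print(256 / 8 * 1000 / 1000)  # 7800GTX 256MB: 256-bit @ 1.0 GHz effective -> 32.0
        print(128 / 8 * 2000 / 1000)  # 8600GTS:       128-bit @ 2.0 GHz effective -> 32.0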


    If mid-range GPUs could max games, why would people buy $500 high-end GPUs?

    And yes, the 8700M GT isn't high end; it's even a bit slower than the desktop 8600GT.
     
  19. Kimber 1911

    Kimber 1911 Notebook Consultant

    Reputations:
    4
    Messages:
    109
    Likes Received:
    0
    Trophy Points:
    30
    I suppose you could take a desktop, turn it on its side, weld a monitor to it, and power it with a marine battery, oh yeah, and put a handle on it and call it a laptop! Then you could have dual 8800 Ultras.
     
  20. fabarati

    fabarati Frorum Obfuscator

    Reputations:
    1,904
    Messages:
    3,374
    Likes Received:
    0
    Trophy Points:
    105
    3DMark is not a very good indicator of performance... An 8800GTS is faster than a Go 7900GTX (which the FX2500M is based on). If in real-life gaming this is not so in your particular case, your brother's 8800GTS is underperforming.

    That's actually not true. The first GeForce 7 Go card was the Go 7800GTX (or possibly the vanilla Go 7800). It arrived a few months before the lower-end ones (in 2005 as opposed to 2006).

    Nice rant though, Jalf. Very offensive :D (not a shot at you. I liked it).
     
  21. powerpack

    powerpack Notebook Prophet

    Reputations:
    7,101
    Messages:
    5,757
    Likes Received:
    0
    Trophy Points:
    0
    Sweden and Denmark lovefest tomorrow at 11:00. :D
     
  22. RaiseR RoofeR

    RaiseR RoofeR Notebook Consultant

    Reputations:
    11
    Messages:
    171
    Likes Received:
    0
    Trophy Points:
    30
    FilePlanet offered beta keys to its members. The offer's filled.

    Yes, I'll admit it's 1920x1200.

    Just installed Company of Heroes today, which I figured, being a year-old game, would run... y'know... nicely enough. With max settings I got a lovely 9fps average. Sweet.
     
  23. The Forerunner

    The Forerunner Notebook Virtuoso

    Reputations:
    1,105
    Messages:
    3,061
    Likes Received:
    0
    Trophy Points:
    105
    It's a well-known fact that the Company of Heroes DX10 patch is shoddy. It's a patch, ffs; how can it compare to a ground-up DX10 game? It doesn't even add much in terms of visual aesthetics (again, it's a patch). Now look at the new Company of Heroes: Opposing Fronts and the DX10 optimization is clear.
     
  24. JCMS

    JCMS Notebook Prophet

    Reputations:
    455
    Messages:
    4,674
    Likes Received:
    0
    Trophy Points:
    105

    1920x1200.... 1680x1050 should give you like 2x that speed....

    You do know that 1920x1200 is bigger than 1080p, don't you?
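    (For perspective, the raw pixel counts. Frame rate scales at best roughly with pixel count when GPU-bound, so treat the ratios as ceilings rather than guaranteed speedups.)

        # Raw pixels per frame; a rough upper bound on resolution scaling.
        full = 1920 * 1200   # 2,304,000 pixels
        hd   = 1920 * 1080   # 2,073,600 pixels (1080p)
        mid  = 1680 * 1050   # 1,764,000 pixels
        print(full / hd)     # ~1.11: about 11% more pixels than 1080p
        print(full / mid)    # ~1.31: about 31% more pixels than 1680x1050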

    And if you're on DX10, that's normal; DX10 CoH is just a perf killer.
     
  25. narsnail

    narsnail Notebook Prophet

    Reputations:
    2,045
    Messages:
    4,461
    Likes Received:
    1
    Trophy Points:
    106
    Just wait for the 9 series; it will be out for sure before next year, if not next month.

    And all they need to do is throw a 256-bit bus in an 8700GT and it would be better than a 7950GTX. Just be patient, I'm sure they will come out with something much better and very powerful, they always do :)
     
  26. The Forerunner

    The Forerunner Notebook Virtuoso

    Reputations:
    1,105
    Messages:
    3,061
    Likes Received:
    0
    Trophy Points:
    105
    The main thing is that it's not nVidia, it's developers that need to harness DX10 tech.
     
  27. STEvil

    STEvil Notebook Consultant

    Reputations:
    119
    Messages:
    216
    Likes Received:
    0
    Trophy Points:
    30
    Should get him to turn off FSAA, or stop resetting his vendor and device ID so that his 8400 shows up as an 8800... rofl.
     
  28. BenArcher

    BenArcher Notebook Consultant

    Reputations:
    20
    Messages:
    189
    Likes Received:
    0
    Trophy Points:
    30
    Just for everyone's information: the 9 series cards aren't really a new generation of cards. They are just a die shrink of the 8 series, and initially there will only be midrange models; nothing to better the 8800GTX or Ultra for another year or so.

    http://www.penstarsys.com/#gfx_shuffle
     
  29. RaiseR RoofeR

    RaiseR RoofeR Notebook Consultant

    Reputations:
    11
    Messages:
    171
    Likes Received:
    0
    Trophy Points:
    30
    I haven't paid that much attention to it. How well does it run by comparison?

    I've run the Crysis beta at very low resolutions, like 1280x1024, and I still didn't notice that much of a performance gain. I heard some laptops actually have a little bit of a performance gain at native resolutions (meaning better than *predicted*).

    I do know that 1920x1200 is better than 1080p. I actually got that resolution specifically so it's capable of playing high-def content.
     
  30. Kierkes

    Kierkes Misanthrope

    Reputations:
    186
    Messages:
    855
    Likes Received:
    0
    Trophy Points:
    30
    These threads make me dream of a laptop with yotta-everything in its specs that consumes 1W of power. *sigh*
     
  31. Inkjammer

    Inkjammer Notebook Deity

    Reputations:
    205
    Messages:
    717
    Likes Received:
    0
    Trophy Points:
    30
    And with a power brick like a cinder block, complete with miniature smokestacks for heat exchange. It can power an uber-laptop... or one Xbox.
     
  32. wolfraider

    wolfraider Grand Viezir of Chaos

    Reputations:
    193
    Messages:
    957
    Likes Received:
    0
    Trophy Points:
    30
    That is actually a good idea, but unfortunately no one is buying it!!

    The HDX and XPS M2010 are the best examples; although they use mobile components, they weigh a lot!!
     
  33. Kierkes

    Kierkes Misanthrope

    Reputations:
    186
    Messages:
    855
    Likes Received:
    0
    Trophy Points:
    30
    Lol. Either way, people are making components smaller and smaller, and less and less hot. Eventually, and I mean in the near-to-mid future, due to the exponential growth rate, we're going to see just about everything in tera. :) Then I'll come back and play Crysis and say, "God. The graphics in this game are ancient."