I don't understand why NVidia's (or ATi's for that matter) latest top-of-the-line video cards for laptops are unable to play today's games at native resolutions and higher quality settings for DX10. I understand that laptop hardware is still playing "catch-up", but that should only mean that companies like NVidia should just put more attention into their portable graphics development. Why the hell should a 7-series card be better than all the 8-series cards in terms of performance? There really is no excuse for this.
I have an 8700m, and I can't play the Crysis beta at native resolution at more than Medium quality settings.
What I really can't get my head around is why the 8700m doesn't have the memory bus that every card from the 7800GTX up has. It's like they got ****y and said "oh yeah, our new series of mobile cards is so revolutionary we don't even need the extra memory". Why don't they just put it in there anyways? It wouldn't hurt, y'know, especially considering that some 8700m's are being sold for LESS than 7950GTX cards with some vendors. I would not mind paying a little bit more than a 7950GTX for a better memory bus.
Furthermore, they should have gotten the idea that the memory bus really does play a part. 8700m cards in SLI nearly DOUBLE in performance. Bottleneck, much?
I feel like I should have spent that extra $1500 and gone with the D900C. That 8800m better be worth it, and f-ing soon.
-
-
I feel your pain! I strongly believe that Nvidia is making a calculated decision to maximize profits. With ATI offering no competition at the high end, they have no incentive. Introduce sub-par DX10 and keep the best card, the 7950GTX, as DX9. So you can't have both; you have to choose. Later, when they feel the well is getting dry, they will come out with the high-end DX10 card, and to have it all everyone will have to buy new. "Profits"! It is a very advantageous position for them to not be selling a card that does both DX10 and high end, no accident there!
Edit: Notebook MFG's are just as responsible as Nvidia for the failure to push for better DX10 cards. It doesn't hurt them if you have to buy a new computer in a year, not like desktops where you can just swap out a card. -
I might sell my Clevo m570ru and go for the XPS 1730... those dual 8700s I heard have some seriously big muscle, and the X7900 being able to overclock to 3.4GHz... just wow.
-
Ever think there may be issues trying to get a graphics card that can draw nearly 200 watts of power into a laptop?
Seriously, the only graphics cards that give good DX10 performance at the moment are the desktop 8800GTS, GTX, and Ultra. OK, now I'm not sure about the GTS, but I have an 8800GTX in my PC; it weighs about 900 grams and is as long as my laptop is wide. Then take into account its ability to draw nearly 200 watts of power and hit 80+ degrees in a well-cooled PC with a huge heat sink, and you have something that is never going to work in a laptop.
No matter what they do, you're not going to get that kind of performance in a laptop for quite a while to come.
You could also think about it like this. Even if you double the stream processors and memory bandwidth in the 8700m, you end up with a card that is still a lot slower than a desktop 8800GTS and, in raw processing power, getting close to half the speed of the 8800GTX. The only problem is you get twice the power draw and twice the heat, and in a laptop that doesn't work really well. -
You could, but a fellow on this site was looking at one, and I priced one at "xotic.com" $500 cheaper. It is 8700GT SLI, and where Dell is 2x256MB, this is 2x512MB. It is a Sager; I think Dell & Alienware are overpriced!
-
Ben, why do you always have to bring logic into the equation?
That said, I see your point, but don't two 8700's start to draw a lot of power? Also heat? -
Dunno, logic just seems sensible to me. I might have worded it badly, but part of my point is that two 8700's do draw a lot of power and heat and still don't perform anywhere near high-end PC standards.
-
You can't really compare desktops and laptops. A desktop will outperform a laptop any day.
Anyways, to address the OP, DX10 just started, and it's going to take some time to catch on. Right now, the 8000 series nVidia cards are just better-performing DX9 cards with "DX10 capability" thrown in just for kicks. So far, I haven't seen anything much better from DX10 than from DX9 in-game. Most of the special effects are very subtle, and I'd rather have a smoothly running game than turn on that little bit of eye candy.
The 8000 series has lived up to expectations so far. Remember that the true high-end 88's and maybe 89's haven't come out yet. -
Not true, my laptop can and does outperform about 95% of all desktops. My laptop is a portable server replacement, and I know through 3DMark that my Quadro FX2500M outperforms my brother's XFX 8800GTS. Don't tell me that laptops are underpowered. My Voodoo consumes nearly 400 watts of power, and it is all put to good use. I know that my Voodoo will chew up and destroy anything that is thrown at it. My two Hitachi HDDs in RAID give me an average HDD speed of 98.7 MB/s, which is higher than most desktop hard drives.
The point is that directx 10 sucks and 9 is better.
My laptop kicks ass, but then again it is 17lbs.
Your reasoning makes sense for all but the D900T, K, and C Clevos.
Every other laptop is underpowered.
K-TRON -
-
I wish my laptop cost $4000, it was nearly double that.
Most of the money is spent not on the computer, but on buying the Voodoo name, and membership into their community.
K-TRON -
How would you like a mobile GPU that eats 150W? That alone could drain most laptop batteries in 30 minutes.
But of course you've developed so many GPU's that you know all about how it could be done, because otherwise you wouldn't be in a position to criticise NVidia or ATI.
Perhaps, if you want things to run smoothly, you should run them at lower resolution. Just like desktop users do, just like other laptop users do. If you want to run everything smoothly at native resolution, then you need a screen without too high a native resolution. Big surprise...
Apart from the fact that the GPU would run hotter, use more memory and be far more expensive, you're right, it wouldn't hurt at all. In the same way that jumping from the roof of a building doesn't hurt except for the landing bit.
All laptops have lousy cooling compared to a desktop. Cram a desktop card in there, even if we ignore battery life, and it'd more or less melt your laptop.
All laptops have very limited battery life (and limited power supplies that can't pump out 400W even on AC power).
They also don't have much physical room. Have you seen the size of an 8800GTX card? It's far too big to fit in a laptop.
But even though that shocking revelation is no doubt going to revolutionize the world, it doesn't mean that NVidia "could make a 8800 mobile if they wanted to". They still have to work within the limitations of notebooks. -
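The battery-drain estimate earlier in this post is simple arithmetic; here is a quick sketch, assuming a ~75Wh battery (an assumed high-capacity figure for the era; actual capacities vary by model):

```python
# Rough battery-life estimate: capacity (Wh) divided by draw (W).
# 75 Wh is an assumed laptop battery capacity; the 150 W GPU draw
# is the figure from the post above.
battery_wh = 75.0
gpu_draw_w = 150.0

runtime_minutes = battery_wh / gpu_draw_w * 60
print(f"Estimated runtime: {runtime_minutes:.0f} minutes")  # -> 30 minutes
```

And that ignores the CPU, screen, and drives, which would cut the runtime further. -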
It was a rant; you didn't need to go in there and refute every point. Last time I checked, a rant didn't need to be accurate; heck, it doesn't even need to be coherent.
-
Jalf, don't be a prick. I don't need your smartass rebuttals polluting this thread.
Yes, if you crammed an 8800GTX into a laptop it would melt, hog the battery, etc. I don't give a ****. It still sucks. -
Actually, Jalf had every right to do that. Always expect a rebuttal when you rant. It always happens.
-
Nice rant, I'll agree, but truth sucks, and Jalf just dumped a pile of crap on our dreams for a better graphics solution.
If nVidia had the competition, I can bet you they'd be tossing out better and better solutions every week like they do for desktops, each one being 10MHz faster than the last. Typical.
Laptops are coming closer and closer to being able to swap out cards, but it's still a far off dream.
If you want great native-resolution pleasure, get a frigging 1200x800 res screen; like it freaking matters if you're at one res or another in a game, it all looks the same at any native resolution.
Bottom line - enjoy the damn computer and the gameplay not the stupid eye candy, it's just a game...
P.S where the hell did you get the Crysis beta? been waiting for the demo but of course, delayed. -
This is not the place or time to argue larger issues of which we might fall on different sides of. But let it be clear I argued for competition and you argued profits. Go figure! -
It's still a beta, dammit.
If your native resolution is 1680x1050 or 1920x1200, you sure won't max it. Even Oblivion, which is almost 2 years old, needs an 8800GTX to be maxed at 1920x1200.
The 7800GTX has a 256-bit bus, but the 256MB version's memory is only at 1GHz, so its memory bandwidth is the same as the 8600GTS's.
The 7800GTX 256MB has more bandwidth than the 8600GT, but the 8600GT handles AA & resolution with less of a drop than the 7800 due to increased shaders.
If mid-range GPUs could max games, why would people buy $500 high-end GPUs?
And yes, the 8700M GT isn't high end; it's even a bit slower than the desktop 8600GT -
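The bandwidth comparison in the post above follows from bus width times effective memory clock. A minimal sketch, using the 1GHz figure quoted above for the 7800GTX 256MB and commonly cited effective clocks for the 8600 cards (treat the exact clocks as assumptions):

```python
def mem_bandwidth_gb_s(bus_width_bits, effective_clock_ghz):
    """Peak memory bandwidth in GB/s: bytes per transfer * transfers per second."""
    return bus_width_bits / 8 * effective_clock_ghz

# 7800GTX 256MB: 256-bit bus at an effective 1.0 GHz (figure quoted above)
# 8600GTS:       128-bit bus at an effective 2.0 GHz (commonly cited spec)
# 8600GT:        128-bit bus at an effective 1.4 GHz (commonly cited spec)
print(mem_bandwidth_gb_s(256, 1.0))  # -> 32.0 GB/s
print(mem_bandwidth_gb_s(128, 2.0))  # -> 32.0 GB/s (same as the 8600GTS, as the post says)
print(mem_bandwidth_gb_s(128, 1.4))  # -> 22.4 GB/s (less than the 8600GT, as the post says)
```

A wider bus at a lower clock can match a narrower bus at a higher clock, which is why the two 32 GB/s cards come out even. -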
I suppose you could take a desktop, turn it on its side, weld a monitor to it, and power it with a marine battery. Oh yeah, and put a handle on it and call it a laptop! Then you could have dual 8800 Ultras.
-
Nice rant though, Jalf. Very offensive (not a shot at you, I liked it).
-
-
Just installed Company of Heroes today, which I figured, being a year-old game, would run... y'know... nicely enough. With max settings I got a lovely 9fps average. Sweet. -
The Forerunner Notebook Virtuoso
It's a well-known fact that the Company of Heroes patch is shoddy. It's a patch, ffs; how can it compare to a ground-up DX10 game? It doesn't even add much in terms of visual aesthetics (again, it's a patch). Now look at the new Company of Heroes: Opposing Fronts, and the DX10 optimization is clear.
-
1920x1200.... 1680x1050 should give you like 2x that speed....
You do know that 1920x1200 is bigger than 1080p, don't you?
And if you're on DX10, that's normal; DX10 CoH is just a perf killer. -
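The resolutions these posts compare are easy to put side by side as raw pixel counts (pure arithmetic; note that 1680x1050 is only about 1.3x fewer pixels than 1920x1200, so the "2x that speed" guess above is optimistic):

```python
# Raw pixel counts for the resolutions discussed above.
resolutions = {
    "1920x1200 (WUXGA)": 1920 * 1200,
    "1680x1050 (WSXGA+)": 1680 * 1050,
    "1920x1080 (1080p)": 1920 * 1080,
}
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels")

# 1920x1200 vs 1680x1050: ~1.31x more pixels to render
# 1920x1200 vs 1080p:     ~1.11x more pixels (so yes, "bigger than 1080p")
```

Frame rate doesn't scale perfectly with pixel count, but it gives a rough upper bound on what dropping a resolution step can buy you. -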
Just wait for the 9 series; it will be out for sure before next year, if not next month.
And all they need to do is throw a 256-bit bus in an 8700GT and it would be better than a 7950GTX. Just be patient, I'm sure they will come out with something much better and very powerful. They always do -
The main thing is that it's not Nvidia; it's developers that need to harness DX10 tech.
-
-
Just for everyone's information: the 9 series cards aren't really a new generation of cards. They are just a die shrink of the 8 series and initially will only be midrange models, nothing to better the 8800GTX or Ultra for another year or so.
http://www.penstarsys.com/#gfx_shuffle -
I do know that 1920x1200 is better than 1080p. I actually got that resolution specifically so it's capable of playing high-def content. -
These threads make me dream of a laptop with yotta-everything in its specs that consumes 1W of power. *sigh*
-
-
The HDX and XPS m2110 are the best examples. Although they use mobile components, they weigh a lot!! -
Then I'll come back and play Crysis and say, "God. The graphics in this game are ancient."
Get it together NVidia (DX10 rant)
Discussion in 'Hardware Components and Aftermarket Upgrades' started by RaiseR RoofeR, Oct 6, 2007.