which is better? Chaz's post about mobile graphics cards does not include the ATI one unless I missed it somehow, so I was wondering if it's better than the 8600 GT. Thanks for any replies.
-
masterchef341 The guy from The Notebook
the ati card hasn't really been released yet. i don't think people have them...
xt = gt, not xt = gs
vram is inconsequential. -
the 2600 is probably the better buy, as it's 65nm, so it runs cooler
-
From what I've heard, the ATI cards will be quite a bit better than the Nvidia cards.
For example, ATI's HD 2600 might be equal to the Go 8700.
That's just the word on the street. Once again, it has not been released yet; we will have to wait and see. -
masterchef341 The guy from The Notebook
it never works out that way... ever... except for the horrible nvidia 5 series. don't put faith in 3dmark scores.
-
IF it is better, then one's going straight into my C90
-
-
No, no joke, that is what I heard from my friend's dad, a manager at Dell who knows all about these cards. I was at his house for dinner one night, the topic came up for some reason, and that is what he thought. Hey, people said that the X1400 in no way could be better than the Go 7400, but look what happened.
-
Uh, no one knows (at least no one on this forum; OK, probably someone from ATI knows... I hope) how this card will perform, so you will see nothing but speculation in this thread...
At a wild guess, the HD 2600 may happen to be just as good as the 8600M GT, or in other words they might be close competitors... and I cannot back that up.
Guess we will find out soon. -
Hopefully it will perform better than its desktop counterpart.
-
The most recent drivers have improved the desktop 2600XT a lot; 2600XT crossfire is now almost on par with the 8800GTS 320MB, and cheaper.
-
Sorry to tell you that if you go with two of the cheaper HD 2600s, you'll get nothing but tears of grief with your performance. A single 2600 is just on par with its 8600 counterparts, newer drivers or not. I don't see the mobile version giving you 8600M GT-crushing performance.
The 2600 does run cooler, though it's not the best energy user in the world. -
masterchef341 The guy from The Notebook
so two crossfire 2600 xt cards ALMOST equal the low end of the high-end 8800 gts line...
and this means what? -
-
-
8600m GT is better. That ati card was actually meant to compete with the nvidia 8600m GS.
-
Nobody knows anything as of yet; wait for an 8510p review to come out.
-
http://www.canadacomputers.com/index.php?do=ShowProduct&cmd=pd&pid=014760&cid=999.243.272
vs
http://www.canadacomputers.com/index.php?do=ShowProduct&cmd=pd&pid=014486&cid=999.243.390
You save approx. $10, like I said. But remember, with the Nvidia you get a wider memory interface and can always go SLI later for extreme performance, once it has much more software support and the card gets cheaper. -
nobody knows? i thought people with bodies worked at AMD/ATI? if so, then they should know
I doubt ATI would release a card that wasn't on par with Nvidia's mid-range offerings, so expect a 2600 variant to be just as powerful as the 8600 GT, or even the 8700 GT. -
masterchef341 The guy from The Notebook
you can get that same card (different brand perhaps) for $280 on newegg.com
it's all speculation, but my speculation is going to be accurate:
check it out...
mobility hd 2600 = 8600m gs
mobility hd 2600 xt = 8600m gt -
-
masterchef341 The guy from The Notebook
unfortunate.
-
The 2600XT is somewhere in between an 8600 GT and a GTS... that's from the desktop counterpart. -
-
Hope I'm not changing the subject, but what about the ATI X1400 vs. the GeForce 7300? I'm going to buy a Dell Inspiron (C2D T7200, 2GB RAM) with either the X1400 or the GeForce 7300 (which is just a little more expensive). The salesmen say there's no difference in gaming, and that the GeForce is only better for professional graphics work. Is that true?
-
The ATI is going to pwn the Nvidia, but still, they are both low end... go for the ATI... btw, salesmen lie to get sales; never believe what they say...
-
Thanks... I'm not a gamer, I just want some good games for a change, like FIFA 2007, The Godfather, F.E.A.R., or Half-Life 2... will they work fine with this X1400?
-
Hell, I can play all those games on my old desktop at high settings with ~30 fps. And that only has an AMD 4200+, a 500GB RAID array, and an ATI X800 GTO along with the X-Fi Platinum. (But the GFX and processor are massively overclocked thanks to my water cooling.)
-
For those who have said the 2600XT mobile card isn't out - that is not technically true. It is out, but just not in the lighter weight notebooks (yet). It is out in the HP Pavilion HDX notebook (details and reviews below).
HP Pavilion HDX reviews:
http://laptopmag.com/Review/HP-Pavilion-HDX-Entertainment-Notebook-PC.htm
http://www.notebookreview.com/default.asp?newsID=3850
http://www.xtremesystems.org/forums/archive/index.php/t-143788.html
http://www.pcworld.com/product/specs/id,30143-c,notebooks/specs.html
http://www.pcworld.com/article/id,135061-c,notebooks/article.html
http://reviews.cnet.com/laptops/hp-pavilion-hdx/4505-3121_7-32442902.html (the Cnet review should be taken with a grain of salt. Among its many faults they make comparisons to systems they shouldn't - desktop etc)
Like the mobile 8x00 cards, it only has a 128-bit memory bus. Testing that has been done on the HDX system shows that it gets higher scores than the 8600 in 3DMark tests (which, as many people like to note, don't necessarily mean much). I've included the numbers and links below (including FPS tests done in FEAR, Doom 3, and Far Cry).
9,033 in 05 (notebook review)
4,205 in 06 (notebook review)
12,240 in 03 (notebookcheck)
9,030 in 05 (notebookcheck)
4,002 in 06 (notebookcheck)
11,984 in 03 (laptopmag)
4,241 in 06 (laptopmag)
12,240 in 03 (extremesystems)
4,002 in 06 (extremesystems)
72 fps in FEAR autodetect (laptopmag)
25 fps in FEAR max settings (laptopmag) (they blamed the drivers for this score).
31 fps in FEAR autodetect (extremesystems)
26 fps in FEAR max settings (extremesystems)
109.93 in Doom 3, 1024 by 768, 32-Bit (PCWorld)
131.54 in Far Cry, 1024 by 768, 32-Bit (PCWorld)
(see the reviews listed above and the link below for details)
http://www.notebookcheck.net/Mobile-Graphics-Cards-Benchmark-List.844.0.html
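To get a rough consensus out of the scattered scores above, you can just average the repeated runs per benchmark. A quick sketch (the numbers are copied from the reviews cited above; nothing else is assumed):

```python
from statistics import mean

# 3DMark scores for the HD 2600 XT in the HP Pavilion HDX,
# as reported by the reviews linked above (duplicates kept as reported)
scores = {
    "3DMark03": [12240, 11984, 12240],
    "3DMark05": [9033, 9030],
    "3DMark06": [4205, 4002, 4241, 4002],
}

for bench, runs in scores.items():
    print(f"{bench}: mean {mean(runs):.0f} over {len(runs)} runs "
          f"(spread {max(runs) - min(runs)})")
```

That puts 3DMark06 at roughly 4100 on average, with a spread of a couple of hundred points between reviews, so single scores should be read loosely.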
The review above stated that the reviewed unit had 256MB, but HP's website shows a 512MB card in the system:
http://www.shopping.hp.com/webapp/shopping/popups/seriesMoreInfo.jsp?category=Graphics Card
http://www.shopping.hp.com/webapp/s...ame=HDX_series&a1=Usage&v1=Extreme+Multimedia
The card in the HP is also apparently GDDR3, as opposed to the DDR2 in most of the 8600 cards floating around out there these days (excluding the ones in the Asus G1 and Mac Power2 systems).
(With updated drivers and more memory - tests on the current HP system might yield better results than those tested above, but we'll have to wait for reviews of the 512MB card to confirm).
The information above should provide a rough idea of how a 2600 might perform in a 15.4" or 17" notebook if/when they come out. -
Yea... new drivers might help, since the HD 2600XT has problems when AA & AF are enabled.
-
ShadowoftheSun Notebook Consultant
No, the reason the HD 2xxx series sucks with AA enabled is because, at the last minute, ATI cut out the hardware portion of the AA implementation with no explanation. All AA is therefore handled across the shader processors instead of the AA resolve hardware that has been present in all major modern graphics cards. Across the board, ALL HD 2xxx series cards lose to their competition with AA enabled, from the lowest end cards to the 2900XT 1GB GDDR4 monster.
More on topic, I think the 2600XT will be as good as the 8600M GT, maybe a bit better, but not substantially. The 2600XT is only slightly faster than the desktop 8600 GT (slower than the 8600 GTS). At launch the 2600XT was slower than the 7600 GT of last generation; luckily, with driver updates, the 2600 has taken a more reasonable performance spot. I think it's unreasonable to expect the 2600 mobile to retain the performance of the 2600 desktop. Tests indicate that the 2600 XT draws ~30 more watts than the 8600 GT (which the mobile 8600 is based on) at idle, and ~15 more at load. Considering the 8600 was cut down substantially in terms of clocks to make it fit into the 15.4" MXM-II form factor and power draw requirements, I think we'll see one of two things:
1) 2600XT retains most of the desktop version's power, but is too power hungry to fit into 15.4 inch notebooks so is relegated to 17" notebooks and above, or
2) 2600XT is cut down significantly to fit into the 15.4" form factor, sacrificing its performance edge and giving us a part with 8600M GT level performance.
EDIT: for those of you who are wondering about the lack of AA hardware, read about it here. -
mmm, but regardless of desktop power specifications, the mobile HD 2600 XT does in fact draw less power than the 8600M.
and if some of those test results were actually done at WSXGA+ (1680x1050) as indicated, then 25 fps in F.E.A.R. at max is pretty good, imho. -
On-topic: performance will vary wildly from app to app. Since these parts are direct clones of their desktop counterparts simply binned for lower TDPs, having only mildly downwardly-adjusted clockspeeds, they should perform very closely to said desktop counterparts. -
ShadowoftheSun Notebook Consultant
That isn't to say there aren't benefits to performing AA on the shaders; as mentioned, it allows for programmable AA which future DX10 versions will improve upon. The "tenting" style of AA, as opposed to the traditional "box" type, does provide some benefits. The problem is that performance still suffers in games which use "traditional" AA.
Basically, the thesis of my post is this: whatever the justification for the lack of hardware-based AA resolve, it is what it is. It may be that future games will take advantage of this. However, based upon the only solid, reliable facts (which is to say, modern-day, available games), shader-based AA does not perform as well as traditional hardware-based methods.
As Anandtech states, -
ViciousXUSMC Master Viking NBR Reviewer
I have been thinking this card would beat the 8600GT since before I even got my C90S, because it has far more hardware power, and it's also built on a smaller process (65nm vs 80nm), so it should be able to run at the same performance levels with lower power draw and less heat dissipation.
That means it can be overclocked to better levels and still only be on par with the heat/power of the 8600GT.
So now it's just a matter of whether they got the drivers right for it, as the superscalar setup of the ATI cards is much harder to write drivers for than the Nvidia setup. -
ShadowoftheSun Notebook Consultant
That's true vicious; I doubt any of us are expecting this part to be a complete and utter waste of time. It should be a good part; however, whether or not the lower process will be enough to overcome power draw issues without sacrificing performance is debatable. I personally think that power draw will be reasonable. On the other hand, it is quite easy to see the other side of the equation, that being that the desktop 8600 GTS is based on a larger process than the 2600 and still manages to outperform while drawing less power.
-
Here's the first MR HD 2600 benchmark score:
5302 marks in 3DMark05
Source: Notebookcheck.pl
Seems to be fairly equal to (or slightly better than) the GF 8600M GT (with GDDR2 memory). -
masterchef341 The guy from The Notebook
i don't know... the 8600m gt (gddr3) gets over 8k in 3dmark05... that doesn't seem close to me.
why we are using 3dmark05 in the first place is beyond me. (and OH NO, not notebookcheck - i hadn't noticed the source; stay away from them)
did anyone notice that the dell e1705 was whomping on the hd 2600 xt in the cnet bench? oh wait, never mind, that does look like a FEAR driver issue.
still, in quake 4 the xt looks like it's going to par the gt, as i have said all along. -
The Forerunner Notebook Virtuoso
Notebookcheck is terrible, terrible, and oh yeah, terrible. Not a reliable source at all. Their benchmarking methods are horrendous, and I believe none of them have taken a statistics class in their life, because their method of compiling data is... uhhh... terrible.
-
they are good at giving you specs on each mobile GPU though, like how many pixel processing units, vertex units, stream processors, etc.
-
masterchef341 The guy from The Notebook
i disagree; the new gpus don't even have dedicated pixel processors or vertex processors. they just use stream processors that can do pixel, vertex, or geometry work. notebookcheck just splits the number of stream processors in half, says half are pixel and half are vertex, and pretends geometry doesn't exist.
that's not even close to correct.
plus - you can't even compare stream processors from nvidia and ati. they are implemented very differently. nvidia's stream processors are individually much more capable; ati's are much more numerous. it's not a 1:1 thing.
so even if the data was accurate, it wouldn't be useful. and it isn't even accurate!
stay away from notebookcheck. -
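A rough back-of-the-envelope illustration of why raw stream-processor counts mislead. The clock speeds and FLOPs-per-clock figures below are the commonly quoted desktop specs, taken as assumptions for the sake of the arithmetic:

```python
# Theoretical peak shader throughput, desktop parts (approximate published specs):
#   HD 2600 XT: 120 stream processors @ 0.8 GHz, 1 MADD (2 FLOPs) per SP per clock
#   8600 GT:    32 scalar SPs @ 1.19 GHz shader clock, MADD+MUL (3 FLOPs) per clock
def peak_gflops(num_sps: int, clock_ghz: float, flops_per_clock: int) -> float:
    return num_sps * clock_ghz * flops_per_clock

ati = peak_gflops(120, 0.8, 2)
nv = peak_gflops(32, 1.19, 3)

# On paper ATI leads by roughly 1.7x, yet real-game performance is about even:
# ATI's 5-wide units only hit peak when the compiler finds 5 independent ops
# per instruction, while Nvidia's scalar units stay busy far more of the time.
print(f"ATI {ati:.0f} GFLOPS vs Nvidia {nv:.0f} GFLOPS, ratio {ati / nv:.2f}")
```

Which is the point of the post above: the paper numbers are measuring different things, so "120 vs 32" tells you almost nothing by itself.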
Of course the reason for the change is programmability. Hence ATI's new CFAA method which allows developers to create their own completely programmable AA sampling patterns. Granted, R6xx loses more performance by applying AA than does G80, but again this decision was made for programmability, not as a performance-enhancement. -
-
masterchef341 The guy from The Notebook
the xt model scores close to the gt, and the non xt model scores close to the gs. is that what you are saying?
the cnet benchmarks are for the xt version. -
-
masterchef341 The guy from The Notebook
i know, but that means that at best, the hd 2600 xt is looking to par the 8600m gt, which performs just under the neck of the go 7900 gs. that's using the quake benchmark.
in FEAR, the hd 2600 xt is FAR behind the 7900 gs, while the 8600m gt is still sitting right on its neck.
the 8600m gt is not in the cnet benchmark, but that is how it benches.
that is all i was saying. all this hubbub about nanometers and 4x stream processors, and in the end the card is a close match for the 8600m gt in quake 4 and will most likely be a similar match in FEAR after driver fixes. -
As for the shader processor comparison, it would be interesting to revisit this topic in a few months time after drivers for both products have matured, and perhaps some more shader-intensive (or even true DX10) titles have been released. -
masterchef341 The guy from The Notebook
the truth of the matter is that nvidia and ati don't count shaders the same way.
people shouldn't be buying the 7900 gs anymore at this point. the only real contenders are the 8600m gt / hd 2600 xt or a 7950 gt / gtx.
i wouldn't call it a higher market segment if the performance is only a few percentage points higher in older games. as newer games and drivers come out, i expect performance to equalize, or that the 8600m gt / hd 2600 xt will surpass the 7900 gs.
the problem is that TDP cannot be directly inferred from manufacturing process size. it's a distant, general indicator.
things like ROP/RBE counts are also only distant, general indicators of performance.
the fact is that they are really close performers. -
The 9k 3DMark05 score is from JerryJ's preview of the Pavilion HDX
-
FEAR and Doom 3 will always run better on Nvidia's cards; the 8500GT beats the X1950XTX in Doom!!!!! (at 1024x768 and lower)
You also have to remember that most 8600M cards come with "/%"/$%"/$"/$ DDR2 memory, so if most manufacturers put GDDR3 in the 2600XT, it will take the lead.
ATI Mobility Radeon HD 2600 vs. Nvidia GeForce 8600M GT 512MB
Discussion in 'Gaming (Software and Graphics Cards)' started by AdamW, Aug 3, 2007.