Actually, hasn't it been confirmed that the 9500M in the XPS 13 is just a 9200M IGP and a 9400M IGP working in GeForce Boost mode?
-
Notebookcheck is not reliable. Try searching the forums. It is pretty much impossible for the laptop version of the 9600M GT (the Quadro FX 770M) to beat the desktop 9800 GTX, since the 9600M GT is based on the desktop 9500 GT core. Even if the test is OpenGL-based, the 9800 GTX has many more shaders and much higher memory bandwidth.
-
Just to clarify, the DDR2 9600 is very much equal to the DDR3 3670. Regarding the XPS 16 review: I can confirm the HDX18's score of 4127 was at 1280x1024. The XPS scored 4855, presumably at 1280x800 (since it was said every score was at 1280x800).
-
Notebookcheck is just a collector of information from other websites, including this one, so the reliability depends on the sources. In this particular case their info is correct; you just don't understand. It doesn't matter how many shaders, how much bandwidth, or how much power the 9800 GTX has: Nvidia kills the OpenGL and 3D-app performance of GeForce cards at the driver level. The GeForce card will also produce rendering errors.
I don't want to start a Quadro vs GeForce discussion because there are already millions around the internet; just search and you will find out. Let's assume I'm wrong, then ask yourself why people would pay $600 for a midrange Quadro if they could get an ultra-high-end gaming card for the same price.
I just wanted to give this guy some advice. If 3D apps are all he wants to run, he's better off spending the money on an older midrange Quadro or FireGL, because those are already better than ANY GeForce for that purpose. -
Red_Dragon
Yes, so the NBR review says; I wonder if other reviewers are having this problem? -
Hm, it would be rather interesting to know what the specs are.
-
No, the GeForce 9500M GS is a slightly improved GeForce 8600M GT DDR2. The GeForce 9200M IGP falls somewhere under the GeForce 8400M G, which was rebranded as the GeForce 9300M GS; unless it's a Dell trick to claim that a GeForce 9200M + 9400M equals a GeForce 9500M in performance, which would be false.
-
Quick question: which of these performs better,
the ATI HD 3450 or an Nvidia 9400M?
I'm eyeing up a new system and currently thinking of either a Dell Studio 15 or 17 or an Apple MacBook, so I'm wondering which GPU is better of the two. -
Charles P. Jefferies (Lead Moderator)
The HD 3200 integrated card is more or less equal in performance to the HD 3450. The 9400M isn't much different performance-wise. As long as the laptop has one of those cards, I'd decide based on other features. -
Any news on new Nvidia cards, or nothing released yet?
-
Well, I got my netbook in (not really a netbook, but it's small enough, and way smaller and lighter than my XPS...) and it has an X4500MHD in it. I always saw this card getting trounced around the forums for being bad, but I figured, hey, I am only going to install WoW, my word processor for school, Firefox, and Winamp, and I have 80-something gigs left over... what the hell, why not try out Steam, right?
Well, WoW wouldn't install (ended up being something I was doing, not the GPU), and I saw someone here say that Half-Life 2 was barely playable at low. I downloaded it from Steam and chuckled to myself because I figured I'd end up running back to my XPS in tears, but when I hit the Video tab, the game preset everything to high. Very odd, I thought, but I went ahead and played at the settings it recommended. To my surprise, it played, and well! It was fluid, and I didn't download FRAPS, but it appeared to be at least 30fps because it wasn't feeling choppy or laggy. I even contemplated a video for the non-believers, but I don't feel I have to prove it to anyone, so I canned the idea.
Then I tried Left 4 Dead, a game I was SURE wouldn't run. But again, to my surprise, it ran, and pretty well. All graphics set as low as possible, and running in windowed mode, it actually stayed playable. Enough so that I completed the entire Dead Air campaign, through the hordes and everything! The worst was when there was a horde and people were throwing Molotovs; it did have some stuttering, but it was still playable all the way through.
I still wouldn't recommend this card to anyone serious about gaming; it's absolutely no fun worrying about being able to play the next big title at low settings. But I was surprised at how well the card managed to do! -
It's definitely not the worst mobile graphics out there. It stomps both the old X3100 and Nvidia's 7150 Go with ease.
-
Yes, it definitely is not terrible, but are you sure Left 4 Dead is playable? I'm running it on my desktop with a Q6600 processor and I can't even run the game consistently.
Half-Life and CS are possible, though, at lower resolutions. -
Templesa - thanks for the info on the X4500MHD - I just ordered a Toshiba U405 with (I think) the same graphics in it. Would you mind telling what system you have? Thanks...
-
P8400 2.26GHz C2D with 3MB L2 cache and 1066MHz FSB, 4GB of DDR2-800MHz G.Skill RAM, X4500MHD, 120GB 5400RPM HDD. This is all in an MSI 1223 (whitebook). The screen is 12" at 1280x800 resolution. Anything else you'd like to know, I'd be glad to share!
And yes, I am sure it was playing fine, because I am usually pretty picky about the way things run. Remember, though, that this was windowed at 640x320 and the lowest graphical settings. It looks nothing like it would on your quad-core, since there were many unsmoothed jaggies and some detail missing, but it did run well. Putting it into fullscreen mode made it hiccup like crazy. I am even going to try turning up a few of the options and play around to see if I can squeeze a tiny bit more out of it. -
His quad core could be playing it on integrated graphics with a Dell desktop, so that doesn't mean much.
-
Hehe, in a way I have no idea why this was moved. It wasn't a question at all.
-
Anyone have one and know how well it performs? Is it a dedicated card? Could it play Counter-Strike: Source? I'm looking to buy the one that's in an ASUS N10J-A1. Thanks.
-
Yep, it can play Source, but don't expect it to do well with newly released games.
-
Hey,
I didn't see the ATI Mobility Radeon HD 3670 on the chart; where does that compare? -
Charles P. Jefferies (Lead Moderator)
It is slightly better than the HD 3650. The HD 3670 is a decent card, and it can play all modern games at medium resolution and mostly high settings. -
Ok, thanks!
-
9600M GT vs HD 3670: which one is better in gameplay?
-
Against the DDR2 9600 it's a tie; the DDR3 9600 is better. Wait for the 4670.
-
Just how big an improvement would a Quadro FX 2700M be over a Mobility Radeon 3650? I'm not talking about trying to jack up the framerate in Crysis. I'm talking about maximizing the graphics quality settings in games like Fallout 3 and Diablo III at WSXGA+.
-
The Quadro FX 2700M is based on the GeForce 9700M GTS, so it's a lot better than the HD 3650. I'd expect it to be at least 80% faster.
-
Seriously? 80%?
-
No, no, no, not 80%. :facepalm:
Up to 50% at a higher resolution, but less at a lower one (talking about playable framerates here).
You won't be maxing Fallout 3 out at WSXGA+, and Diablo 3 will depend a lot on the processor too, so I doubt that will max out either. -
I have been looking at lots of laptops with each of these and was just wondering which card is better. I have seen that the HD 2600 has 512MB dedicated, while I haven't seen whether the HD 3200 does or not.
-
The HD 3200 is an IGP, while the HD 2600 is a dedicated GPU. The HD 2600 scores almost double what the 3200 does in 3DMark06, making it a much better alternative for casual gaming.
-
Ok thank you very much.
-
Crysis CPU Benchmark - Low 1024x768: 43.33 (DDR2) vs 74.34 = 71.6% faster
Crysis CPU Benchmark - High 1024x768: 10.84 (DDR2) vs 17.82 = 64.4% faster
Doom 3 - Ultra 1024x768: 93.1 (DDR3) vs 122 = 31% faster
Depends on what VRAM the HD 3650 has. (The percentages are just the ratio of the two framerates; see the sketch below.)
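A quick way to check those figures yourself; nothing here is assumed beyond the FPS numbers quoted above:

```python
# Relative advantage = (faster_fps / slower_fps - 1) * 100
benchmarks = [
    ("Crysis CPU Bench, Low 1024x768 (DDR2)",  43.33, 74.34),
    ("Crysis CPU Bench, High 1024x768 (DDR2)", 10.84, 17.82),
    ("Doom 3, Ultra 1024x768 (DDR3)",          93.10, 122.0),
]
for name, slower, faster in benchmarks:
    advantage = (faster / slower - 1) * 100
    print(f"{name}: {slower} vs {faster} FPS -> +{advantage:.1f}%")
```
-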
I have an HD 2600 256MB, although I will be buying a new notebook later this year. I would say if you can't afford something with a newer GPU, the 2600 is good value for money.
I play mostly shooters; below are titles/settings:
COD4/5 - 1024/800, MED-HIGH
BioShock - 1024/800, HIGH
Gears of War - 1024/800, MED
Crysis/Warhead - 1024/800, MED
Dead Space - 1024/800, HIGH
FEAR 2 demo - 1024/768? 800?, HIGH -
I've seen a lot of people saying that the data on Notebookcheck can't be trusted. Is that because it's based on anecdotal user-submitted data or something? (I *am* curious what the CPU Benchmark has to do with the GPU comparison, though.)
-
I haven't found any gross inaccuracies in the GPU tables (though they can be influenced by the CPU used; I'm sure those tests weren't all run with a single CPU), but the CPU table is definitely based on benchmarks that either aren't done well or don't properly utilize the L2 cache, for one thing: you find lots of lower-end CPUs (a T3xxx over a T5xxx or T7xxx at the same MHz, etc.) beating a higher-end CPU at the same clock. On a test that doesn't use much cache that's entirely possible, but real applications that make use of the cache will obviously run faster on the newer CPU with the larger L2. In day-to-day usage, especially with less demanding apps, the gap between a 2GHz Pentium Dual-Core and a Core 2 Duo at the same clock will be virtually nil, but any kind of demanding app that uses the L2 will run much faster on the C2D. A rough sketch of the effect is below.
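Purely as an illustration of why cache-friendly workloads diverge (this is my own hypothetical micro-test, not anything Notebookcheck runs), the sketch times random reads over a working set that fits in a typical 3MB L2 versus one that spills to main memory. Python's per-iteration overhead blunts the effect, so treat the numbers as a rough demonstration; a C pointer-chasing loop would show the cliff far more sharply:

```python
import array
import random
import time

def ns_per_access(working_set_bytes, n_ops=2_000_000):
    """Average time of random reads over a working set of the given size."""
    n = working_set_bytes // 8              # 8-byte signed ints
    data = array.array('q', range(n))
    idx = [random.randrange(n) for _ in range(n_ops)]
    t0 = time.perf_counter()
    total = 0
    for i in idx:
        total += data[i]                    # the actual memory access
    return (time.perf_counter() - t0) / n_ops * 1e9

# 1MB fits comfortably in a 3MB L2; 64MB forces main-memory accesses.
for size in (1 << 20, 64 << 20):
    print(f"{size >> 20:>3}MB working set: {ns_per_access(size):.0f} ns/access")
```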
-
The HD 2600 is a way better choice.
The Radeon HD 3200 is in the range of the HD 2400, which puts it in the low-range GPU class. -
The data is somewhat decent; it just should be taken with a grain of salt. As you said, sometimes they base their information on spec-sheet numbers, and their tests are also sometimes influenced by the CPU (e.g. 3DMark scores and some games are CPU-influenced). They also make no distinction between the DDR2 and GDDR3 versions of the same GPU, taking the average of both in their charts.
You can use Notebookcheck, but you have to know how. To do it properly one has to:
- check the actual laptops Notebookcheck used in their benchmarking (they usually state them under the games or somewhere)
- check the versions of the GPUs/CPUs used
- check what tests are being done and take that into account
The table in general isn't that badly made. It's a good rough outline (as in, the general ordering from good to bad is correct), but I wouldn't go into detail and say "oh, this one is ranked 4 spots higher on Notebookcheck, so it must be loads better". Notebookcheck is mostly useful for checking a GPU's specs, which are generally accurate (e.g. core clocks and shader counts). THOSE specs are what you use to compare GPUs against each other, as the toy example below shows.
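For instance, here's a purely hypothetical spec-sheet comparison; the shader counts and clocks are made up for illustration, and raw shader throughput is only a rough proxy, since architecture, drivers, and memory bandwidth all matter too:

```python
# Rough on-paper throughput: assume 2 FLOPs (a MADD) per shader per clock.
def shader_gflops(shader_count, shader_clock_mhz):
    return 2 * shader_count * shader_clock_mhz / 1000.0

# Made-up example parts, purely for illustration:
card_a = shader_gflops(32, 1250)   # a hypothetical 32-shader part at 1250MHz
card_b = shader_gflops(16, 950)    # a hypothetical 16-shader part at 950MHz
print(f"Card A: {card_a:.0f} GFLOPS, Card B: {card_b:.0f} GFLOPS")
print(f"Card A is {card_a / card_b:.2f}x Card B on paper")
```
-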
Got an M1530 with an 8600M GT DDR3. While playing Crysis Warhead the GPU temp reached 84C and the CPU temp got to 80C. Every 2 minutes the friggin' game drops to like 3-4 fps for about 40 seconds before going back to a smooth 25-30 fps...
Is it supposed to be doing this? Shouldn't a laptop GPU be able to push 95C without it being too dangerous? I'm currently using the 179.28 official beta drivers from Nvidia.
Part of it is that the desk in my dorm room is pretty crowded; the desk at my house is not, and I've noticed lower idle/gaming temps there. Would something like a USB cooling pad help a lot with the random downclocks? -
Question 1: How large a difference is there between the gaming/overall performance of each card? Is one significantly better than the other?
Question 2: Which card will play games such as Crysis, GRID, and Dead Space better, and how much better will it play them?
Question 3: Which card do you recommend overall? Does one have advantages over the other?
Question 4: How big would the performance difference be if the Nvidia laptop were running a P9500 (2.53GHz) instead of a P8700 (2.53GHz)? Would the answers change for the above questions?
Thanks
*Btw, I know there are similar posts to this one, but I want to be 100% sure.
*Both cards have 256MB VRAM.
*The Nvidia will run in a laptop with 3GB DDR3 RAM, whereas the ATI will run in a laptop with 4GB DDR2 RAM.
*The Nvidia will run with a P8700 2.53GHz, whereas the ATI will run with a T9600 2.80GHz.
*Both laptops have 7200RPM drives. -
Elevate the back of the laptop and make sure the fan has plenty of breathing space. A laptop cooler would probably help a bit, and so would undervolting the CPU.
Cooling central -
*Bump*
I'm considering overclocking my 9200M GS a bit to get a tiny bit more power out of it,
but where's the limit temperature-wise?
-
I've tried Dead Space on a 9300M GS with a DC3200 and 2GB of RAM...
1024x600 med/high. It can run at 1280x800, which is the native resolution for that 15.4" Acer model, but it's a little laggy.
I tried Left 4 Dead: 1024x600 med, around 30fps. NFS Undercover low/med at 1024x600 runs well. Test Drive Unlimited: 1024x600 with a desirable framerate.
Say goodbye to Crysis and maybe GRID, because these graphics cards only have a 64-bit memory bus...
FIFA 09 runs like crap on high.
Q2: I would recommend you go with Nvidia in this case; as far as I know the 3470 is a very low-end card.
http://www.notebookcheck.net/Mobile-Graphics-Cards-Benchmark-List.844.0.html
Q1: Overall difference for everyday work and stuff like that? None, I think.
Q4: Well, the P9500 has 6MB of L2 cache and the P8700 only 3MB, so figure it out yourself.
-
I believe the 3470 is actually better than the 9300M GS, but not significantly so. Notebookcheck can be a little unreliable. Also, the processor makes little difference in most games.
Video memory size makes little difference, as both are 64-bit; even 128MB will suffice. What matters is whether the GPU memory is DDR2 or GDDR3 (quick bandwidth math below). Laptop system memory (DDR2/3 SO-DIMM) makes little impact on performance; 3GB and 4GB are both good. From your updated specs, ATI should win by a decent margin.
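To see why the memory type matters so much on a narrow bus: peak bandwidth is just the bus width in bytes times the effective transfer rate. A minimal sketch; the 800 and 1600 MT/s figures are assumed, typical-for-the-era rates, not quoted specs for these particular cards:

```python
def bandwidth_gb_s(bus_width_bits, effective_mtps):
    """Peak bandwidth = bus width in bytes * effective transfers per second."""
    return (bus_width_bits / 8) * effective_mtps / 1000  # GB/s

# Assumed effective rates for typical low-end mobile parts:
for label, mtps in (("DDR2 @ 800 MT/s", 800), ("GDDR3 @ 1600 MT/s", 1600)):
    print(f"64-bit {label}: {bandwidth_gb_s(64, mtps):.1f} GB/s")
```
-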
Below 90C is good; the absolute limit is 100C.
-
I doubt it, but OK...
The processor makes a huge impact on performance in RTS games...
But again, Nvidia has better driver support, which is crucial for me, for example. -
Hmm, considering it's barely tickling 70C at most, it seems I can give her some gas then.
How far do you reckon a 9200M GS can safely go, though?
-
Well, the engineering max temp a video card can take is 110-120C, but as long as it's below 90C, it's good.
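If you want to see where an overclock actually lands instead of guessing, something like this logs the core temperature once a second while a stress test runs. A hedged sketch: it assumes a system where Nvidia's `nvidia-smi` tool is available, which an older mobile GPU and driver may not offer; GPU-Z or HWMonitor do the same job by hand:

```python
import subprocess
import time

# Poll the GPU core temperature once a second so you can watch for the
# point where an overclock starts pushing past the ~90C comfort zone.
while True:
    temp = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    ).strip()
    print(f"{time.strftime('%H:%M:%S')}  GPU core: {temp} C")
    time.sleep(1)
```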
-
Does anyone know how good this card is at CAD applications like SolidWorks?
-
The 9200M GS is a good overclocker.
Are you serious when you say the vent singes your hair???
Check the link in my sig for a good (not max) OC and the benefits, but remember that two 9200M GS cards can have different limits. And 'the danger of overclocking' (oooh, scary xD).