It's all in the question. Yes, I have read the sticky at the top. The main difference, from what I understood, was that the 8700 only has a 128-bit memory bus while the 7950 has a 256-bit one, so will two 8700s make up for it?
I'd like it if someone could give their answer/opinion with some explanation for their decision (don't just write "7950GTX is better").
The Dell XPS M1730 will cost around $500 more (Australian Dollars), with all other specs almost the same.
This is what is advertised:
"Dual NVIDIA® GeForce® 8700M GT graphics with NVIDIA SLI Technology"
Is it better than
"512MB NVIDIA® GeForce® 7950 GTX DDR2"
EDIT: One more important question.
What benefits does AGEIA PhysX Mobile Technology bring with it, and is it worth an extra $250 (Aus dollars)?
I want a notebook that can handle really good games at max settings. Are there any other dealers in Australia that can provide even better laptops for $3200 or less?
-
The M1730 will definitely be better than the M1710 because of the more advanced 8700M GT, and in SLI mode at that.... Not only that, the 8700M GT will benefit more from future games than the 7950 GTX..... Two 128-bit GPUs in SLI mode will be faster than a single 256-bit GPU....
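For what it's worth, the bus-width argument can be sanity-checked with the usual theoretical bandwidth formula (bus width in bytes × effective memory clock). This is just a back-of-the-envelope sketch; the clock figures below are illustrative assumptions, not official specs for these cards:

```python
# Rough theoretical peak memory bandwidth: (bus_width_bits / 8) * effective_clock.
# The clock values used below are assumed for illustration only.

def bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Peak theoretical memory bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

# Hypothetical effective (DDR) memory clocks, for illustration:
one_8700m = bandwidth_gb_s(128, 1600)   # 128-bit bus
one_7950  = bandwidth_gb_s(256, 1200)   # 256-bit bus

print(f"8700M GT (single card): {one_8700m:.1f} GB/s")
print(f"7950 GTX:               {one_7950:.1f} GB/s")
# Note: SLI does not pool the two cards' bandwidth into one wide bus;
# each GPU still renders through its own 128-bit interface, so two
# 128-bit cards are not equivalent to one 256-bit card per frame.
```

The point of the sketch is only that per-GPU bandwidth depends on the bus width times the clock, so the SLI question turns on how well the game scales, not on the buses adding up.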
-
Are these 512MB 8700s or 256MB?
-
Dual 256MB NVIDIA® GeForce® 8700M GT
NVIDIA® SLI® Technology
AGEIA® PhysX™ Mobile Technology™ processing unit (<- what is the benefit of having this?)
The link for the laptop details is here:
http://www1.ap.dell.com/content/products/productdetails.aspx/xpsnb_m1730?c=au&cs=audhs1&l=en&s=dhs -
I saw that they made a dual 7950 GTX. So does that make the 7950 even better than dual 8700s?
-
I'm pretty sure that the dual 8700s have better shading than dual 7950s, so I personally would get a notebook with the dual 8700s with SLI compatibility.
-
Yes, the 8700 crushes the 7950 in shading power. I personally would choose dual 8700M-GT's over any other mobile setup right now, even though the 7950GTX SLI setup can be faster.
-
If you like playing FPS games with fancy explosions, what the physics processing unit does is make the explosions more realistic, with shrapnel simulated with mathematical accuracy. Supreme Commander could make use of this as well, since I've heard the game uses the CPU to calculate projectile arcs for the thousands of units it renders.
-
Charles P. Jefferies Lead Moderator Super Moderator
The PhysX cards in general are useless because basically no games use them. The games that do use them are marketing tools - gimmicks basically. Don't bother with them if you have to pay anything.
-
Unreal Engine 3 supports PhysX, so don't be surprised when Gears of War on PC supports the physics accelerator and gives us some awesome physics and explosions.
-
If you get a quad-core CPU, Unreal Engine 3 will take advantage of that for physics calculations using one of the cores... which is more cost-effective and efficient.
A dual-core CPU will still do physics calculations, just not as intense. -
The single 7950 can match and even exceed the dual 8700Ms.
-
No it cannot. Just looking at how different the 3DMark06 scores are should prove you wrong. I know that it's all about FPS in real games, but what you're saying is just plainly wrong.
-
And dexgo, if you have the chance I'd like to see what you get in CoD4 at 1600x1200 with everything on high, because I'd also very much like to prove your sig wrong. Not that it's a very big challenge.
-
HOW DID YOU GET 50% OFF!? TELL ME!!
I am thinking about getting a laptop with a 7950 GTX (M1710 or Rock/Clevo equivalent) or an M1730; which do you think would be the best option?
I have read that two 8700M GTs only effectively have 256MB of VRAM; is this true?
On notebookcheck.net it says that a single 7950 GTX is better than two 8700s, so I am leaning towards that.
Thanks! -
Tomshardware did a comparison a month and a half ago...
http://www.tomsguide.com/us/sibling-rivalry,review-1012-13.html
The dual 7950GTXs can indeed outperform the dual 8700s in certain games... but anything shader-intense like Oblivion was beaten pretty soundly by the dual 8700s.
FEAR was an oddity... if you want to play FEAR specifically, you want the dual 7950GTXs, as for some reason the dual 8700s did not like it.
(The Dell plays it at barely 30+ fps at 1920x1200.)
From those benchmarks, it's pretty easy to tell that Dexgo is just flat out wrong. SLI 7950GTXs are beaten by the SLI 8700s even at 1600x1200 in Oblivion. If SLI 7950GTXs are beaten, a single 7950GTX will be destroyed. -
Well, in non-shader-intensive games the 7950GTX beats the 8700GT, and at high resolutions too, where the 8700GT fails badly. Get an 8800M GTX while you're at it; the 8700GT is no gaming card, it's a mid-range card along with the 8600GT. Finally we have some real DX10 notebook gaming cards that can run DX10 in all its full glory. No use getting the 8700GT now when you have this killer card out.
-
Yes, it totally can, buddy.
I am serious.
My single 7900GTX can beat the 2x256MB at any resolution over 1600x xxx.
If* you can even get it working in SLI, it will still beat it at high res.
3DMark is BS! Who the hell plays at 1280x1024 on gaming notebooks? REALLY!?
Run 3DMark @ 1900x1200, even with both your cards, and see what you get.
I get 30fps average in Crysis @ medium shaders and shadows, all other settings on high, at 1600x1050.
The dual 8700s are crap at high-res gaming.
2007-12-10 21:40:36 - Crysis
Frames: 1079 - Time: 39177ms - Avg: 27.542 - Min: 17 - Max: 38
-1900x1200 medium shaders medium shadows all other settings to HIGH.
2007-12-11 23:10:47 - Crysis
Frames: 1828 - Time: 60000ms - Avg: 30.467 - Min: 14 - Max: 47
2007-12-11 23:12:00 - Crysis
Frames: 818 - Time: 27293ms - Avg: 29.971 - Min: 22 - Max: 51
2007-12-11 23:12:41 - Crysis
Frames: 989 - Time: 33557ms - Avg: 29.472 - Min: 22 - Max: 52
I ran it @ 1680x1050, all settings on High except Shaders and Shadows tweaked to medium -
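As an aside for anyone reading these FRAPS logs: the "Avg" figure is just total frames divided by elapsed time, so the numbers above can be double-checked directly. A quick sketch using the logged values from this post:

```python
# Recompute FRAPS average FPS from the Frames and Time values in the logs above.

def fraps_avg(frames: int, time_ms: int) -> float:
    """Average FPS as FRAPS reports it: total frames / elapsed seconds."""
    return frames / (time_ms / 1000)

# (frames, time in ms) pairs taken from the posted Crysis runs:
runs = [
    (1079, 39177),  # logged Avg: 27.542
    (1828, 60000),  # logged Avg: 30.467
    (818, 27293),   # logged Avg: 29.971
    (989, 33557),   # logged Avg: 29.472
]
for frames, time_ms in runs:
    print(f"{frames} frames / {time_ms} ms -> {fraps_avg(frames, time_ms):.3f} fps")
```

The recomputed averages match the logged ones, so at least the logs are internally consistent; the Min/Max columns are what the averages can't tell you.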
A 7900GTX OC'd out of its pants, since my 7950GTX CANNOT play Crysis at those specs. I've finished the game and there are levels where I actually had to go to all LOW and 1280 to get decent framerates.
But I don't OC my GPU, never will. -
Yeah, dexgo, I'm assuming your GPU is OC'd. If so, compare it to similarly OC'd dual 8700M GTs and yours will still lose.
-
What the hell do you mean "still lose"? It can't beat me at high res to begin with; that is what I have been trying to tell you people.
128-bit bus. It just can't hack it at high res.
Yes, my card is OC'd.
I love the M1730 and will most definitely get one.
But the 8700s are not that great at all.
2x 7950s or 7900s would kick the crap out of my card; that's 2x512 too,
not 2x256 cards like the 128-bit-bus 8700s.
There is a FRAPS thread over at the notebookforums Dell forum, Odin.
There is a guy in the thread who posted his OC'd results in the FRAPS thread.
It doesn't beat my card in FRAPS vs Crysis.
lol, you crack me up.
STILL lose?? You say??
I beat the M1730 as it stands now, and OC'd.
What's funny is,
there can be so much proof that a single card is better than the 8700 series, SLI or otherwise,
and as soon as you say or even prove it, people just defend it. Even with proof.
Oh, and Odin,
look at Chaz's FRAPS for reference.
I will post my 1440 benchmarks. -
Here is the guy who just posted his Crysis benchmarks for the M1730:
OK, here are some results with overclocking the CPU from 2.8GHz to 3.4GHz:
DX9 SLI 800x600, AA=No AA, Vsync=Disabled, 32 bit test, FullScreen, Medium Settings, CPU O/C:3.4GHz
2007-12-11 21:47:57 - Crysis
Frames: 2165 - Time: 39848ms - Avg: 54.331 - Min: 32 - Max: 89
DX9 SLI 1900x1200, AA=No AA, Vsync=Disabled, 32 bit test, FullScreen, Medium Settings, CPU O/C:3.4GHz
2007-12-11 21:50:28 - Crysis
Frames: 2048 - Time: 100621ms - Avg: 20.353 - Min: 15 - Max: 32
At the higher res, the Frame Rate is the same as the non-O/C'd results, so I am video card limited. It did help at the lower setting.
Here are the results from my earlier non-O/C'd test:
DX9 SLI, 800x600, AA=No AA, Vsync=Disabled, 32 bit test, FullScreen, Medium Settings, No O/C
2007-12-10 21:40:37 - Crysis
Frames: 1939 - Time: 39425ms - Avg: 49.181 - Min: 28 - Max: 72
DX9 SLI 1900x1200, AA=No AA, Vsync=Disabled, 32 bit test, FullScreen, Medium Settings, No O/C
2007-12-10 21:42:04 - Crysis
Frames: 1966 - Time: 97822ms - Avg: 20.097 - Min: 14 - Max: 28 -
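The "video card limited" conclusion in the quoted post can be seen directly in the numbers: the CPU overclock from 2.8GHz to 3.4GHz (about 21%) shows up at 800x600 but vanishes at 1900x1200. A quick sketch comparing the averages from those logs:

```python
# Compare the CPU-overclocked vs stock FRAPS averages quoted above.
# A CPU-bound run should speed up with the CPU; a GPU-bound one should not.

def speedup_pct(oc_fps: float, stock_fps: float) -> float:
    """Percent gain from the overclocked run relative to the stock run."""
    return (oc_fps / stock_fps - 1) * 100

low_res  = speedup_pct(54.331, 49.181)   # 800x600, medium settings
high_res = speedup_pct(20.353, 20.097)   # 1900x1200, medium settings

print(f"800x600:   +{low_res:.1f}% from the CPU overclock (CPU-bound)")
print(f"1900x1200: +{high_res:.1f}% (GPU-bound; the CPU overclock barely helps)")
```

Roughly a 10% gain at low resolution versus about 1% at high resolution, which is exactly the pattern you'd expect if the GPUs, not the CPU, are the bottleneck at 1900x1200.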
1440x900
2007-12-10 21:50:37 - Crysis
Frames: 1503 - Time: 32036ms - Avg: 46.916 - Min: 21 - Max: 62
medium shaders/shadows every other setting HIGH -
Here is the guy's post about the M1730 @ 1440x900 SLI:
Here are the 1440x900 numbers:
DX10 SLI, 1440x900, AA=No AA, Vsync=Disabled, 32 bit test, FullScreen, No O/C
2007-12-11 07:33:34 - Crysis
Frames: 1969 - Time: 82507ms - Avg: 23.864 - Min: 16 - Max: 33
DX9 SLI, 1440x900, AA=No AA, Vsync=Disabled, 32 bit test, FullScreen, No O/C
2007-12-11 07:38:42 - Crysis
Frames: 1963 - Time: 58740ms - Avg: 33.418 - Min: 25 - Max: 43
Edit: All at medium settings -
The card just isn't going to hack it.
I'm not all king of the hill or nuttin'. It may seem like it.
I am just proving it.
When the 8800s come out, the mid-range 8700 will be forgotten. -
My GPU is Arctic Silvered. I keep all fans cleared and replaced as required.
I run a 150W PSU, and my card never skips a beat.
I also have a backup 7900GTX just in case, but I treat my vid card very well.
I also did a volt-mod to 1.32v -
Doesn't matter that you Arctic Silvered it, you have overvolted your GPU. I have read many stories on notebookforums about 7900 cards suddenly just stopping working. But anyway, I'm into overclocking too, and with the backup card you have, Dexgo, it shouldn't be any problem. Besides, I have run my Go 7800GTX overclocked for over a year now without any problems.
Though from what I have read on notebookforums, it doesn't seem like the 7900 cards use the same memory chips as the Go 7800GTX. It seems like it is easier to fry the memory chips on the 7900 than on the 7800GTX. -
The 7900 has two types; I have the better type. There are two revs.
-
Mind telling me which types those are? 7900 GTX and 7950 GTX?
-
Rev 0 and rev 1 of the 7900GTX.
I never go by 3DMark, because who games at 1280 anyway?
I game at 1600x900 or 1900x1200.
You see the 8700s go there and they bomb.
My clocks are 650/750 @ 1.32v.
My other one is 650/800 @ 1.24v. -
What revisions are you talking about? G71M didn't have a core revision that I remember hearing about. Do you have a link to an explanation?
-
There are for sure two revisions of the 7900GTX.
At least for Dell; it says so on the cards, and the BIOSes are different.
There is information about it on notebookforums.
I will dig up a thread for you. -
Here is someone's post:
My card is an A01 revision as well, so I uploaded it here and had someone unlock it for me, since it's not on that "unlocked BIOS" ISO.
Here's my thread. The unlocked A01 7900GTX BIOS is listed in post #12.
http://www.notebookforums.com/showthread.php?t=204426&highlight=unlock+7900gtx+a01
My rev is A00. -
I don't know about the core, but the memory is different, from what I've read.
There are two revs of the FX2500 for Dell too.
The other card I have is a rev A01 FX2500 with the G71 core.
It isn't as good in the real-world benchmarking, stock and OC'd, that I have done against my own 7900GTX rev A00.
Might not mean much though. -
I am curious, Dexgo... maybe you just have the uber-super-version-7900GTX-with-turbo-button.
Run the benchmarks Tom's ran... let's see where you score,
especially the 1600x1200-and-above ones.
Doom 3 seems fair enough...
timedemo1 at 1600x1200, no AA/8xAF
timedemo1 at 1920x1200, no AA/8xAF
Try a non-overclocked and an overclocked run.
The SLI 7950s did indeed beat the SLI 8700s in this test, so it isn't as shader-intensive as Oblivion and isn't bug-limited like FEAR was. It also is a "real" game and not 3DMark06.
Let's try some reproducible benchmarks, shall we?
If what you say is true, your 7900 should beat the SLI 8700s at those resolutions. -
Screw Tom's.
I posted Crysis.
I have posted a ton of Oblivion.
Seriously, I don't need to prove it anymore.
I am done. The 8800GTXs will come out and then the 7900s and 8700s of the world will be forgotten.
I have posted tons of FRAPS benchmarks on notebookforums and my share here.
There is Oblivion, Crysis, GRAW2, Call of Juarez, Bioshock, etc., all over at notebookforums in the FRAPS thread,
along with comparisons with people with M1730s and dual 8700s.
Get the M1730 with dual 8700s and you post some benchmarks.
My crusade is over. You can take the red pill and believe whatever you want to believe.
I'm out.
/thread -
And my single 7900 beats the M1730 in Crysis at the resolutions I posted, bud.
Seriously... this is getting really trite.
Crysis is the world's best benchmark right now. -
Ahh, you mean Dell's VBIOS revisions. The card itself didn't change.
And dexgo, I've yet to see a single reliable benchmark showing your card beating an M1730, so you haven't proven anything. -
Stop defending the SLI 8700s, will ya? dexgo has proven his point, people.
-
Man, why did you revive this? We all now know the 8700GTs were a bit of a flop.
OFFTOPIC: 1000TH EFFING POSTTT WOOOOOOOOOOOOOO -
How about Lost Planet? My 8700GT SLI setup runs Lost Planet @ 1920x1200, all maxed except HDR set to Medium, with 8xAF, at 30+ fps, and that's in DX10.
I run Crysis at 1680x1050 at 25-40 fps with a heavily modified cfg that puts it quite a bit above Medium settings, closer to High settings.
Gears of War, all maxed, 30-40+ fps @ 1920x1200.
I recently got this XPS M1730, and I can tell other people with an XPS M1730 and the SLI setup: download the 171.16 Vista drivers, they give a huge boost to every game. I was amazed going from 169.09 to these beta 171.16 drivers.
Here is a video I recorded with my cell phone, though the quality got worse when I posted it on YouTube. The FPS ranges from 25 in firefights with several enemies to 40 fps running around in the forest. So frames ranging from 25-40 fps and with very good quality. I will post some screens later on so you can see what it looks like with the cfg, which by the way is a really good one. I can post the cfg too.
http://www.youtube.com/watch?v=Lvp2h72H_pI
Which is better on the whole - 7950 GTX or *DUAL* 8700 GT SLI (dell xps m1710 vs m1730)
Discussion in 'Gaming (Software and Graphics Cards)' started by dna2008, Oct 2, 2007.