The bandwidth increase does indeed make a difference; however, you're correct that the bandwidth increase is directly due to the memory being clocked 200 MHz higher at stock. (As bandwidth = bus width × bus clock speed.)
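For instance (a rough sketch; the 128-bit bus matches the 8700M GT's spec, but the example clocks and the GDDR3 double-data-rate convention here are illustrative assumptions, not exact stock values):

```python
# Bandwidth = bus width x effective memory clock.
# GDDR3 transfers data twice per clock, so effective clock = 2 x base clock.
def bandwidth_gb_s(bus_width_bits, base_clock_mhz, transfers_per_clock=2):
    bytes_per_transfer = bus_width_bits / 8
    effective_clock_hz = base_clock_mhz * 1e6 * transfers_per_clock
    return bytes_per_transfer * effective_clock_hz / 1e9

# What a 200 MHz bump does on a 128-bit bus (illustrative clocks):
print(bandwidth_gb_s(128, 700))  # 22.4 GB/s
print(bandwidth_gb_s(128, 900))  # 28.8 GB/s -> +200 MHz adds 6.4 GB/s
```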
-
-
I think most people still have the misconception that memory bus width/bandwidth
is directly proportional to card performance in new-generation cards.
With the new nVidia 8x00 unified architecture this no longer holds true,
so stop saying this is the #1 factor limiting the performance of the 8700M.
As you can see, the 8800 Ultra desktop cards have a 384-bit memory bus,
but in this AnandTech test they follow the same downward pattern, in both DirectX 9 and DirectX 10 games, as the 8700M, which has only a 128-bit bus:
http://www.anandtech.com/video/showdoc.aspx?i=3029&p=5
In this case, having a 384-bit memory bus does not save the 8800 Ultra from the drastic drop from 100+ fps to 30+ fps when moving to higher resolutions.
What I am saying is that the memory bus is no longer the #1 factor affecting performance in the unified architecture.
It's something else that needs to be optimized by nVidia and ATI when using the unified architecture,
which is why we'd like nVidia to fulfill their promise with better firmware/drivers!
(It's not a simple matter of memory bus!) -
-
Study this more carefully before drawing that deduction:
http://www.anandtech.com/video/showdoc.aspx?i=3029&p=5
100+ fps (800x600) down to 30+ fps (1920x1200).
That's a ~70% drop, even for the 384-bit bus of the 8800 Ultra, in both DirectX 9 and 10,
and whether AA is at 4x or off.
What I am saying is that the memory bus limitation is not the #1 reason for this 70% drop!
It's the new-generation unified architecture that is not yet optimized for current-generation games,
which nVidia and ATI have to work on with Microsoft and game developers to optimize game code and drivers!
(If it were a simple matter of memory limitation, I would expect the 8800 Ultra's 384-bit bus to at least stay almost flat from 800x600 to 1280x1024 and only start the drastic downward trend at much higher resolutions,
but it drops drastically even from the low resolutions of 800x600 to 1024x768, which a 7x00-series card can handle flat out!)
-
-
Yes, fully agreed with you about huge pixel counts.
But if it were a simple memory bus limitation, for the 384-bit 8800 Ultra at least,
we would expect the curve to stay somewhat gradual/flat at the low resolutions
and only begin to drop drastically at higher resolutions, once its memory threshold is hit.
That's simply not the case: it begins to drop drastically right from 800x600 (both 4xAA and 0xAA).
In fact, if you look at the graphs carefully, ironically,
the sharpest drop is at the lower resolutions, with a more gradual drop toward the highest resolutions.
Something else is inducing the performance drop even at the lowest resolution of just 800x600.
I can only infer that it's due to non-optimization of the unified execution architecture of the new-generation cards in current-generation games,
and not primarily due to memory bus limitations.
(BTW, you can look at the other games in the same link as well via the bottom menus.)
-
Resolution: High Settings | Medium Settings | Low Settings
1920x1200: ~45 fps | ~53 fps | ~81 fps
1600x1200: ~54 fps | ~64 fps | ~98 fps
1280x1024: ~76 fps | ~91 fps | ~138 fps
From this we see (using 1280x1024 at high settings as the base) that a 46% pixel increase costs about 41% more frame time (76 → 54 fps), and a 76% pixel increase costs about 69% more frame time (76 → 45 fps), which is nearly perfect scaling. Similar scaling is shown at lower settings (though to a lesser extent the further settings are lowered).
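A quick back-of-the-envelope check of that scaling (a Python sketch; the fps values are my approximate readings from the table above):

```python
# Compare pixel-count increase against frame-time increase, both relative
# to 1280x1024 at high settings (approximate fps values from the table).
data = {
    "1280x1024": (1280 * 1024, 76),
    "1600x1200": (1600 * 1200, 54),
    "1920x1200": (1920 * 1200, 45),
}

base_pixels, base_fps = data["1280x1024"]
for res, (pixels, fps) in data.items():
    extra_pixels = pixels / base_pixels - 1   # fraction more pixels to draw
    extra_frametime = base_fps / fps - 1      # fraction more time per frame
    print(f"{res}: +{extra_pixels:.0%} pixels, +{extra_frametime:.0%} frame time")
# 1600x1200: +46% pixels, +41% frame time
# 1920x1200: +76% pixels, +69% frame time
```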
Edit: Sorry for not mentioning it, this is based on Legion Hardware's reviews using stock drivers (NOT the Bioshock BETA ones). Link is here:
http://www.legionhardware.com/document.php?id=681&p=0 -
Well... if most people fail to recognize the bigger problem beyond memory bus limitations,
nVidia and ATI will have an easy job...
It's very obvious that at 800x600 with 0xAA, a 384-bit 8800 Ultra is not bottlenecked by bandwidth.
-
-
Well, if you see my other post,
the architecture has changed drastically between the pre-8x00-series cards and the post-8x00 cards...
It's not a simple game of memory bandwidth anymore, as most people still primarily assume.
A lot more areas can be, and have yet to be, optimized or are limiting factors: yes, the game code,
but also drivers/firmware from ATI and nVidia, and Microsoft's code! -
-
Most people play at 1024x768, 1280x800, 1440x900, or 1680x1050, where other factors are more significant than the memory bus.
If the memory bus were that important, nVidia and ATI could have more easily spent their main R&D on improving memory bus limitations, rather than pouring huge resources into completely overhauling the execution architecture.
Or ATI could have easily implemented eDRAM (as it did years ago in the Xbox 360, giving 256 GB/s of memory bandwidth) in its next-gen cards instead of unified shaders, and beaten the hell out of nVidia!
-
-
The Radeon 2900's memory interface is doubled from the previous generation (512-bit, plus all those tweaks they have made to the internal bus; http://www.anandtech.com/video/showdoc.aspx?i=2988&p=9), and nVIDIA has also beefed theirs up to 384-bit. So I guess memory bandwidth does mean something, or else why bother, right?
-
What puzzles me is that ATI already had the technology years ago to solve the memory bandwidth issue, with its Xbox 360 chip's eDRAM effectively giving 256 GB/s of bandwidth, which is way more than the Radeon 2900's 512-bit bus can offer...
There must be other factors more significant than memory...
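For a rough comparison (a sketch; the 2900 XT's ~1650 MHz effective GDDR3 clock is my assumption for illustration, and the Xbox 360 number is the quoted internal eDRAM figure, not an external-bus calculation):

```python
# External bus bandwidth = (bus width in bytes) x (effective memory clock).
def bus_bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

print(f"Radeon 2900, 512-bit: {bus_bandwidth_gb_s(512, 1650):.1f} GB/s")  # ~105.6 GB/s
print("Xbox 360 eDRAM (quoted internal figure): 256.0 GB/s")
```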
-
omg look at this
http://forum.notebookreview.com/showthread.php?t=172564&highlight=8600m+gt+3dmark
An OC'd 8600M GT just got me a 5100+ 3DMark score!
As good as a 7950!
I can't imagine dual 8600M GTs in SLI in the next X205! -
-
Whatever it means to an 8600M GT owner...
vs.
whatever it means to a 7950 owner...
Clearly, I am amazed that an OC'd 8600M GT can achieve a 7950-level benchmark.
Well done.
-
I'm an 8600 owner and it doesn't mean jack **** to me.
-
Seems to me that if you could OC like he/she did, your performance would surely increase... and he maintained it at around 80°C, which is quite stable...
-
Yeah, well, I don't think an average increase of 5 fps is worth it, but again, that's just me.
-
Depends on how much FPS you're getting:
going from 10 to 15 fps is a big jump;
going from 50 to 55 is a lot less significant.
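One way to see this is in frame-time terms (my own quick illustration):

```python
# The same +5 fps saves far more time per frame at low frame rates.
for before, after in [(10, 15), (50, 55)]:
    saved_ms = 1000 / before - 1000 / after  # ms shaved off every frame
    print(f"{before} -> {after} fps: {saved_ms:.1f} ms less per frame")
# 10 -> 15 fps: 33.3 ms less per frame
# 50 -> 55 fps: 1.8 ms less per frame
```
-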
Well, agreed, but you probably won't see such a big jump from 10 to 15 fps; if it's that bad, it's bad... unless you just overclock it to the max.
-
Well, I just got my M570RU with the 512 MB 8700M GT and got 5384 in 3DMark06 on a fresh XP Pro install, all stock on the video card.
I like it. -
I will do one at 1280x1024; the default was 1280x768.
-
-
1440x900: did one last night and got 47xx at 1280x1024. I'm in the norm with my setup.
-
-
Hi guys,
I have been reading these posts trying to figure out what to do about my current situation, but I am lost and hoping you might help me decide. I just bought a new m9750 and I spared no expense; it is maxed out. I chose the dual 512 MB 8700M GTs and I think I may have made a mistake. Five minutes after I took the thing out of the box, and before I installed any games or made any mods to the factory config, I received my first "Video Hardware Failure" message. I thought it may have been a fluke, so I went ahead and began installing some of my games, including Crysis, COD4:MW, and R6V, to try this baby out and get some glorious gaming... Well, after the Crysis install I let Crysis choose the optimal settings, put the resolution at 1600x1200 (other settings on high), and began to load the game. As soon as the game loaded and I was in freefall heading towards the water, I could tell that this notebook wasn't going to be able to handle the game. The gameplay itself was totally choppy and absolutely horrid. If I made too-quick mouse movements, the game would freeze. I sometimes had crashes to the desktop, or the screen would just go black and whatever audio got captured at the time would repeat like a broken record. I tried to fix these issues by adjusting the resolution and setting the settings to med/low, but to no avail. The game remained unplayable due to choppy, stuttering graphics and just overall complete failure. It was a real bummer. I tried my other games and did not have any better results. COD4 played as Crysis did initially; with lowered settings it was playable, but the graphics were so horrible I didn't want to play because it looked like crap.
My other games like BF2 seemed to play fine with all the settings maxed out, but that was the only game I had any luck with. I bought the m9750 because it was advertised on the Alienware site and on the Crysis site as the notebook designed to handle the new DX10 technology and the Crysis engine, but that couldn't be farther from the truth. After troubleshooting with Alienware for a week trying to get this fixed, they finally had me send it to them for repair. Before sending it back to them, I tried different drivers, I tried installing XP Pro instead of Vista, and every other typical troubleshooting method, but nothing worked. Well, Alienware in their infinite wisdom received my notebook on the 12th, reinstalled the OS, and sent my notebook back to me. It's due to arrive tomorrow, but I know for a fact it's going to have the same issues it had prior to my sending it to them, and now I am not sure what to do. It began to fail right out of the box! They sent it back set up just as it was when I got it the first time, and I'm really pissed. If I still experience these gameplay issues, I could return the notebook for a refund and move on. Or I could send it to them and try to have 7950 video cards installed. Or I could maybe ask them to replace the RAM, mobo, etc., but I really don't know what the issue is, and it appears they don't either. I really want to keep my m9750, but I want it to perform as advertised, and as all the reviews raved about.
What are your suggestions? What do you think the problem could be? What can I do to get this notebook to be the gamer it's supposed to be? Please help me if you can. As of tomorrow, I have 4 days to decide how to handle this. I really want to keep it, but it just can't handle the newer games that are being released; at least the one I had couldn't. Do you think I may have just had some bad luck and got a lemon? Should I have them send me another identical system (if they will), or should I cut my losses and wait for the technology to get unf**ked? --- Here is my system:
Manufacturer: Alienware m9750
Processor: Intel(R) Core(TM)2 CPU T7600 @ 2.33GHz (2 CPUs), ~2.3GHz
Memory: 4GB RAM
Hard Drive: 160 GB Extreme Performance RAID (80GB x 2)
Video Card: Dual 512MB NVIDIA GeForce 8700M GT
Monitor: 17" WideUXGA 1920 x 1200 LCD with Clearview Technology
Sound Card: SB Xtreme Audio Notebook X-Fi (Vista)
Speakers/Headphones: SENNHEISER PC166USB Gamer Headphones
Keyboard: Alienware Standard
Mouse: Logitech G-5 / Saitek X-52
Mouse Surface: Acrylic ICE precision surface, optical/laser
Operating System: Windows Vista™
Any help would be really appreciated. I'm stuck! (Maybe I should send it back and wait for the M17x?)
Thank you. -
What graphics driver version were you using?
-
I tried the drivers that came preinstalled (7.15.11.5672) and the Xtreme G drivers from tweakforce.com; neither helped, I had the same issues. I just got it back tonight from AW tech repair and it still isn't fixed. All they did was reinstall the OS, lol. I had a page of error logs and it wasn't the OS. I was getting hardware failure errors, amongst the blue-screen display driver crashes, etc... I dunno, it's boxed and ready for return. I emailed AW about the m17x coming out and they sent me this link: http://www.alienware.com/intro_pages/m17x_m15x.aspx
and said that they would be out in Jan-Feb, so I will wait to buy the m17x. Thanks though... -
I wish they offered this on the 1520/1720...
Any way to find one and install it in place of the 8600m? Same size, etc. -
-
Hi,
I am going to overclock the standard dual 8700Ms for the first time.
Does anyone have any idea of a stable OC speed? I've been reading of people OCing at 720 core / 1.2 GHz memory. Any help, hint, rumour, or link would be much appreciated.
I am running the 169.04 drivers (best ones so far) on a 2.4 GHz CPU with 2 GB RAM.
Cheers,
JI -
You will have to test it yourself... each card has different overclocking capabilities... while my 8700 is stable at 750/1850/1000, it doesn't mean yours will be... it could withstand higher clocks, or lower ones.
-
I am using RivaTuner 2.06, and before overclocking I want to make sure I am able to track GPU temperature. RivaTuner detects two GPU cores, but the temperature shown is always the same for both (it changes, but the two are always equal), so I am worried it's picking up one sensor and displaying it for both cores...
Also, I am not sure what is going to happen if I OC with RivaTuner: does anyone know whether both cards are going to be overclocked, or just one?
Cheers,
JI -
-
Sorry, i can´t help you about SLI...
From my experience, you should begin by overclocking the memory or the shaders....run a few loops of 3DMark2006...if it doesnt freezes or starts displaying artifacts, raise the bar, run more 3dmark loops...once things start going wrong ease your clocks a little bit...then move to the shaders (if you started with the memory, and make sure that you leave the memory clocks overclocked) and repeat the whole thing...then, at last, overclock the core clock. Usually you should start with the core, but if you´re planning playing crysis, this is the best way because crysis is a shader-intensive game. It´s better to have the shader and memory clocks higher even if you have to lower the core clock, -
I've been posting on the RivaTuner forums asking for SLI help, but all they can say is that RivaTuner doesn't support laptops ( http://forums.guru3d.com/showthread.php?t=248722 ). -
It does support it... but the RivaTuner programmers don't want to officially support it themselves. That's all.
Many of the Sager 9260/9261 owners with SLI 8700M GTs have OC'd them fine and use RivaTuner to monitor the cards' temps. -
It sounded strange indeed: RivaTuner comes with an additional tool for the XPS M1730's Logitech LCD; as far as I know, the XPS M1730 is the only machine with an embedded additional Logitech LCD, and it comes ONLY with dual 8700Ms at 256 MB each.
So, if they bothered implementing a tool for the Logitech LCD specifically for the M1730, why in the hell is RivaTuner not working with the dual-8700M line and reporting the right temperature values for the two GPUs?
Is anyone able to tell whether the GPU temperatures change in lockstep (always the same) on Sagers, or whether they may differ from each other?
I am mainly trying to understand, before OCing, whether it is normal to always see the same temp for both.
Thanks!
JI -
If you go to the Sager forum, you can search for the 8700M SLI users that use RivaTuner and OC.
As for temps, they should be correct; read this for my GPU temp rule of thumb:
http://forum.notebookreview.com/showthread.php?t=81852 -
Tried the 169.28 drivers yesterday on my 8700M GT; fantastic drivers so far...
Noticed some performance improvements in NFS ProStreet, UT3, and Crysis.
Some OC results, 760/1800/1000 at 1280x1024:
It means nothing, but it's nice to see...
-
I already used your tutorial to set up temp monitoring: NICE -
I've installed the latest drivers (169.25) and my 3DMarks are about the same (1000 points higher, though), but for some reason 3DMark06 shows Primary Device: NVIDIA GeForce 8700M GT,
Linked Display Adapters: false.
I do have SLI enabled via the nVidia control panel, and the system config shows both cards enabled, but 3DMark06 doesn't see the cards as linked in the system details.
-
Does anyone else's 3DMark06 show their adapters as linked?
-
Audigy, please, can you bench at default frequencies? (Because I can't find any benchmarks on the net with 169.28.)
Thanks.
Bump?
It's very important to me. -
I play Crysis at 1920x1200 resolution on my M1730 with 8700M cards (medium-high, DX9), and it's sick!! I think the 8700M GT is a great card, and in SLI mode it's just as good as one 7950 GTX Go, or even better...
-
Yeah, I agree they are great cards, especially in SLI. With the new 171.16 drivers I got even more of a boost for my SLI setup.
Now overclocked to 729/1458/900.