MSI has the fastest stock clocks:
Thanks to Cloudfire
MSI GT60/70 with GTX 680M and Optimus - Page 6
-
But the bus interface is only 2.0. This is Clevo's bus interface:
-
I can't remember any advertisement for the 680M with Turbo Boost
indeed -
Edit: oh, nevermind, I misunderstood what you meant.
Anyway - show off
-
I think it's because of the motherboard the card has been tested in
-
I'm sorry. I read your post as if the 680m doesn't go above 718MHz at all.
Yes. -
If it's the motherboard's limitation then it should show PCI-E 3.0 x16 @ x16 2.0 instead of PCI-E 2.0 x16 @ x16 2.0. It's very possible that the MSI card's maximum bus interface is only 2.0.
-
Could just be GPU-Z:
GPU-Z - Project Search -
Hi,
did anyone try to overclock this way:
[GUIDE] Nvidia Inspector GTX670/680 - disable boost / fixed clock speed / undervolting
Maybe it allows you to pass the +135MHz limit? -
Too bad; the 2.0 x16 interface which the MSI has is overkill for the 680M anyway.
Oh well, it's like the 4GB vs 2GB 680M. Although you don't actually need the extra 2GB, you can still brag about it
-
So supposedly UPS is saying I'll have my notebook by tomorrow. I wouldn't mind posting 3D frame rates if anyone's interested, and other things as well of course.
-
Meaker@Sager Company Representative
Well, PCIe 3.0 could help with Optimus.
-
Yo hackness, have you tried undervolting the GPU?
-
Nope. By the way, I think the card might have died. Today when I turned on the laptop the fan went on full blast and it kept beeping; it still booted into Windows, but the beeping continued. Checked the GTX 680M with GPU-Z: everything is 0 except the memory, which became readable as 4096MB (it was always 0 before the card failed). And then the laptop shut itself down. Tried many times, all with the same result; switched back to the GTX 675M and the fan was back to normal again. That GTX 680M must have died. Well, I've contacted the seller about this, should get an answer soon.
-
How so?
-
No way???!!! How did it die? So sad to hear!
-
I was going to say the same thing.
Hmm, I would quit running FurMark if I were you. -
Oh man... I hope the seller covers you.
-
Meaker@Sager Company Representative
My card has shipped
Since every frame has to be transmitted from the graphics card to the integrated frame buffer before being displayed, it puts extra stress on the PCI-E bus.
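As a rough sanity check on how much of the bus those frame copies actually use, here is a small sketch. The numbers are my own ballpark assumptions, not from this thread: a 1080p frame at 4 bytes per pixel, and roughly 8 GB/s of usable bandwidth on a PCIe 2.0 x16 link.

```python
# Estimate the PCIe load from copying rendered frames to the integrated
# frame buffer, as Optimus does. All figures are ballpark assumptions.
def frame_copy_gbs(width: int, height: int, bytes_per_pixel: int, fps: int) -> float:
    """Bandwidth in GB/s needed to move `fps` frames per second."""
    return width * height * bytes_per_pixel * fps / 1e9

PCIE2_X16_GBS = 8.0  # approx. usable PCIe 2.0 x16 bandwidth, assumed

for fps in (60, 120):  # 120 fps stands in for stereoscopic 3D
    need = frame_copy_gbs(1920, 1080, 4, fps)
    print(f"{fps} fps: {need:.2f} GB/s ({need / PCIE2_X16_GBS:.0%} of PCIe 2.0 x16)")
```

By this estimate the frame copies alone are a small slice of the link, which fits the point that problems show up only when 3D or SLI stacks extra frames and traffic on top of the normal rendering transfers.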
This was the cause of 3D and SLI not working with Optimus, since they either require more frames per second or already load the bus more. -
Well, you got me there. I've never heard of that in the history of on-board graphics, since they can never reach the bandwidth mark of PCIe 2.0 or higher.
But you learn something new every day. -
Also, I wonder how the other top-scoring GTX 680Ms are faring right now, since they were either overclocked or overvolted heavily. -
I don't run FurMark on the GTX 680M as it throttles, but the night before I was playing The Witcher 2 at stock clocks, though.
-
You already did run it, and it's known to kill Nvidia cards, throttle or no throttle. But let's just hope that's not the case and it's something minor.
-
So FurMark is a no-go for people with Nvidia GPUs?
-
That would be correct. You won't catch me running it anytime soon. I burned up a set of cards just running at stock, but apparently I ran the test too many times. Go figure. That was my first set of 580Ms. A few others ran into the same problems... I'm sure it's hit and miss, but I'm cool on that test. (Personal opinion only.)
-
Are you saying that one FurMark run killed the card? lol. And that was like 4 days ago.
-
Are you sure you were hearing your GPU fan? Mine seized up one day and the PC blacked out. It was an intermittent problem, and sometimes the PC would be off when I came back from a break. I had to lube the fan to get the thing out of limp mode...
-
They were both running, blowing out warm air and not hot yet. Also, after I changed back to the GTX 675M the fan was acting normally; that GTX 680M is dead.
-
Dang, that's sad to hear. Did you abuse it that much? Overvolted?
-
Nope, except that I decided to game at stock clocks in The Witcher 2 for roughly 6 hours the day before it died.
-
That's what she said.
-
Sorry to hear that, hackness. I hope you can get your card replaced. So you're saying the 675M that was in there is working? Or did you sell that? You can at least use it with the HD 4000, right?
I was running some benches with the vRAM at 2300MHz and it is a bit unstable in some instances. 2280MHz seems to work perfectly fine though.
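For context, the theoretical memory bandwidth at those vRAM settings can be roughed out. This is only a sketch and rests on two assumptions not stated in the thread: that the tool-reported MHz figure is half the effective GDDR5 data rate, and that the 680M uses a 256-bit memory bus.

```python
# Theoretical GDDR5 bandwidth estimate for the vRAM clocks above.
# Assumes the reported MHz is half the effective data rate (double data
# rate) and a 256-bit bus; both are assumptions, not thread facts.
def gddr5_bandwidth_gbs(reported_mhz: float, bus_bits: int = 256) -> float:
    effective_mhz = reported_mhz * 2              # two transfers per cycle
    return effective_mhz * 1e6 * (bus_bits // 8) / 1e9  # bytes/s -> GB/s

for mhz in (1800, 2280, 2300):  # assumed stock figure vs the two OC settings
    print(f"{mhz} MHz -> {gddr5_bandwidth_gbs(mhz):.1f} GB/s")
```

Under those assumptions the gap between 2280MHz and 2300MHz is barely over 1 GB/s of theoretical bandwidth, so backing off for stability costs very little.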
I find it hard to believe FurMark would kill a card. The system has failsafes to throttle it in case of too-high temps or other parameters. Although I never use FurMark, I use MSI Kombustor DX11 to check stability. FurMark is OpenGL, if that makes a difference. In either case I don't see it; it's likely not FurMark, but something wrong with the hardware. -
Maybe it's the modded vbios? Did you go back to the stock vbios?
-
Yes I switched back to stock right after I found the Memory controller load issue.
-
From watching this thread it looks like the stock vbios gives more performance per clock than a modded vbios which allows for a higher core clock. Is this correct?
-
That's correct. svl7 and I noticed that the scores are lower than expected when using the modded vbios. If you run the stock vbios at the same clock as the modded vbios, the stock vbios will score at least 250 points more on GPU score in 3DMark11. svl7 has contacted Saltius and they found that Luoxia's vbios is acting weird too: low performance per MHz. So at the moment the modded vbios isn't really worth flashing unless a new way of editing is found.
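One way to make that comparison concrete is to normalize GPU score by core clock. The sketch below uses made-up placeholder scores and a placeholder clock; only the roughly 250-point gap at equal clocks comes from the discussion above.

```python
# Hypothetical performance-per-MHz comparison between stock and modded
# vbios at the same core clock. Scores and clock are placeholders, not
# measurements; only the ~250-point GPU score gap is from the thread.
def points_per_mhz(gpu_score: float, core_mhz: float) -> float:
    return gpu_score / core_mhz

CORE_MHZ = 800                       # placeholder clock, same for both vbios
stock_score = 7250                   # placeholder 3DMark11 GPU score
modded_score = stock_score - 250     # stock leads by ~250 points at equal clocks

print(f"stock:  {points_per_mhz(stock_score, CORE_MHZ):.2f} pts/MHz")
print(f"modded: {points_per_mhz(modded_score, CORE_MHZ):.2f} pts/MHz")
```

The modded vbios only pays off if its extra clock headroom more than makes up for the lower points-per-MHz, which is exactly what the posters found wasn't happening.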
-
No, I am saying one FurMark run can damage the voltage regulators on the card, that's all. And that has nothing to do with heat or throttling.
But since you have been benching the card, you basically fried it, with or without FurMark. One run.
-
I see FurMark as a no-go for every card.
My 680M is in the local Parcelforce depot so I will get it on Saturday or Monday. I won't bother overclocking or running FurMark, I'm boring like that. -
They think you're normal, or am I normal?
-
what have you been smoking bro?
-
Both hackness's system and your system did not perform as expected with the modified vbios.
-
Thanks for the replies
-
The GTX 680M would Turbo?
-
-
I don't have a 680M, but other users report that it does not boost. Not even the Dell 680M, even though GPU-Z displays boost clocks in its main window.
-
Yes, the GTX 675M is working right now.
Today the seller contacted me and requested that I send screenshots; hopefully an RMA form will be issued.
-
Where did you buy it from and how long is the warranty?
-
You're not alone with that, dude. I won't overclock my GPU either, but I can understand that some may want to test out FurMark to see how far they can push the GPU. But on the other side, the burn-in test with FurMark does not represent any future scenario, so I'm not sure why people take the risk IF it can mess up the GPU.
Let us know when you get your GPU.
-
Well, I guess it's OK... some play safe, some don't have unlimited money, and others... yeah, they can do whatever they want.
-
build it, play it, break it, repeat.
-
Life of an engineer.
P150EM Upgrading from 675M to 680M!
Discussion in 'Sager and Clevo' started by hackness, Jun 28, 2012.
