Eurocom Spokesman is back, lol...
They're running a survey on their website
"When it comes to having control or a computer at the firmware level, which statement best describes your position?"
Be sure to vote; who knows how far it goes.
Wishing everyone happy holidays
@Prema a.k.a. bad Santa, not too late to deliver gifts... will take them unwrapped
-
Exactly what I was thinking!
Sent from my LG-H901 using Tapatalk -
-
http://www.notebookcheck.net/Schenker-XMG-U726-Clevo-P870DM-Notebook-Review.153136.0.html
There are lots of photos of the keyboard area in this review, should give you everything you need! -
So I was browsing the latest 3DMark scores after busting 17088, only to see @Mr. Fox throwing down 17578!!!! = http://www.3dmark.com/fs/6888349
-
Be sure to check out the 3DMark 11 results while you're looking. The worst thing about Fire Strike is the sucky OS that needs to be running to take away a top score.
One of these 980M cards has a horrible ASIC score, so good benchmark runs at high overclocks are few and far between. The other GPU overclocks extremely well. The bad one allows me one or two tries on a cold boot after reaching room temperature, then all the fun is over. -
Oh no w10 lol
Sent from my LG-H901 using Tapatalk -
Yeah. Micro$loth just doesn't get it. Or, maybe they do and we are seriously outnumbered by people that like a crappy-looking UI, malware, bloatware and huge CPU performance downgrades.
If anyone feels compelled to upgrade/sidegrade/downgrade (however you view it) to W10 TH2 for Intel Speed Shift technology, don't bother burning any calories on it. It is a totally worthless gimmick as far as I can tell, and having it enabled lowers my CPU benchmark results. It might be better running stock with BIOS defaults, but I don't really care about that since I never run the CPU stock with BIOS defaults. I didn't even bother checking. As soon as I saw wPrime, Cinebench and 3DMark 11 Physics scores all took a nose dive, it was changed back to disabled in the BIOS.
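For anyone who wants to confirm whether Speed Shift (HWP) is actually active before and after toggling it in the BIOS: Intel exposes this in the IA32_PM_ENABLE MSR (0x770), where bit 0 set means HWP is on. A minimal sketch of decoding that register value (the raw read itself would come from something like `rdmsr 0x770` with msr-tools under Linux, or a tool like ThrottleStop on Windows — that part is an assumption, not something covered in this thread):

```python
def speed_shift_enabled(pm_enable_value: int) -> bool:
    """Decode IA32_PM_ENABLE (MSR 0x770): bit 0 = HWP/Speed Shift enabled."""
    return bool(pm_enable_value & 0x1)

# Example raw register values:
print(speed_shift_enabled(0x0))  # False -> Speed Shift disabled in BIOS
print(speed_shift_enabled(0x1))  # True  -> Speed Shift active
```

Note that once HWP is enabled by firmware it generally cannot be disabled again without a reset, which is why the toggle lives in the BIOS rather than in software.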
Merry Christmas, everyone. I hope that all of you, and your loved ones, have an amazing holiday. -
What is the ASIC on the other card? Both of mine are around 80% and I managed to pull a 23.1k graphics score once before my single adapter tired out.
CPU wise, I noticed the voltage moves way too much using XTU. With Prema, is there a way to lock the voltage at a constant? -
No, not 100%. You can control it to some degree, but no-load voltage is always high. That's actually not a bad thing, because it helps avoid the crashing and BSODs that are common when idle voltage drops too low or doesn't increase fast enough. I think the high idle voltage is actually a huge blessing with Skylake, and I wish the Panther behaved the same way. When you adjust voltage, just bear in mind that what you are changing is the voltage the CPU receives under load.
I will be downgrading to single GTX 980 later today for final firmware testing. Need to make sure the last couple of mods for SLI did not hurt GTX 980 performance or behavior.
Yeah... very sad about the one GPU. The low ASIC score is very detrimental to extreme overclocking abilities. If I buy one of these machines the company I buy it from will have to agree in advance to test ASIC score and promise not to send me any GPU with less than a 75% ASIC rating as a condition of purchase.
-
Meaker@Sager Company Representative
My 70% G200 chip makes a good showing of itself against the desktop crowd; I know it's water cooled, but even on the stock cooler it was not bad. You can never be sure with ASIC quality.
-
G200?
-
Meaker@Sager Company Representative
-
Was that after the modding?
-
Meaker@Sager Company Representative
The power modding allowed it to go a little further yes but not a huge amount.
-
Most high ASIC cards can't even take close to 1.2v and crash much earlier than the ones with lower scores.
Black-out issues, on the other hand, are not core issues but supply issues, induced by weak/missing MOSFETs and, in some cases, simply motherboard limitations.
Happy Christmas Everyone!
(And no, as much as I want to, I simply can't release unfinished stuff, so please don't ask.)
Last edited: Dec 24, 2015 -
Gonna give the 361.43 driver another try. Let's see what horrible instabilities it will cause.......
I am OCD about the new driver alert in GeForce Experience. -
It's totally cool man, take all the time you need to release a good finished product. I'm very sorry if I offended you, I was just poking some fun your way. Happy Christmas to you as well.
-
You didn't offend at all...just didn't want my Christmas wishes to be interpreted as some sort of "hint"...they are just Christmas wishes, as pure as they come with no strings attached.
-
Yep, 361.43 doesn't like Prema. Reflashed BIOS via safe mode and reverted back to 359.06. -
They work fine for Fox, but he now has a clean Windows installation, after it got severely messed up by cross-flashing various mods.
-
Mine crashed on driver installation and refused to boot back to Windows. What Windows was Fox on when he tested it? -
-
I could run it but overclocking was not as efficient.
Sent from my LG-H901 using Tapatalk -
I am on 8.1 atm.
-
Errors, crashes and black screens for many people:
https://forums.geforce.com/default/...hql-display-driver-feedback-thread-12-21-15-/ -
Ionising_Radiation Δv = ve*ln(m0/m1)
@Mr Fox is pissed about his 69.4% ASIC on his 980M, while my GTX 860M is 69.5% and gets nearly 4600 in Fire Strike... Not sure whether to cry or be happy. Cry because my GPU seems so noob (although @jaybee83 may disagree...), or be happy that I can get a +350/+250 OC on that GPU. Voltage is exactly 1.15V, an increase of about +50 mV above stock.
I wouldn't mind getting a 980M, even if it had a 1% ASIC.
And finally, Merry Christmas and a Happy New Year to one and all. -
I actually found it overclocked better under W7 and W10, but as Prema said, this was the first and only GeForce driver installed. I also am not allowing GeForce Experience to be installed, just drivers. NVIDIA may have introduced some crapbook features with power management and "hybrid SLI" that I have never noticed before. That might be causing issues for some people. I saw new stuff in the Windows Power Configuration tool that I have not noticed before and it disappeared with SLI enabled.
The biggest problem is imbalanced ASIC scores in an SLI system. GeForce drivers seem to have started doing stupid stuff based on ASIC scores earlier this year, (another crappy new "feature" introduced in 2015 by the nvidiots working for the Jolly Green Giant,) and it's more evident in SLI than with single GPU. The GPU with 74.6% ASIC will overclock higher on core with significantly lower voltage, so the GPU with the lower ASIC score messes up the better one. Being able to overclock higher using less voltage is always better for temps and more stable. -
This is my 1st benchmark for my laptop.
I would like to share it, but I think this laptop can do much better:
http://www.3dmark.com/3dm11/10714199 -
I would imagine it can be done via different voltages for different gpus, but it will be a major PITA because you have to stress test each GPU separately.ajc9988 likes this. -
sure it can do better, but this is a solid baseline for stock clocks
Mr. Fox likes this. -
Yes, I'm really impressed with these results. Anyway, my old gaming laptop scored P1220 :X
http://www.3dmark.com/3dm11/10714237 -
To put that in perspective: I'm barely scratching 15k on 3DM11 with my config OC'ed on CPU, GPU and RAM.
That's like 7% above your stock.
You can reckon with around 15-30% more performance when overclocked, so I'll just be a dust cloud in your rearview mirror.
Sent from my Nexus 5 using Tapatalk -
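For anyone wanting to run the same comparison against their own runs, the percentage gap is just a relative difference. A quick sketch (the 15000/14000 scores here are illustrative stand-ins, not exact numbers from this thread):

```python
def pct_above(score: float, baseline: float) -> float:
    """How far `score` sits above `baseline`, in percent."""
    return (score - baseline) / baseline * 100.0

# e.g. an overclocked ~15000 run vs. a ~14000 stock run:
print(round(pct_above(15000, 14000), 1))  # -> 7.1
```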
Don't complain about horrible ASIC... Mine has a REALLY low quality... 67.2...
-
Meaker@Sager Company Representative
-
This simple functionality test run was on a 67% ASIC card, which was unable to do the same with 1400Mhz and 1.2v:
https://www.techinferno.com/index.p...s-the-official-thread/&page=17#comment-134176
From our testing:
Higher ASIC = Great results with very little voltage, but driver crashes from a certain voltage onwards.
Lower ASIC = Scales nicely with more voltage as long as cooling is sufficient (higher voltage needs all MOSFETs in order to supply sufficient power and not black out under full load).
Last edited: Dec 25, 2015 -
Dang, I should really get into OCing a bit... Don't really understand how it works with the voltages and frequencies...
-
Spartan@HIDevolution Company Representative
Here is the Service Manual buddy:
http://www.mysn.de/driver/XMG/XMG_U726/Service_Manual/XMG_U726_P870DMG.zip -
MSI GT80/AW18 --> high ASIC
Clevo w/ dual PSU --> low ASIC -
So, the question that hasn't been asked is... Can this boot and run Intel graphics?
-
I don't think so. From what I was told, the HD Graphics is not even connected. Correct me if I am wrong though.
-
Spartan@HIDevolution Company Representative
You're right and thank God for that. Hate dealing with Optimus crap -
Booting to Intel does not necessarily mean Optimus; it can have a mux like the newer BGA Clevos. For gaming it's maybe totally useless, but for virtualization Intel is cheaper and has a head start in consumer soft-passthrough (GVT-g); to get an NVIDIA card to do that you need a Quadro or some nasty KVM/QEMU hacks.
Edit - I really don't know much about a mux (well, I know some), but I mean from a performance point of view. Don't know if it interferes with stuff that can cause problems, but if running through a mux the NVIDIA would behave exactly like running it directly, I would prefer to have that as an option. Would help with blind flashing and other cases where your dedicated cards are indisposed.
I'd like to experiment with a BIOS flag to turn it on/off though, to see if I could still get Xen to render onto it and dump it onto the NVIDIA like a reverse Optimus.
Last edited: Dec 25, 2015 -
Then that is all bad.
-
Spartan@HIDevolution Company Representative
Ah, if they can make it a separate boot like the Alienware 18 then it's OK, just not both GPUs together -
Yep, I remember my broken-ass 880M and Optimus.....
I definitely have no love for Optimus. -
Mine was supposed to come yesterday but FedEx failed somehow so now it's coming Monday.
*** Official Clevo P870DM/Sager NP9870-G Owner's Lounge - Phoenix has arisen! ***
Discussion in 'Sager/Clevo Reviews & Owners' Lounges' started by NordicRaven, Sep 22, 2015.