2560x1600 is currently the highest supported resolution simply because, for the time being, nothing with a higher resolution exists.
If something bigger joins the game, I think it won't be hard to get a driver that supports the new resolution.
Regarding the infamous nvlddmkm.sys error: if you Google it for a bit, you'll see it's something that has plagued Nvidia cards since the GeForce 7000 days, through the 8000 and 9000 series, and up to the present.
That's why I'm inclined to go with ATI.
And what I would really want to see is a 20" laptop with double the 1920x1200 resolution (that's 3840x2400), dual GPUs, a backlit keyboard, 4 or more RAM slots, and 4 HDD bays plus an optical drive.
One more thing to note: if Apple brings the 2880x1800 display (that's double 1440x900) to market, more manufacturers will probably follow.
As of now, if I were to buy a new laptop, the MSI with the 580M and Dynaudio system would be the best choice, because it has a backlit keyboard and a ton of other goodies. Perhaps the Alienware M18x too.
But what I really want is dual GPUs with some sort of Optimus tech and a low-power GPU for use when not gaming (because I spend a lot of time just in Windows), together with 4 RAM slots and a backlit keyboard.
AND of course a display bigger than 1080p.
And I suppose it's going to be some time before we see that from Clevo, or any other manufacturer for that matter.
But as things stand, I prefer the Dynaudio-equipped MSI.
-
The M18x offers graphics switching with its AMD cards, doesn't it? I know it doesn't go higher than 1080p, but it meets your other requirements.
-
But how can I bring myself to get one when Apple is about to throw the 2880x1800 display at the world, and more than likely the rest of the industry will follow?
Not to mention that, I believe, only one port in the M18x is SATA 6Gbps, which rules out building a 2TB array from two 1TB OCZ Octane drives.
There is also a good chance the M18x R2 will get a bigger-than-1080p display. -
Will the refresh use the same fingerprint reader? If so, they're idiots.
-
I suppose so... why are they idiots?
-
The fingerprint reader is really poorly designed.
-
How long do you anticipate the wait? I was thinking of buying the Sager NP8180 because it has dual 6990Ms and the i7-2960XM (which has an unlocked multiplier, so I can overclock very high).
I'm guessing we won't see anything remotely new from Sager until April 2012. What are your thoughts? Or anybody's thoughts?
My overall goal is to play games like Skyrim and The Witcher 2 on ultra settings with anti-aliasing cranked all the way up, in 120Hz 3D. Sadly, the NP8180 doesn't offer 3D, but the NP8170 doesn't offer dual GPUs either. And my priority is that unlocked processor, so I prefer it over the i7-990x. -
Larry@LPC-Digital Company Representative
As far as the new mobile platform shipping, you may very well be right... April 2012 is a reasonable time frame.
Any model that would replace the 8180 or 8170 would not be out until then.
The only model coming out before then is the Clevo P270WM, or Sager ??.
-
Keep in mind that on the P170HM/3 you get either 3D or CPU OC, not both: they messed up the board design and don't have the confidence to enable XTU in the BIOS... they even overwrite the multipliers via a failsafe override when we hardcode them through a BIOS mod...
To cut it short:
- P170HM has CPU OC & no 3D
- P170HM3 has 3D & no CPU OC -
Yeah, this is exactly what I'm worried about with the P270WM/P270WM3.
-
I guess that's part of the XTU delay... to get things right this time...
-
Though I have to wonder, why exactly are you all worried about overclocking your CPUs on these machines? From what I see, Sandy Bridge (and by extension Ivy Bridge) is already plenty powerful in the mobile market, as it is in the desktop market. I don't get why 99% of you would need to OC the CPU on these machines, aside from bragging rights. In terms of practicality, they're pretty damn good already, don't you think?
I'm sure they can make SLI work with 3D somehow, but even without it, the also-upcoming 6xxM lineup is supposed to blow the 5xxM out of the water, and the 580M was already powerful enough to run almost any game maxed out in 3D mode (except maybe the horrendously optimized Metro 2033). I could be mistaken on that (I haven't been able to try it first hand), but I do know someone with a 560M who hasn't OC'd it, and it pretty much maxes BF3 for him, and he gets 150+ FPS in pretty much every other DirectX 9 game like CoD. So yeah, I'd say 3D Vision with a 580M wouldn't be too hard, much less with the upcoming 660/670/680M cards. What are you guys so worried about? -
For me, D2 Ultima, it's to play TES III: Morrowind with mods, because the game is single-threaded and can only use 100% of one core. So if I have a six-core processor at 2.0GHz, the game runs as if I had a single-core 2.0GHz machine. It's down to the crappy Gamebryo engine it's built on. On top of that, the engine renders everything within the field of view, even the backs of houses and rocks you cannot actually see. That's part of why it's an awful engine. So I need a fast processor to render everything in Morrowind, because it is a CPU hog, like most Bethesda Elder Scrolls games are. Those are my reasons; I'm sure other people have different ones.
-
But the slowest base speed for Ivy Bridge is 2.1GHz, and from what I know of SB/IB mobile CPUs, their speed depends on heat. One loaded core is FAR less likely to make your CPU run hot, so it should run near max Turbo clocks anyway, which ought to be what, 2.9 or 3.0GHz? And that's only for the base CPU; the extreme chips (or even the replacement for the 2760QM, which I consider the best bang for the buck) would probably run well over 3GHz. Also, if that REALLY isn't enough for you, why not edit the game to use more than one core? I had to edit Thief: Gold (the "complete edition", if you will, of Thief: The Dark Project) to run on only one of my eight cores, because running it on a machine with more than one core made it crash when loading or playing any level. This was a permanent fix, one I did not have to re-apply each time I launched the game. I'm quite sure the reverse can be done, because you can force processes in Task Manager to use more cores (or fewer) than they're designed to, so there must be a way to do it permanently. Unfortunately I don't remember how I fixed Thief: Gold, but it should still be searchable. I'm sure there's a way to make Morrowind use two, four or eight threads permanently.
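For what it's worth, core pinning doesn't have to be done by hand in Task Manager on every launch; it can be scripted. Here's a minimal sketch using only Python's standard library (note that `os.sched_setaffinity` exists on Linux only; on Windows the rough equivalent would be `start /affinity` or a third-party tool):

```python
import os

def pin_to_cores(pid, cores):
    """Restrict a process to the given CPU core indices.

    pid=0 means the calling process. Linux-only:
    os.sched_setaffinity is not available on Windows.
    """
    os.sched_setaffinity(pid, set(cores))
    # Read the mask back to confirm it took effect.
    return os.sched_getaffinity(pid)

# Pin the current process to core 0 only, the way a
# single-threaded game effectively runs anyway:
print(pin_to_cores(0, {0}))
```

One caveat: affinity only restricts which cores a process may run on; it can't make a single-threaded engine like Morrowind's actually use more of them.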
-
People are working on rewriting the engine code so that Morrowind uses more than one core. It's really hard and time-consuming, from what I've heard.
And with all the landmass and graphics-enhancing mods I use for Morrowind, I would need a 10GHz processor to render everything. People with 4GHz rigs only get between 10 and 20 fps while playing, depending on which region of the game they're in. -
3D video editing is part of my profession, and a studio-quality encode takes all night even with my 2960XM... on top of having spent 1000 bucks on an unlocked CPU that I can't OC...
-
Uh oh. If it's unlocked, why can't it be overclocked? I thought the i7-2960XM was perfect for overclocking?
-
Clevo cancelled XTU support on the 3D model because of 'problems'... before my purchase it was still available:
http://forum.notebookreview.com/sag...-how-get-1600mhz-running-xtu-not-loading.html
My reseller (OriginPC) even gave me a free upgrade from the 2920 to the 2960 because of that... but no one could get hold of an XTU BIOS for the Clevo version... if anyone has one, please PM... -
Download ThrottleStop instead. It has all overclocking options available for unlocked mobile CPUs.
-
Doesn't CUDA help? And honestly, if it takes all night while MAXING your CPU (the "maxing" part is important), I don't know how much difference a small OC would make... it's not like you could run it water-cooled. An extra 0.5GHz per core might sound like a lot (3.5 max to 4.0 max, I think), but 3.5 x 8 = 28GHz of aggregate clock already; 32 vs 28 isn't going to cut off much time. Does your encoder even use all of your CPU, though? I've never seen an encoder of any kind max out my i7-950 (though I can't say I've used them all).
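To put rough numbers on that point, here's a back-of-the-envelope sketch. It assumes the encode is fully CPU-bound and scales linearly with clock speed, which real encoders only approximate:

```python
def est_encode_hours(hours_at_base, base_ghz, oc_ghz):
    # Idealized linear scaling: runtime shrinks in
    # proportion to the clock-speed increase.
    return hours_at_base * base_ghz / oc_ghz

# An 8-hour overnight encode at 3.5GHz, overclocked to 4.0GHz:
print(est_encode_hours(8.0, 3.5, 4.0))  # 7.0
```

So even under ideal scaling, a 3.5-to-4.0GHz OC turns an 8-hour encode into a 7-hour one: still an overnight job.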
-
The i7 can only be OC'd via the BIOS... XTU writes directly into the BIOS performance tuning section... that's the only option...
It's Mainconcept Reference, Sony Blu-code and some other stuff...
The CPU is maxed, and CUDA runs at 100% for certain parts of the process...
When I placed the order, it was offered with a 4.8GHz OC covered by warranty:
http://hothardware.com/News/Origin-PC-Adds-1080p-3D-Screen-And-48GHz-CPU-To-EON17S-Laptop/ -
My thoughts on the upcoming Nvidia 6 series, based on what happened in the past and the fact that DX11.1 is coming out pretty soon, since Windows 8 will probably be released next year.
When DX11 was released, Nvidia did not make desktop 3-series GPUs; they only made a few low-end mobile GPUs, most of which were renames.
If you have been following the various threads on the Nvidia 6 series, you will know that all of the currently known 6-series parts are just rebrands. So, if my train of thought is correct, there will not be any high-end 6-series mobile cards, and there will be no desktop cards.
What we are waiting for is the 7 series, which will be totally gangsta and fully implemented Kepler (GK10X).
Also take a look at this article: GTX 780 -
I think that's not going to be the case. They would not bother with a 28nm version of only half a lineup for the 6xx series. Besides, if the 660M were as strong as the 580M (trade for trade) and that was all, it would be a waste of money to make, because the 660M's price point would replace the 560M's (as that replaced the 460M, etc.); the same performance as their flagship for that price? That is extremely unlikely. And if they're only planning to make Kepler for desktops, they're also taking a step backward in the mobile market again, because they skipped the architecture of their GTX 2xx desktop series (I don't remember its name, I only know G92b/Fermi/Kepler) and made Fermi for laptops too. They should make Kepler for laptops as well.
Anyway, another tell about that article is that the person who wrote it doesn't write proper English: there were more than three spelling/grammar mistakes in it. The only other article about this is written by the same person, too. I say we wait for more information. -
Are people really worried about them rebranding the whole line? What they're doing is renumbering/reordering, so that a GT 555M isn't slower than a GT 640M. This is done with nearly every release to avoid performance confusion among consumers.
-
Wow, the 7 series looks to be a HUGE improvement over the 5 series right now. 200% on half of the games over the 500s!?! Just outstanding.
-
I think you misunderstand what I'm saying. The currently known 600-series mobile GPUs are the 635M (rebranded 555M, 40nm), the 630M (rebranded 540M, 40nm) and the 610M (rebranded 520MX, 40nm). They are all 40nm, not 28nm. This is probably going to be all of them: there will be no 660M, and none of the mobile parts will be 28nm. They won't release 28nm until the 700 series.
AMD's Radeon 7000M and Nvidia's GeForce 600M Mobile GPUs -
Yes, that might be true. But if that's the case, it'll still come out on the same timing, just as 7xx instead of 6xx. Either way, I really don't care what it's called; all I'm concerned about is the performance replacing the old gen at the same price. For all we know, they might come in on MXM 4, to allow for far higher memory bandwidth. If I'm correct, there's a limit being approached on the MXM 3.0b boards? Oh well. Still, it'd be really great. If I could only get a machine around Q2 or Q3 2012, I'd so love the 28nm powerhouses. I'd definitely not need to upgrade for a longggggg time. And I should have 3D as well. =).
-
Take that chart with a big grain of salt: Metro: Last "Night".
-
Lol, I didn't see that. Still, regardless of whether that chart is true or not, I don't think there will be a full 600 series.
-
Well, you may be correct, but do ignore that article. I saw a similar one that was apparently leaked about AMD's 7000 series, and it showed almost the same trend of power increases. Now, I don't know if it's true or not; I believe that in this same thread DGDXGDG (or whatever his name is) gave that report from the guy who went to the tech show, and he was saying AMD's new GPUs weren't fantastic, but that Nvidia blew them away. This may only apply to the mobile market, and the desktops may be completely fine, but it still stands to reason that Nvidia's next powerhouse lineup should be out by Q2 2012 at the latest, and it should be so powerful that the current (very strong) lineup looks like nothing by comparison. AMD might pull some good stuff out of their butts too, but I'm simply not counting on it. It doesn't matter to me what AMD does, though: I use CUDA, I love PhysX, and I want a 3D screen with my machine, so Nvidia is the only option for me. I just hope it's out (or announced with a release date) by the time I get the money for the new machine.
-
Hey D2, I am DGDXGDG, aka DXG.
Of course AMD's desktop HD 7900 will blow Nvidia away...
It's 4.3 billion transistors... more than the GF100/110's 3 billion, and more than two GF104/114s' 3.9 billion.
-
But if the Kepler info is correct, then 28nm should pack a lot more than the GF100's 3 billion =P
-
Full details on the cards that matter to the mobile world, the 77xx and 78xx lines.
So hopefully it should work out that:
7870 = 7970M
7850 = 7950M
7790 = 7870M
7770 = 7850M -
To me, ATI doesn't have great driver support. But the price difference is so tempting, though :/
And it's unlikely that the 8130 model will get ATI, due to the Optimus implementation. -
Are you talking about the old Elder Scrolls game Morrowind here?
If a game is such a resource hog, I would never play it.
A good coder makes sure to optimize things as much as possible.
A good example: look at how good God of War looks and how well it runs on the PlayStation 2, check the PlayStation 2's specs, THEN look at how Morrowind runs (10-20 fps) on 4GHz rigs.
Notice any difference?
There are plenty of other games that offer a far better gaming experience than these resource hogs. I say burn them! -
Perhaps the 7970M will be a modified 7870 desktop core, that being the next-smallest GPU die (it's uncharacteristic to use your biggest desktop die for a mobile version, so they'll use the next best thing: the 245mm² die), presumably with Graphics Core Next and, if we are lucky, 4GB of memory per GPU.
So, a 7970M might be:
- 245mm² die
- 256-bit memory bus
- Slightly different clocks to fit within a 100W power envelope
- 4GB of GDDR5 memory
- Graphics Core Next
I would take two, thank you very much. -
I think the problem is that Morrowind belongs to the era before multi-core CPUs existed... I may be wrong; Morrowind came out so long ago I don't even know if multi-core processors existed back then. My memory has become too sketchy... maybe the P4 with Hyper-Threading already existed, I'm not too sure.
Some info about the upcoming refresh:
Discussion in 'Sager and Clevo' started by DGDXGDG, Oct 21, 2011.