From what I've seen, it looks like a lot of notebooks are going to be refreshed with the 860M, but prices seem like they're going to jump up about $200-$300, aside from possibly Clevo or high-end models.
It seems like the price for 770M models is down a lot - I assume to start clearing out inventory. I got a good deal today on a 750M Asus; I figure next month the only option might be the 860M, for $300 more. I'd rather get something cheap and on sale now, and get a more serious laptop towards November.
Does anybody disagree?
Will the 860M launch at a similar price to the current 750M/760M?
-
Maybe at the very beginning Maxwell-based laptops are going to be more expensive, but the manufacturing process is cheaper than the old Keplers'. I would not buy any laptop with the old architecture; they run hot and make a lot of noise. Maxwell is going to make a huge difference for laptops in a positive way...
Cloudfire and Robbo99999 like this. -
-
Robbo99999 Notebook Prophet
-
I would be surprised if there was not a premium for the performance boost. On $1500+ models, I would expect *a bit less* of a price difference, but on those the premium is already pretty much factored in, and I don't think they move many of those machines anyways, so there would be less inventory to clear.
As far as heat goes, your fan is still going to be blowing a lot to cool down that i7 4700, and until they actually develop/release slimmer laptops, we're going to have the same chassis on everything - that is, same size and weight.
Of course, this is all my speculation.
How long did it take for the 700M series to completely replace the 600M series in the market? -
PS: the G56JR got a GTX 760M and not an 860M -
Well of course the old technology (Kepler) will see a price reduction to sell out old inventory.
But speaking of the GTX 860M, I don't think it will cost much more than the GTX 770M. We must remember that the GTX 770M is a GK106, which is 221 mm² in size. The GTX 860M will be GM107, which is 148 mm². So there is less silicon involved in building the GTX 860M, and 28nm production is also cheaper now than when the GTX 770M was produced.
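Just to put rough numbers on that die-size point (napkin maths only - the 300mm wafer size and the defect density below are my own assumptions, nothing from Nvidia or TSMC):

import math

WAFER_DIAMETER_MM = 300.0   # standard 300 mm wafer, assumed
DEFECT_DENSITY = 0.002      # defects per mm^2, purely illustrative

def dies_per_wafer(die_area_mm2):
    # Classic approximation: usable wafer area minus edge loss
    d = WAFER_DIAMETER_MM
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def yield_fraction(die_area_mm2):
    # Simple Poisson yield model with the assumed defect density
    return math.exp(-DEFECT_DENSITY * die_area_mm2)

for name, area in [("GK106 / GTX 770M", 221), ("GM107 / GTX 860M?", 148)]:
    good = dies_per_wafer(area) * yield_fraction(area)
    print(name, "~", round(good), "good dies per wafer")

With those made-up yield numbers the smaller GM107 comes out to very roughly 1.5-2x as many good dies per wafer, which is why a similar or only slightly higher price for the 860M wouldn't be crazy.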
I'd say sasuke could be spot on. Maybe $50 more or something.
@StealthNinja: There is a huge difference between the N56 and G56 notebooks. The N series is the cheap mainstream line, while the G series is gaming notebooks. You can't compare them like that. sasuke256 likes this. -
I just saw Hutsady say in another thread that the 800M will drop in as replacements at the same price, but the old series will be discounted. Which is great, you two are probably right in that aspect.
And I understand more chips per wafer reduces costs - eventually. All I'm seeing here is hate on the 700-series - but I think if you want to pinch pennies and get a decent gaming laptop - or if you want something immediately, taking a 700-series on sale would be a good decision.
I guess we won't know for sure until we see how, say, a 760M compares to an 840M/850M, since I would expect the 850M-ish cards to fall in the current 760M price range. -
Actually, I meant to add - some of you might know better from sources/experience, which I would be curious to see or hear about.
-
The new king:
Wow, awesome :thumbsup:. reborn2003 and Cloudfire like this. -
-
NVIDIA GeForce GTX 750 Ti 2 GB Review | techPowerUp
but the performance is a bit "meh" -
Although it seems almost all of the news about Maxwell is geared toward low-end graphics cards, as opposed to the higher end. So might we not see such gains on the more powerful cards?
-
Robbo99999 Notebook Prophet
Wow, Maxwell is officially amazingly ridiculously good, I just can't believe it, the official reviews are out, here's one from Tom's Hardware:
Average Performance And Performance Per Watt - GeForce GTX 750 Ti Review: Maxwell Adds Performance Using Less Power
They've worked out, based on 5 games, that Maxwell offers 91% more performance per Watt than Kepler when comparing the GTX 750 with the GTX 650 Ti Boost. Using the information in their article, I worked out that if you make the same comparison against the GTX 650 Ti (non-Boost version), then Maxwell gives 125% more performance per Watt. So depending on which card you use as the baseline you get a different performance-per-Watt increase when comparing Maxwell with Kepler; suffice to say it's somewhere between a 91% and 125% increase anyway. Amazing that they've managed to achieve this on a 28nm process, without having to resort to shrinking the die. A lot of it seems to be down to increased efficiency of the architecture, in terms of the communication between the various parts of the GPU (and also within the core itself), which they say in that article results in the CUDA cores being utilised more efficiently - they don't sit around 'twiddling their thumbs' so much! (Note: not related to when you see 100% GPU usage in GPU-Z).
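If anyone wants to replay that maths, the perf-per-Watt gain is just a ratio of ratios. The fps and Watt figures below are placeholders I made up, not Tom's Hardware's data; only the formula matters:

# perf-per-Watt uplift = (perf_new / power_new) / (perf_old / power_old) - 1
def perf_per_watt_gain(perf_new, power_new, perf_old, power_old):
    return (perf_new / power_new) / (perf_old / power_old) - 1.0

# e.g. a card that roughly matches the old one's fps at about half the power
# lands in the same ballpark as the figures quoted above:
print(round(perf_per_watt_gain(60, 55, 62, 110) * 100), "% more perf per Watt")  # ~94%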
All this really raises the question of whether they even NEED to shrink down to 20nm - I don't think they do! Certainly they won't be pressured to do so - I mean, what the hell is AMD going to do about this? They must be sh*tting their pants! I still can't believe it!! Don't buy a Kepler GPU...PLEASE...for the love of God!!!!
(Cloudfire - the info in the leaks you posted must have been right! I just couldn't believe it!) Haha! Cloudfire likes this. -
-
Robbo99999 Notebook Prophet
(I quoted 91% to 125% - not the "125% or more" that you mentioned). Cloudfire likes this. -
I was referring to the article I linked above. They said up to 135% performance... It's just hard to believe, that's all. Especially since they're still 28nm chips. But I suppose you're right... I'm just tired of waiting. I want to see some game-play videos already.
reborn2003 likes this. -
Robbo99999 Notebook Prophet
(Ah, ok, understand where you're coming from when you mention 135% now!)
-
Wow, the perf/watt is amazing considering the process node. Very impressive.
Cloudfire likes this. -
The 750 Ti is remarkable. It consumes less than a 770M while slightly outperforming it, and it's a desktop component at that.
Cloudfire and Robbo99999 like this. -
-
Killerinstinct Notebook Evangelist
Sent from my LG-D800 using Tapatalk. Robbo99999 likes this. -
-
What has Cloudfire learned today that he finds very interesting:
Nvidia is concentrating on mobile GPUs with Maxwell.
GTX 750 Ti power consumption is 52W while gaming versus 104W for the GTX 650 Ti Boost. Nvidia have basically cut the power draw in half compared to Kepler. Amazing.
GTX 750 Ti temperatures are much cooler than the GTX 650 Ti Boost's. Just about the coolest result Guru3D have ever gotten from a GPU. A world of difference between Maxwell and Kepler.
Maxwell will be kickass on notebooks this year. Can't wait to see what we get. Robbo99999 and StealthNinja007 like this. -
-
Robbo99999 Notebook Prophet
Just want to put in a random post enthusing that Maxwell is going to be awesome for notebooks!!!!
20nm Maxwell is going to be even more stellar! AMD can really put Mantle where the sun don't shine - I do respect the optimisation on the software front, but really, with the hardware, I mean come on, what's AMD going to do about Maxwell? They're going to be having nightmares (waking up in cold sweats!) for the next 2 yrs!!!!!
(P.S. I still love Kepler & my GPU for what it can offer, but if you're thinking about buying a new notebook, wait for Maxwell if you can afford to wait!) Cloudfire likes this. -
Hawaii has got nothing on Maxwell, that's for sure.
Not sure if people noticed Atom Ant's post. 50% more efficient than the rest lol
Maybe AMD knew Maxwell was gonna be so efficient and that's one of the reasons Mantle was initiated?
One thing is for sure, they'd better have a wild card up their sleeve, because mobile Maxwell will be epic. We could easily have put that GTX 750 Ti in our notebooks. It has a lower TDP than the GTX 765M for goodness' sake, even when running at desktop voltage and everything. I'm inclined to say that with lower voltage and binned chips, we could get almost 3x the cores you see on the 750 Ti on a mobile chip with a TDP of 100W or less. On 28nm. Robbo99999 likes this. -
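Rough back-of-envelope on that 3x-cores idea above (the clocks and voltages below are my own guesses, nothing official): dynamic power scales roughly with cores x clock x voltage squared, so tripling the cores while dropping clock and voltage doesn't triple the power.

# Very rough dynamic-power scaling: P ~ cores * clock * V^2
# All operating points below are assumptions, not real specs.
def relative_power(cores, clock_mhz, voltage):
    return cores * clock_mhz * voltage ** 2

desktop_750ti = relative_power(640, 1020, 1.05)   # desktop-ish operating point
mobile_guess  = relative_power(1920, 800, 0.90)   # 3x cores, lower clock/volts

print(round(mobile_guess / desktop_750ti, 1), "x the 750 Ti's power")  # ~1.7x
# ~1.7x the 750 Ti's ~52W gaming draw would land somewhere around 90W.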
Robbo99999 Notebook Prophet
If NVidia are aggressive on the performance, price & marketing front of this I really don't know where AMD can even compete in any shape or form. If NVidia don't really milk Maxwell, then I can see them completely driving AMD out of business unless AMD have some major ideas coming within the next year. Maybe NVidia are kind of altruistic and will milk Maxwell slowly, thereby maximising their profits while at the same time not completely driving AMD into insignificance!! I guess if I was working for NVidia, then I would like to take the altruistic path - offer a great product, make a ton of cash out of it, but not be too aggressive & give AMD some breathing room to compete (which allows NVidia to milk Maxwell for longer).
Given the significance of the performance increase per Watt just based on the architecture change to Maxwell, I can see NVidia gradually milking this & gradually releasing ever faster chips over the next 2 yrs on this, when I bet they could easily release a Monster right now! We will see! (shudder to think what 20nm could offer if they try & push for an early release on that - my bet is that they won't push for that!). (Unless it becomes cheaper to manufacture on 20nm - but I don't know when that will happen). -
God the Nvidia dik-riding here is starting to get ridickulous.
-
Robbo99999 Notebook Prophet
ThePerfectStorm likes this. -
I was equally excited back when AMD had impressive perf-per-Watt with the 4670 desktop card. ThePerfectStorm likes this. -
-
I agree, fanboyism is a bit odd for video cards. I'll take whatever offers the best bang for my buck and overall best stability and support (especially in the way of drivers). As of late it's been nVidia. I just hope that AMD has a solid answer to Maxwell.
hfm, 3g6 and ThePerfectStorm like this. -
Robbo99999 Notebook Prophet
(I don't think there's any fanboyism in this thread, not from me anyway, I'm just enthusiastic about what NVidia have achieved, and wondering where on earth that leaves AMD. If AMD had done something equally impressive then I would acknowledge that too.)
HTWingNut, hfm and ThePerfectStorm like this. -
I've only ever owned NVIDIA products since joining this community in 2012. Before that, I never owned a computer specifically for gaming. (Built gaming desktops, but never owned one.) So, I suppose you can say I'm an "NVIDIA fanboy," but that doesn't mean I wouldn't be willing to purchase AMD products, if they'd just get proper driver support.
On to another topic unrelated to fanboyism: Has anyone else read about the new GTX Titan Black? I find it a little ridiculous. They re-branded a 780 Ti, overclocked it and doubled the vRAM, and now expect people to pay another $400? I suppose they want to keep the Titan around for gaming, huh? A lot of people I know think of the Titan as a developer's card, whereas the 780/780 Ti would be the ideal gamer's card. Whatever floats your boat...
There's too much re-branding going around these days. -
It's not a rebranded 780 Ti though, it's a fully enabled GK110 with better DP compute performance.
-
It's not? I swear the page I read about it said it was, and all of the specs looked the same except the vRAM. Maybe I compared the wrong GPUs. According to this article: Nvidia GeForce GTX Titan Black Has Fully-enabled GK110 Core, "... gaming results would be almost identical to that of the GTX 780 Ti."
-
From the article you linked: "Titan cards carry double the amount of graphics memory, and have a much higher double-point precision performance"
It's not very useful for gamers, but it does benefit certain types of scientific computing a lot. But yeah, the 780 Ti is a much better value for games and has pretty much the same gaming performance (before you run into the vRAM wall). -
dumitrumitu24 Notebook Evangelist
When are the first Maxwell cards for notebooks coming??? Or will we just get a rebadge of previous versions this year?
-
Beamed from my G2 Tricorder -
Just go through the latest drivers from notebook OEMs and you will get a clue that they are very near.
LaptopVideo2Go: NVIDIA & Laptop News
FYI:
Gigabyte only have 700 series listed
Clevo have all listed, GT 840M, GTX 850M, GTX 860M, 870M, 880M.
Asus too
HP have GT 840M > GTX 860M which makes sense because they do not offer the high end
MSI not mentioned yet
GT 840M, GTX 850M and GTX 860M are the only Maxwell cards this round -
Top end envy perhaps? -
HP Envy 15 is def getting GT 840M. Read here HP mobile driver 332.33 - News - LaptopVideo2Go Forums
I don't know what gets the 860M, but that GPU will maybe have a TDP of around 40W, so it can be used in just about anything -
-
GTX 850M will def have DDR3. Source.
But like all Nvidia GPUs, there might be some company that makes a GTX 850M with GDDR5 too.
I still think GTX 850M will have 512 cores while GTX 860M will have 640 cores (like GTX 750 and GTX 750 Ti), but Nvidia can go the easy way and separate them by just using DDR3 on one and GDDR5 on the other while both have the same core count. -
-
GDDR5: 2800 MHz
It shows a 20% difference in 3DMark 11, but in real life it's closer to 30-40%.
Every card with DDR3 is a waste of money to me, since adding GDDR5 is always a huge boost - more effective than higher clocks or more shaders (because a DDR3 card lacks the memory bandwidth).
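To put numbers on why the memory matters so much (a 128-bit bus is assumed here since that's typical for these mid-range mobile chips, and the data rates are just illustrative):

# Peak memory bandwidth = effective data rate (MT/s) * bus width (bits) / 8
def bandwidth_gb_s(effective_mt_s, bus_width_bits=128):
    return effective_mt_s * bus_width_bits / 8 / 1000

print("DDR3-2000 :", bandwidth_gb_s(2000), "GB/s")   # 32.0 GB/s
print("GDDR5-5000:", bandwidth_gb_s(5000), "GB/s")   # 80.0 GB/s

Same GPU core and same bus width, yet the GDDR5 version has well over double the bandwidth, which is why the DDR3 variants choke at higher resolutions.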
I have a Mobility 5830; it has 800 shaders (not comparable, just saying) and DDR3 @ 1600 MHz. When I overclock the memory by 20%, it translates into almost exactly a 20% performance increase; if I did the same by upping the core frequency, I'd maybe get 0-5%. So never DDR3 for me again; if you want to game in full HD, the difference between the two is even bigger. 3g6 likes this. -
I wonder then why they primarily build the 850M with DDR3? Is it that much cheaper than GDDR5? Any info on whether the 840M is also based on GM107 or the smaller GM108?