Despite not knowing much about graphics cards, here's my story. Before Christmas I wanted to get my hands on a new gaming laptop. Since we were so close to 2012, I decided to wait and see what would be coming next year. I've read a bit on blogs and websites about the new graphics cards coming up, the 600M/7000M series, but I didn't find much information about them on the NBR forums. On the other hand, people on blogs are saying they are just renamed graphics cards with a few more benefits than the previous generation (500M/6000M series).
I'm taking my time right now before getting my new laptop. Is it really worth waiting for the next generation of graphics cards?
-
The currently released versions are just renamed versions of the previous lineup. However, the new cards that are about to be released 4-5 months from now (just AMD; Nvidia will take a good year before we see theirs on the mobile market) will be much better, MUCH better...
-
It will basically make today's $1000 laptop look like an $800 one.
-
It depends on your budget. At the moment, if you get an Nvidia GTX 570M, they have a lot of overclocking headroom, which boosts them to almost GTX 580M levels of performance without having to break the bank for another $400.
I can only assume that if you have a big budget and want the latest card, those few months may be worth the wait. -
Graz'zt: Are you talking about the 28nm technology?
Rambisco: $200 less can still make a great deal; since I live in the Quebec region of Canada, it can lower the shipping fee plus the currency conversion (USD to CAD). I'm looking closely at Clevo resellers at the moment.
adantesuds: I like the GTX 570M, it looks like a nice card, but the only laptop seller that offers this card in my region is MSI. What I don't like about MSI is that they have a warranty sticker that is voided if the panel is opened... so I can't really customize the laptop or I'll lose the warranty. Still, they offer a solid machine at a nice price. -
Mechanized Menace:
Yeah, the current 7xxx series and 6xx series are just rebadged 6xxx/5xx series cards. The good stuff is coming later this year, like Graz'zt said.
-
I am in the same boat. There is a laptop with a 2670 CPU and a 580M that I could get now, or should I patiently wait until later in the year for a 6xxM, perhaps even with mobile Ivy Bridge?
-
Get them now or you will have to wait until Q2 at the very least.
-
Do you mean they will go "out of stock" and I won't be able to get my hands on one?
-
No, he means you'll be waiting for several more months. We're barely getting specific details regarding AMD's new 7xxx series for desktops, and nothing from Nvidia other than delays. The absolute soonest a 28nm product will release is Q2, but realistically it will be Q3.
-
Arggg... I am not sure I can wait until Q3 of the year to get a gaming laptop. I have been looking since November 2011. When Diablo 3 and Guild Wars 2 come out, I don't think I will be able to hold out anymore.
-
I want to at least see if anything looks promising at CES 2012. I could hold out for a few months if it's worth it. Otherwise, 580M here I come!
July/August is a looong way away (my guess for a new CPU and GPU gaming laptop). -
Looks like the CES you're talking about is coming next week. I took a look at the official site.
-
I'm not really sure, but if the desktop GPUs coming earlier this year aren't going to be considered rebadges, then I can't see any reason for doing that to their mobile counterparts. If they implement a new architecture in the desktop GPUs this year, then 'I think' and 'maybe' their mobile counterparts will be built on the same one. Again, this is just my opinion.
-
Mechanized Menace:
It's already been said that they are rebadged.
AnandTech - Introducing AMD's Radeon 7000M and NVIDIA's GeForce 600M Mobile GPUs
Same thing for NVIDIA.
-
Well, I guess it would be pointless now to wait for the new graphics cards...
-
Let's get this straight:
The only rebadged AMD cards are the low end ones, and this is nothing new, as it happens every cycle.
Also, AMD's 7900M (true 28nm tech) should be out pretty soon, as the cards on which they are based (the desktop 7800 series) are being released in February. A safe guesstimate is March/April.
Check the chart:
-
Yep, 28nm. Nvidia will be a little slower getting to 28nm, but once they are on the market it will be crazy. Also, I've read a lot that Maxwell will be a breakthrough in the GPU market; that is why I bought my laptop last semester. I will wait until Maxwell for my next laptop (well, hopefully... hardware greed).
-
Good decision. However, if you are waiting for the 7970M or 680M, then don't wait: the 7970M has at least another 4-5 months and the 680M another year before hitting the market. If you are waiting for the Intel 520 SSD, Ivy Bridge (which will also take some time to hit the market), or better RAM, then wait.
-
The Maxwell update will still only be incremental, nothing crazy like 3x the performance or anything like that. Though they probably have the tech to do that currently in their pockets, there's no reason for them to release it straight away when they have the option of rolling it out incrementally and making people pay to upgrade each new generation. You have to remember they're ultimately out there for our money, and they make more this way.
Think about it: Intel has tech at least two generations ahead of our current platforms, 80-core processors and the like. Why do they not release it now? Because they make more money if they don't. The only way Nvidia, Intel, or any other company will go all out on a new generation of chips is if they feel seriously threatened by another company. It happens every single generation: everyone says "oh, the next generation will be awesome," and then it turns out to be the same old story, only incremental. It's the boy who cried wolf; why should we believe them this time around? -
It's all business, and by doing this they gain way better software updates too... you would think
Oh god, they have already planned what to do in 2015, if not 2016; everything is being worked on.
-
Nvidia shipping Kepler to notebook manufacturers -
gtx 680m here we come!!!
-
April to June is going to be three amazing months. First of all, Ivy Bridge comes out. Second, AMD and Nvidia come out with their true next-gen GPUs. Third, in May or June, AMD comes out with Trinity APUs for notebooks: 50% better GPU performance and 25% better CPU performance compared to Llano. Fourth, a new revision of Optimus from Nvidia is released. Fifth, all the OEMs follow with exciting new notebooks with new designs.
And I am in the market for a new notebook replacement...
OMG -
Man, I don't want to discourage you, but they will be crazy expensive at first.
still this doesn't mean I won't get a 680m
man we are burning money on this
-
Machine with basic quad core and GTX 680M will be ~$2.4k, just like every generation. Same machine with 7970M will be $2k, like every other generation.
-
I'm hungry, hungry for new tech. My current laptop is worn out and I'm sick of it. And money is no issue here. But you're right, money is going to fly.
But I think maybe a GTX 670M/675M could be enough for me, perhaps even a 660M, although I doubt it. Anyone want to guess the performance gain of the 660M over the 560M? -
I can only state what I've heard, while mixing in my own special brand of baseless speculation, and suggest you put no weight on it.
At a meeting in Taiwan, where some pertinent info on the Clevo refresh was revealed (such as support for the next GPUs), it was also stated that the GTX 660M would be nearly as fast as the stock GTX 580M. While many in the Clevo forum were wowed, I was extremely skeptical. Once it was shown that the card is only 128-bit, that sealed it for me.
But then this unconfirmed chart came out a few days ago:
Hmm... 384 shaders? Check. Way less than 100W? Check (it says 75W below the watermark, and that's at listed clocks which simply swallow the GTX 580M's).
Could GK107 = GTX 660M? It seems like most of the specs line up perfectly.
So assuming everything I've typed is true, relative to the 560M it's possibly close to what the 580M is now. It just has to make up for the difference in memory bandwidth, and we haven't yet heard how Kepler deals with that. -
pretty sure that chart is old and extremely fake
-
Is it? My friend just posted it on the 23rd, so I thought it was new.
Then I'll go back to being just crazy, instead of crazy and misguided. Anyway, it's rumored to be close to the 580M, which I believe is somewhere between impossible and unlikely on a 128-bit bus. -
Now they measure 3DMark 11 with the Extreme profile? But for the best card to consume 300W...
-
If this chart is not fake, then I am George W. Bush... Dude, do you honestly think that they will double the number of CUDA cores compared to the previous generation? Come on, put some sense into it...
-
I only commented on the GK107 portion for a reason, and never passed it off as legit.
I thought of the chart when Cloudfire mentioned 560M vs 660M, because 384 shaders and 128-bit GDDR5 would be stupid easy at 28nm. Heck Nvidia could just port the GTX 580M over and come damn close to that 75W TDP, with optimizations. I brought that up because I just don't get this conservative expectation of the high-end mobile GPUs, which seems to pervade NBR, as neither of the manufacturers said they're going to focus on power consumption over raw performance; instead they are both going after performance per watt.
So why shouldn't one expect that kind of power from the 660M? -
Hmm, I don't think the GTX 660 will be 128-bit. I think there is a huge misunderstanding of all the rumours that have been floating around lately. The GTX 660 will be GK106, not GK107. GK106 has a 256-bit memory controller, and GK107 a 128-bit one. According to this ancient slide, the GT 520 and GTX 550 Ti will be GK107 and 128-bit. They are the first Kepler GPUs to arrive. Then, slightly after, the GTX 660 with GK106 will arrive, with a 256-bit memory controller. The GTX 560 Ti was also 256-bit, so why on earth would Nvidia make its successor any worse?
I do wonder where the rumour that the GTX 660 will have a 128-bit bus started.
PS: The GTX 560M is 192-bit; I expect nothing less from the 660M.
Nvidia GK106 and GK107 28nm Kepler GPUs May Arrive Soon - Softpedia -
We got the 600M memory interface details from Softpedia:
Also, Lenovo lists the upcoming 660M-equipped IdeaPad Y580 as having 2GB of dedicated VRAM, so off the top we know it is either 128-bit or 256-bit (a 192-bit bus would naturally pair with 1.5GB or 3GB). The latter is very doubtful, and 192-bit is completely removed from the discussion.
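If you want to see why, here's a napkin sketch; the 256 MB-per-chip figure is an assumption based on typical 2 Gbit GDDR5 chips, and clamshell mode doubles the total:

# Each GDDR5 chip has a 32-bit interface, so the bus width sets the chip count,
# and the chip count sets which VRAM totals come out "naturally".
CHIP_MB = 256  # assumed 2 Gbit chips; clamshell mode doubles the total
for bus_bits in (128, 192, 256):
    chips = bus_bits // 32
    print(f"{bus_bits}-bit -> {chips} chips -> {chips * CHIP_MB} MB or {chips * CHIP_MB * 2} MB")
# 128-bit -> 1 GB or 2 GB, 192-bit -> 1.5 GB or 3 GB, 256-bit -> 2 GB or 4 GB

A 2GB card therefore points to 128-bit or 256-bit, not 192-bit.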
The timeline you posted pretty much spells out that the GTX 550 Ti (the desktop counterpart of the GTX 560M) will be replaced by GK107, which is where the 560M's replacement will be found. The 560 Ti's successor is GK106, and it maintains 256-bit.
Dropping from 192-bit to 128-bit means nothing in and of itself, without us knowing how much higher the core count and clocks are going to be.
I'm going on record to say that the 660M will beat the 560M by 30 to 40 percent, easily. One should expect each card to do at least that much over the 500M it is replacing. There's going to be so much more headroom, and Nvidia has never played it conservatively. -
Man, you are optimistic.
I do hope it will be like that; GTX 680M, here we come.
Still, I think it will be a 20% increase. Have we ever even had a 30% increase over the previous generation?
BTW, why are we looking at desktop architectures? -
Maybe there are two versions, like with the 560M? I've seen benchmarks of the 128-bit 560M vs the 192-bit 560M, and it ain't pretty...
If not, then what the hell?! What is their problem with including a 192-bit controller? They did it with the 560M, why not the 660M? If the 660M isn't the 560M's successor, will the 670M be it? As if their naming system isn't confusing enough?! Are you suggesting that they increase the clock frequency instead to "make up for it"? How is that true to their commitment to reducing heat in notebooks, etc.? -
Well, that's exactly what Nvidia did with the jump from the 360M to the 460M.
-
On a second note, the next Asus G-series notebook will have the 670M, not the 660M like they usually do. Perhaps there is a reason...
-
The 300 series was never a real series in the first place. It was just a line of rebrands, to the extent that the 200 series outperformed it. It was the same with the 100 series.
-
1. How about the increases between 280M -> 480M -> 580M? Or 3870M -> 4870M -> 5870M?
2. We look at desktop cards, because they are eventually downclocked and turned into notebook cards.
I'm just saying that going from 192-bit to 128-bit isn't automatically to be viewed as a huge downgrade.
I mean, would you choose the GTX 560M's 192 cores, 192-bit, 775/1550/625 over a completely hypothetical 660M's 336 cores, 128-bit, 850/1700/900?
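Quick napkin math on the bandwidth side (the 660M line is completely made up, remember; GDDR5 transfers 4x per base memory clock):

# Hypothetical comparison only -- the 660M figures above are invented for illustration.
def gddr5_bandwidth_gb_s(bus_width_bits, mem_clock_mhz):
    # bytes per transfer * effective transfer rate (GDDR5 = 4x the base clock), in GB/s
    return (bus_width_bits / 8) * (mem_clock_mhz * 4) / 1000

print(gddr5_bandwidth_gb_s(192, 625))  # GTX 560M: 60.0 GB/s
print(gddr5_bandwidth_gb_s(128, 900))  # hypothetical 660M: 57.6 GB/s

So on paper the narrower bus gives up almost nothing in bandwidth, while the hypothetical card carries 75% more cores at higher clocks. -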
oh NOOOO!!! gtx 580m is NOT a downclocked gtx 580!!!!!!! it is a downclocked gtx 560...
-
Ti...
Man, all this talk makes me want them to come out already so I can buy them, no matter the cost. -
thanks for the correction
ME TOO
I am so dying to lay my hands on a gtx 680m
-
Am I the only one hoping for a better Optimus solution that works on non-embedded GPUs?
-
I totally wasn't saying that. I was just explaining why we're discussing desktop chips in general.
Well Nvidia is revealing Optimus 2.0 soon.
Clevo has already confirmed it will be implemented in its high-end machines, and if they've finally jumped on the bandwagon, something must be improved. -
I can't afford to upgrade to a new laptop any time soon, at least not before I graduate from uni and get a job (2013). Easier said than done! So my little old 5650 is going to have to get me through another 1-2 years with my native res of 1080p - something tells me it won't be a pretty sight. But by that time I anticipate the 8000/700 series to be well established. Or even 9000/800???
-
So the question is: will the next top mobile GPU from Nvidia be a downclocked 570 or 580 GTX?
-
I don't know how this all fits together, but what I do like is less heat with the same performance. If that 192-bit controller could help push down the core and shader speeds, instead of a 128-bit one with higher clocks, it would have been great. Say you have two models of a GPU with the same performance:
GPU #1: 128-bit, 300 cores, 900/1500
GPU #2: 192-bit, 300 cores, 700/1450
I don't even know if this is how it works, but I would pick whichever scores best on performance per watt or performance per TDP.
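Something like this is how I'd rank them, with both the performance score and the TDPs completely made up since we have no real numbers:

# Everything here is assumed for illustration; the point is the metric, not the values.
same_perf_score = 3000  # e.g. a 3DMark 11-style score, assumed equal for both models
assumed_tdp_w = {
    "GPU #1 (128-bit, 300 cores, 900/1500)": 55,
    "GPU #2 (192-bit, 300 cores, 700/1450)": 50,
}
for name, tdp in assumed_tdp_w.items():
    print(name, "->", round(same_perf_score / tdp, 1), "points per watt")

Whichever gets more points per watt at the same performance is the one I'd take. -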
Here is something really interesting:
Here is the first screenshot of the Nvidia GT 650M. It is probably Kepler, and not a rebrand like the lower 600 series cards.
Then we have the Nvidia GT 630M, which is a Fermi GPU. Pay attention to the clocks and compare them with the GT 650M...
Quote from the article:
Samsung Laptops Show New NVIDIA 600 Series Mobile GPUs - Bright Side Of News*