It was a fair point though. To me it seems the reason CPUs aren't increasing quickly in performance is that they're not increasing the number of cores - I think if they could find a way to increase core counts while retaining efficiency, that would be a good solution. I know core count on a CPU doesn't scale as well as it does on a GPU, but I reckon they're gonna need to find a way to increase CPU cores efficiently if we wanna see big jumps in CPU performance.
EDIT: although from what I know, the performance effect of adding CPU cores is also very dependent on the software and operating system being used - can it efficiently utilise those extra cores?
Robbo99999 Notebook Prophet
What's the super quad design you mention? That's got nothing to do with our increasing-CPU-cores discussion, right?
It's almost September, fellas. September 18th was when the GTX 980 came out and when we heard about the 970M and 980M, which came out in October, exactly one month later. So, if NVIDIA is going to do anything, it should be within a few weeks.
If I were NVIDIA, I would leave things as is and launch Pascal early (next year) before AMD even has a chance to blink.
But I'm not, so...speculation it is.
Back in the good ol' days, NVIDIA launched the first of the 700 series desktop GPUs - the GTX Titan - in Feb. 2013. Then they announced the remaining GPUs in April and launched them in May. I hope they return to this schedule by launching the GTX 990 this Fall, leading up to a Pascal launch in April/May with the announcement in Q1.
I don't like Fall releases. It conflicts with school.
What *I* want is for nVidia to stop launching "flagship" midrange GPUs. But of course, since people buy the flagships and then get the "big dies" when those come out... more $$ made by making people wait, I guess.
Intel has held back CPU speed since 2006 - no competition.
"no possible higher clock rate"
Intel desktop CPUs between 2007-2011 (pre 22nm) could all overclock to 4-4.5GHz. The sad fact is that Intel could have increased CPU speed each year just by upping the MHz, instead of spending billions on a Tick-Tock approach.
This has been a really dirty tactic by Intel. Back in 2005-6 Intel had less than 50% of the server market, for example (by revenue). Intel introduced 300 dollar Xeons that more or less killed SPARC, PPC, POWER and all the other fun 4000 dollar CPUs. Intel killed off PA-RISC by promising Itanic.
In 2009 Intel integrated the memory controller into the CPU, making it impossible for AMD/Nvidia to build motherboards for Intel - forcing customers to buy an Intel motherboard with crappy Intel graphics. (Intel today has a 70+% market share in PC graphics)
Instead of using die shrinks/more transistors to make faster CPUs, Intel kept using that space for its graphics (trying to kill off the dGPU). We as customers have no choice if we want an Intel: we have to pay for this crappy graphics.
With Iris Pro, Intel started charging 300 dollars extra for those CPUs. This led to the absurd situation where high-end notebooks used Iris Pro, and users who needed a dGPU had to pay extra for the dGPU - giving Intel 300 dollars extra for an Iris Pro they did not want.
Skylake is another step in Intel's plan. They removed PCIe lanes, putting them on the motherboard instead. We still have not seen an SLI/Crossfire test on a Skylake setup - Skylake is very, very bandwidth limited. (Intel wants to force users to buy Xeons/i7s)
Xeons today don't have crappy graphics - instead they have more cores! When Intel killed off the server competition, they raised Xeon prices over 300%. Today Intel charges 4400-7500 dollars for high-end Xeons: the same prices that killed the RISC market, the difference being that today Intel has no competition.
Compare this to the AMD/Nvidia graphics war. People complain about 650 dollar GPUs that are over 600mm2 / 7 billion transistors.
Intel's 18 core Xeon is under 500mm2. It costs Intel less to produce these 18 core Xeons than it costs AMD/Nvidia to produce high-end graphics chips. Still, Intel charges 7500 dollars for these, compared to 650 for AMD/Nvidia.
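The die-size comparison above can be roughly sanity-checked with the standard first-order dies-per-wafer approximation. This is only a sketch - it assumes 300mm wafers and ignores defect yield (which actually hurts the bigger GPU die even more):

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """First-order estimate: wafer area divided by die area,
    minus a correction term for partial dies lost at the wafer edge."""
    r = wafer_diameter_mm / 2
    return math.floor(
        math.pi * r**2 / die_area_mm2
        - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    )

# ~500mm2 Xeon die vs ~600mm2 high-end GPU die (sizes from the post)
print(dies_per_wafer(500))  # ~111 candidates per wafer
print(dies_per_wafer(600))  # ~90 candidates per wafer
```

So per wafer, the smaller Xeon die yields more candidates than the big GPU die - which is the point being made: the silicon cost gap doesn't come close to explaining the 7500 vs 650 dollar price gap.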
If Intel had competition, we would have seen 8-16 core Intel CPUs under 500 dollars.
If Intel had competition, they would give us users a choice: do we want to use 50% (4 core Intel) or 80% (2 core Intel) of the die area for graphics, or should Intel instead add 4-6 more cores?
It would cost Intel ZERO extra to add 2 cores and use the power envelope freed by losing the graphics to up the clocks by 500-1000MHz.
I hope I've made my point clear. Intel is abusing us.
Year after year they have killed off competition, forcing users to pay for stuff they don't want. Intel believes it has to do this because x86 is an inferior platform: CISC carries a roughly 30% die-size penalty against RISC for the same performance, so Intel has to spend 10 billion/year on process to stay 1-2 nodes ahead of everyone else.
Today it's the server market that is saving Intel - thanks to Intel killing off the competition and now overcharging, with 95%+ profit margins.
Removing PCIe lanes is Intel's latest war against gamers. Forcing us to buy Xeons.
We really need competition on x86 - something Intel hates. (Heck: Intel paid Nvidia almost a billion to cancel Nvidia's x86 code-morphing ARM project.)
Only Microsoft can stand up to Intel, by introducing its own custom ARM hardware with DirectX 12 and AMD/Nvidia support. This is the only way to stop Intel's abuse: selling sand for 1000 dollars.
Well, the 600 series launched in March, so that's even better. Close enough to what I want!
Nvidia 990M
Is it a GM200 or GM204? One thing most people have missed is that the GM200 is A1 silicon. Doing a respin to A2 would take a bit under 3 months and enable power savings. Fermi, for example, was on A3 before it was even released. Some chips even get base-layer re-spins (B series chips).
It's therefore highly believable that Nvidia is working on a GM200 A2/A3 to enable power savings / less waste of wafers = a mobile version becomes possible, including a dual-GPU 995X for desktops.
And who knows about TSMC. They say they are mass producing 16nm FinFET and that Nvidia is among the first customers. We will know more in a couple of weeks, when we find out whether Apple is using Samsung or TSMC for their A series SoC.
Remember the 20nm myth? "Can't do graphics, it's too hot"? SemiWiki disproved it.
The real reason is that Apple needs an insane number of wafers - 50-70K/month. We're talking 100% capacity of a line. If Apple uses TSMC, then Nvidia/AMD again have to take a back seat.
The high-end graphics market is under 3 million units/year. Let's say a wafer yields 100 candidates - that's 30K wafers/year. Nvidia/AMD are nothing today compared to other customers. No wonder Samsung/TSMC/GloFo all adapt their wafer baking for SoCs. (Like 16nm FinFET++)
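The wafer arithmetic in that last paragraph works out like this (a quick sketch; the 3M units/year and 100 candidates/wafer figures are the post's own rough guesses, not published numbers):

```python
# High-end graphics wafer demand vs Apple's reported wafer appetite
UNITS_PER_YEAR = 3_000_000      # high-end GPU market estimate from the post
CANDIDATES_PER_WAFER = 100      # rough die candidates per wafer

wafers_per_year = UNITS_PER_YEAR // CANDIDATES_PER_WAFER
wafers_per_month = wafers_per_year // 12

print(wafers_per_year)   # 30000 wafers/year for the whole high-end GPU market
print(wafers_per_month)  # 2500/month - vs the 50-70K/month Apple reportedly needs
```

On those numbers the entire high-end GPU market needs maybe 5% of the wafer volume Apple does, which is why the foundries tune their processes for SoCs first.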
Hey guys, the long-awaited 2015 AW18 is finally here and officially supports the 980M too. Link:
wow, ghetto much?
Sent from my Nexus 5 using Tapatalk
Skylake is coming next month: it will launch in China on September 1, and in the rest of the world on September 27.
Good. Things seem to be on track so far.
Good for them. Now I hope they issue a new BIOS for the AW18 R1 guys who upgraded to 970M and 980M, and make those cards work properly.
The AW18 will once again reign supreme in the notebook world. The MSI GT80 can go back to the 1980s where it belongs, with its horrible design.
I'm a little limited on internet right now. Yay, work wifi.
Sent from my SGH-M919V using Tapatalk
i knoooow right?! that GT80 design is sooo retro *lol*
As said above, it's a bit late to the party, and it would be dumb to think one fix is going to make up for the last two years of crap they released and put those of us who actually owned one through.
Sent from my SM-G925T using Tapatalk
I'd be interested in it personally, especially considering the depot for Canada is a stone's throw from my house, lmao. I wonder what they are gonna do with it.
Sent from my SGH-M919V using Tapatalk
As for the AW18, I'd wait and see what it can do. My guess is it'll run 980s really hot, lol.
Sent from my SGH-M919V using Tapatalk
I told them they should send it to me, as my house is the death march, with a hot room and massive CPU power requirements.
The 880M was hotter than the 980M. I doubt the AW 18 runs any hotter with Maxwell. My 980M is surprisingly cool.
Is there still a 990M?
From what I heard there is. Last I heard about it, Nvidia was supposedly handing off the TDP and how much of the chip is unlocked to manufacturers to build around and such. (source: Linus Tech Tips WAN Show)
Sent from my SGH-M919V using Tapatalk
No graphics amplifier support on the AW18?
@M0rdresh: srsly? with 980M in sli and 990M support? (based on that ebay auction...)
A GA would be silly with an 18-inch laptop. And if it does support the 990M, it's probably gonna be a special SKU - from what I'm hearing, the 990M isn't going to be MXM compliant, though rumors abound. I'm sure they could do it if they wanted to.
Sent from my SGH-M919V using Tapatalk
Damn, then what do you call all those gamers spending big bucks on mechanical keyboards and scouring Craigslist and garage sales for vintage IBM Model M planks? No school like the old school?
So I used the Alienware website's "chat with an Alienware expert" and talked with some guy. He said the 18 inch model will probably get a 2016 version.
And I spoke with Dell's customer service, who said the graphics cards are G-Sync models.
Sent from my Nexus 5 using Tapatalk
@Cloudfire I prefer retro instead
Guys, it's just an Alienware 18 "R1" with 970M and 980M. People have had these since Maxwell came out.
Acer and Lenovo with high-end notebooks, Alienware with soldered crap, MSI with a retro titan... crazy world out there. And Clevo FTW.
Sent from my Nexus 5 using Tapatalk
I'm reading that website now.
nVidia 2015 mobile speculation thread
Discussion in 'Gaming (Software and Graphics Cards)' started by Cloudfire, May 9, 2015.