There has also been a listing of an Asus ROG G751JY-T7054H model in France; the original link is this.
It has since been pulled, but you can see the configuration in the description. The price was around 2,500 EUR.
Edit: The page is still available in Google's cache, though without the price. See here.
-
What is the difference in temperature between those CPUs (same rig)? I can't find anything..
Edit:
4710HQ
4860HQ
4980HQ -
i7-4810 = 2.8-3.8 GHz
i7-4860 = 2.4-3.6 GHz
Going from the i7-4710 to the i7-4810 is probably a slightly larger gap than going from the i7-4810 to the i7-4980.
Be careful though: the numbering systems are very misleading. Higher is not always better, and sometimes you pay a ton more for one chip than another just because it has a better integrated GPU, which is near useless if you have a dedicated GPU.
EDIT: those are all 47W TDP; some may run hotter or cooler, but they are all very close. Also, even if a 4810 were to run hotter at full load than a 4710, it would likely run cooler than a 4710 at an equal workload, so it will be very dependent on the exact usage.
If you need a cooler CPU, the 4702 and 4712 are both 37W TDP; they do run at lower clock speeds though. That should not be a problem in gaming unless maybe you have an SLI system, since very few games bottleneck on the CPU.
On the other end there is the 4930MX, which is 57W TDP; it should be even a bit faster than the 4980HQ but will also run hotter.
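To line those chips up at a glance, here is a tiny comparison script using Intel's published base/turbo clocks and TDPs for the models named above (a rough sketch for eyeballing the gaps, not a benchmark):

```python
# Published specs of the mobile Haswell chips discussed above
# (base/turbo clocks in GHz, TDP in watts).
cpus = {
    "i7-4710HQ": {"base": 2.5, "turbo": 3.5, "tdp": 47},
    "i7-4810MQ": {"base": 2.8, "turbo": 3.8, "tdp": 47},
    "i7-4860HQ": {"base": 2.4, "turbo": 3.6, "tdp": 47},
    "i7-4980HQ": {"base": 2.8, "turbo": 4.0, "tdp": 47},
    "i7-4930MX": {"base": 3.0, "turbo": 3.9, "tdp": 57},
}

baseline = cpus["i7-4710HQ"]["turbo"]
for name, c in cpus.items():
    # Percent max-turbo advantage over the 4710HQ.
    gain = (c["turbo"] / baseline - 1) * 100
    print(f"{name}: {c['base']}-{c['turbo']} GHz, {c['tdp']}W ({gain:+.0f}% turbo)")
```
-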
You guys sure make a CPU more complicated than it really is
@Ethrem: How do you know the 4980HQ will throttle? The CPU was part of the Haswell Refresh. All the refreshed parts get a small speed bump but keep the same TDP. Isn't that part of maturing silicon from Intel?
@Ningyo: I think he meant 4710HQ because we talked about it yesterday.
I agree with what you say though. -
And what do you think about the MQ version, like the 4710MQ?
-
I think throttling is more a function of how good the cooling is than of the CPU model, especially since you can use XTU to set power limits.
-
-=$tR|k3r=- Notebook Virtuoso
GEE WIZ! For the 'GTX 980M / 970M Maxwell Officially announced' (still waiting for that one) thread, there is an awful lot of CPU 'dam di dudu du du'-ing going on here.
-
There is only a very small gap in performance between the 4710HQ and the 4810HQ.
There is only a very small gap in performance between the 4710HQ and the 4860HQ.
There is only a very small gap in performance between the 4810HQ and the 4860HQ.
There is a significant gap in performance between the 4710HQ and the 4980HQ. -
What you need to know about the CPU is to make sure it doesn't cause a bottleneck, meaning it is too slow to keep up with the GPU, so the GPU ends up waiting on the work the CPU is supposed to do and is held back from doing its own. That is all you need to worry about. Same with RAM. You just want a balance that doesn't cause a bottleneck. People who need CPU and RAM power use it for specific programs that demand it. I really do not know what kind of people those are, but if you do not know you need it, then you do not need it.
I knew another person who has been in the computer business for 30 years. He said he bought a Pentium for $1000 when it came out. It was supposed to be a revolutionary thing. After 6 months he sold it for $200. But what happens if you buy a CPU for $200 that gives you the same gaming performance as the $1000 one? You do not lose a lot of money. I know from experience. Form fits function. Get what you need. You are a gamer, I presume. The CPU will not make that experience better.
The CPU can help if the game is CPU intensive and uses multiple cores, like Crysis and BF4, but you will be able to get over 60 FPS easily with the 900 series, plus there are no 6+ core mobile CPUs. Those games use hyperthreading and other CPU features. Not a big deal though. -
My 4940MX will do 3.9GHz @ 89C but will only sustain that speed if I jack up the TDP, something you cannot do with non-MX chips. It takes 70W for my chip to sustain 3.9GHz, but at 47W it will do 3.6GHz on all four cores, 3.7GHz on 3, and 3.8GHz on 2, as long as I have a -85mV undervolt as well.
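Those numbers fit the usual first-order dynamic power model, P ≈ C·V²·f: power grows linearly with clock speed but with the square of voltage, which is why a -85mV undervolt buys so much clock headroom under a fixed 47W cap. A toy illustration (the capacitance constant and voltages below are made-up placeholders, not measurements from a 4940MX):

```python
# First-order dynamic power model: P = C * V^2 * f.
# C_EFF and both voltages are illustrative placeholders, NOT real
# 4940MX telemetry; C_EFF is chosen so the stock figure lands near 47W.
def dynamic_power(c_eff, volts, freq_ghz):
    return c_eff * volts**2 * freq_ghz

C_EFF = 12.0          # arbitrary effective-capacitance constant
V_STOCK = 1.05        # hypothetical stock voltage at 3.6 GHz
V_UNDERVOLT = 0.965   # hypothetical voltage after a -85 mV offset

p_stock = dynamic_power(C_EFF, V_STOCK, 3.6)
p_uv = dynamic_power(C_EFF, V_UNDERVOLT, 3.6)
print(f"stock:     {p_stock:.1f} W")
print(f"undervolt: {p_uv:.1f} W ({(1 - p_uv / p_stock) * 100:.0f}% less)")
```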
-
There's thermal throttling, then there's TDP throttling. If that 4980HQ is still pegged at 47W (57W turbo), it may throttle due to the TDP cap depending on load.
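Put differently: even with perfect cooling, the firmware pulls clocks down as soon as sustained package power would exceed the PL1 cap. A toy sketch of that control loop (every constant here is an illustrative guess, not real 4980HQ behavior):

```python
# Toy PL1 (sustained power limit) throttling loop. All constants are
# illustrative guesses, not measured 4980HQ figures.
PL1 = 47.0             # sustained package power cap, watts
IDLE_W = 20.0          # made-up uncore/base power floor
W_PER_100MHZ = 0.8     # made-up marginal power per 100 MHz of clock

def settled_clock(demand_ghz):
    """Step the clock down 100 MHz at a time until power fits under PL1."""
    clock = demand_ghz
    while IDLE_W + clock * 10 * W_PER_100MHZ > PL1 and clock > 0.8:
        clock -= 0.1
    return round(clock, 1)

print(settled_clock(4.0))  # a heavy all-core load settles at ~3.3 GHz
```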
-
Does anyone think Broadwell-H at 47W will run cooler than Haswell? It would be great to get overall temps down..
-
Ok let me clarify how CPU bottlenecking in games works.
First the CPU calculates what will be in a frame; this can include things like NPC movement, movement of the camera (player) through the world, whether events are occurring, etc. (sometimes the parts of this that require a physics engine are partially handled by the GPU, depending on the game).
Then it sends this sort of sketch of the frame to the GPU, which renders all the objects and figures out the exact final picture to display on the screen.
The GPU then sends this picture to the screen, which goes ahead and displays it.
_____________________________________________________________________________________________
Now let's pretend we have a CPU that is creating 70 frames every second and sending them to the GPU,
and we have a GPU that is rendering 50 FPS to the screen on ultra settings,
and a screen that has a max refresh rate of 60 Hz.
In this case we get essentially the lowest of these speeds, which is the 50 FPS rendered by the GPU.
Now if we drop the settings to, say, high and the GPU renders 65 FPS, we get a bottleneck at the screen: since it only updates at 60 Hz, we still only get 60 FPS even though the GPU and CPU can handle more.
Now let's say we switch the graphics to medium so the GPU renders 90 FPS, and give it a 120 Hz monitor. Now the 70 FPS the CPU is sending to the GPU is the bottleneck, and we end up with only 70 FPS.
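The whole pipeline reduces to a min() over the three stages; here are those exact scenarios as a minimal sketch:

```python
# Effective FPS is bounded by the slowest stage of the pipeline:
# CPU frame setup -> GPU rendering -> display refresh.
def effective_fps(cpu_fps, gpu_fps, refresh_hz):
    return min(cpu_fps, gpu_fps, refresh_hz)

print(effective_fps(70, 50, 60))   # ultra:  GPU-bound     -> 50
print(effective_fps(70, 65, 60))   # high:   display-bound -> 60
print(effective_fps(70, 90, 120))  # medium: CPU-bound     -> 70
```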
___________________________________________________________________________________
As of right now I am pretty certain NO i7-4xxx CPU will bottleneck (throttle) any modern game to under 45 FPS, most games that will be more like 90-300 fps. As you go up into the more powerful ones like the i7-4810 you likely will not find a single game that bottlenecks on the CPU at under 60 FPS.
This means you really do not need to worry about a CPU bottleneck UNLESS you have a 120Hz+ monitor and at least 970m SLI, OR you have a 120Hz+ VR headset whose graphics settings you may drop to maintain 120+ FPS.
_______________________________________________________________________________
To put things into perspective, an i7-3610QM (less powerful than ANY i7-4xxx CPU) probably causes bottlenecking in the following games (out of about 30 titles):
Thief 2014 (probably limited to about 45-50 FPS)
Assassin's Creed IV: Black Flag (about 42-45 FPS)
Company of Heroes 2 (about 51-53 FPS)
So if you paired it up with 880M SLI you might lose a few FPS on ultra settings.
TLDR: unless you KNOW you need a better CPU, just don't worry about it. -
TSMC builds world’s first 32-core networking chip using 16nm FinFET process technology
Looks like TSMC is having an early Thanksgiving.
TSMC did state 16nm FinFET mass production by Q4 2015; it seems like their R&D process is now turning over to a manufacturing stage. This is very important news, as it suggests Nvidia would probably switch over to 16nm rather than porting to 20nm (the Wccftech rumor concerning the GM200 leak).
This sets a precedent for AMD to follow suit as well. We also know Samsung is heavily investing in 14nm/16nm. Looks like Q2-Q4 2016 could be an interesting window for a node change.
Certainly sounds nice on paper. Really excited for a high-power SoC implementation soon. -
I don't see Nvidia riding out the entirety of 2015 on 28nm. They're a business and they need constant product to make money. 20nm cards are their only choice.
-
Translating Maxwell to a new node would be a very expensive affair when 16nm is around the corner, IMO.
Business sense would dictate milking the GM204 for 1-2 years until 16nm Pascal releases, just like Kepler was milked for 3 years - oops! GM204 is now far superior to any Kepler or Kepler refresh cards. Further tailoring and optimization can expand Maxwell 2.0 performance. -
Also, could Q2-Q4 2015 be the interesting year for a node change, not 2016?
Because waiting till the second half of 2016 or later seems like a long wait. -
-
The 4900MQ only costs $200 more than the 4700MQ, and the 4930/4940MX is unlocked, so there's a lot more performance to be gained over stock if your laptop's cooling is up to snuff and you don't get a dud. 10% could also be the difference between noticeable stutter and no stutter, especially if you're under 60 FPS.
Did I recommend spending $1000 on an MX CPU? No, so please don't put words in my mouth. I was simply pointing out facts, and the facts are that a more powerful CPU does give more frames, especially in CPU-bound games. Whether it's worth the extra expense is up to the individual.
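Frame-time arithmetic shows why that ~10% matters more the further you are below 60 FPS; a quick sketch:

```python
# Frame time in milliseconds at a given FPS, and the cost of a CPU
# that is (hypothetically) 10% slower.
def frame_ms(fps):
    return 1000.0 / fps

for fps in (120, 60, 45):
    slower = fps * 0.9
    print(f"{fps:>3} FPS = {frame_ms(fps):4.1f} ms/frame -> "
          f"{slower:.0f} FPS = {frame_ms(slower):.1f} ms/frame "
          f"(+{frame_ms(slower) - frame_ms(fps):.1f} ms each frame)")
```

The same 10% deficit costs under 1 ms per frame at 120 FPS but nearly 2 ms at 60 FPS, which is where stutter becomes most noticeable.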
-
980M spec info is up on Wccftech, but that 2000+ CUDA core count is probably fake. 1660 sounds more realistic; time to wait for 20nm.
-
I will trade my 4800MQ for your MX, Ethrem :laugh:
-
"As of right now I am pretty certain NO i7-4xxx CPU will bottleneck (throttle) any modern game to under 45 FPS, most games that will be more like 90-300 fps. As you go up into the more powerful ones like the i7-4810 you likely will not find a single game that bottlenecks on the CPU at under 60 FPS."
On the second weakest i7-4xxx CPU:
Skyrim bottlenecks to 58 FPS (or 63 on the MSI GT70 they list)
Starcraft II: HotS bottlenecks to 55 FPS
Grid II bottlenecks to 62 FPS
Note all 3 of those are well over 45 FPS, and even a 4710 would likely push them over 60 FPS.
Also, since the FPS went up for all 3 at the lower graphics settings, that likely means you could change a few settings to regain some of the FPS; the games are probably not optimized to use GPU physics or such. In all likelihood you can manually add a line to the config files to fix that - I know you can on Skyrim at the least. Either way, any i7-4xxx CPU should be able to handle a 60Hz screen just fine.
Edit: really, that counted as foul language? You really need to (unusable synonym for making small adjustments) your filters. -
I too want Nvidia's newest Maxwell 4910MQ with a 57W TDP, 8MB of L3 VRAM, PCI-Express 3.0 already integrated into the GPU, and support for up to 32GB of normal RAM. Oh, I forgot, it must be a 4 SMM core model with an Iris Pro IGP for maximum performance!
I'm sure Age of Empires II will run 23% better on it
EDIT: This thread is becoming like a drunken driver on a highway, drifting from one lane to the other.
-
OK, this just in. Behold:
As we move from the PS3/Xbox 360 (512MB of RAM) to the Xbox One/PS4 (8GB of RAM, only 5.5GB currently usable for games), the jump in VRAM requirements is getting higher and higher.
I hope the 980M has 8GB of VRAM; getting a 4GB version (if there were one) WILL hold you back. According to the developers the console version is equal to HIGH, but the PC version has ultra, so there is that.
And lol at the people who bought the recent desktop 970/980; I always said 4GB is not enough, especially once all games move away from the PS3/Xbox 360.
6GB for 1080p ultra will probably be 7GB at 2560x1080. For me, my next GPU must have 8GB of VRAM, no questions asked. -
I think that screenshot is a serious problem. vRAM usage is going through the roof and the looks are not compensating. I don't see why a game has to use more than 4GB of vRAM for any reason at anything less than 4K, and even at 4K it shouldn't NEED a whole lot more. This looks more like devs who can't code and are complacent because there is a truckton of free memory (and has been for years now) and they're not constrained to 512MB on the X360 anymore.
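For scale, the render targets themselves are a tiny slice of those multi-gigabyte numbers. A back-of-the-envelope sketch (32-bit color plus a depth buffer, triple buffered, no MSAA; textures, geometry, and driver overhead all come on top, so this is deliberately simplified):

```python
# Rough framebuffer cost: 4 bytes/px color + 4 bytes/px depth/stencil.
# Everything else a game keeps in VRAM (textures, meshes) is extra.
def framebuffer_mb(width, height, buffers=3):  # e.g. triple buffering
    bytes_per_px = 4 + 4
    return width * height * bytes_per_px * buffers / 2**20

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: ~{framebuffer_mb(w, h):.0f} MB")
```

Even at 4K that is well under 200 MB, so resolution alone cannot explain 4-6GB of usage; texture assets and caching are where the rest goes.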
-
The ultra textures are supposed to be superior to the consoles', and consoles already have 6GB usable as VRAM (2GB for the OS); with the whole to-the-metal thing plus optimization, consoles can actually make better use of their power/VRAM.
I expect 8GB to become standard within a year for NEXT-gen-only games (the upcoming Assassin's Creed Unity, Batman: Arkham Knight, etc.).
Watch Dogs said it wanted 3GB, but in reality you need 4GB. 3GB was the minimum, and even without turning on AA, at max settings + high textures the game was a mess (glitches, stutters, etc.) unless you have 4GB of VRAM or reduce textures.
This was Ubisoft's response on Twitter:
Watch Dogs can use 3+ GB of RAM on NG consoles for graphics, your PC GPU needs enough VRAM for ultra options due to the lack of unified mem
If you experience lag/stutter on a fast PC, try to lower one of those settings to reduce the GPU VRAM usage: texture quality, AA, resolution
Making an open world run on NG & CG consoles + supporting PC is an incredibly complex task, the team did a fantastic job. Congrats guys! : D -
Except Watch Dogs on PC was downgraded and optimized like sh*t, everyone knows this. Sure, get as much VRAM as possible if all you're gonna play are crap console ports. Check the Steam Hardware and Software Survey: 1GB of VRAM is by far the most prevalent. Smart PC devs develop and optimize for as large a target audience as possible; they're not gonna cannibalize sales and reputation by catering to some niche crowd with absurd amounts of VRAM. It's only the inept and frankly disinterested companies like Ubisoft that don't care, since the majority of their sales come from consoles anyway.
-
There are 3 companies just posted here that require 4GB or higher, and we are at the early part of the new next-gen console port cycle; it will only go higher from now on. Thing is, Watch Dogs' textures are mostly crap too lol -
Yes, there is something you can do about it. It's called speaking out and voting with your wallet. Don't buy games if they're crap, simple as that. The PC platform is too much of an embarrassment of riches to get tripped up over the inevitable lemons. You're missing the big picture here.
-
Funny thing is I finished Watch Dogs before they even released the first patch LOL. The game itself was good, but yeah, the optimization was a sack of excrement.
Also, 6GB of vRAM at 1080p is just plain wrong. -
Also, Watch Dogs was a joke, because there's a guy who re-packed the Ultra textures in a mod that lets you turn them on with the "high" setting... guess what? The stutter went out the window, even though 3GB+ of vRAM was still needed. Watch Dogs was CODED to perform badly on PC. Period. To give console users the false idea that their systems can compete with people using 2 Titan Blacks and such, at a $400 price point. The Division is going to do the same thing. The Assassin's Creed games are likely going to do the same thing. Far Cry 4 is almost definitely going to do the same thing. They want console parity, but since they have no excuse as to why they cannot produce better graphics on the FAR stronger PC platform, they just make it run awfully and claim we need stronger hardware that doesn't exist yet.
As for the vRAM usage debacle, it's a joke. They are just being sloppy with their optimization, and they aren't even using very good textures either. High resolution poop is still high resolution poop. It's why BF4 is far crisper than Titanfall despite using 1GB+ less vRAM. It's why Sleeping Dogs' textures for most of the game, using the high-res texture pack, match or surpass a LOT of Watch Dogs' textures (without using E3 2012's settings), while doing so with 1.5GB+ less vRAM. It's called good quality, rather than high resolution bad quality. There's no excuse. I don't care what they say: if your game NEEDS more vRAM than Crysis 3, and it is not for a massive cache for super fast motion through a level and/or extremely powerful sniper rifles with great draw distances (like Arma 2/3 uses, etc.), then your game had better look better. But they don't. And the HDD space needed simply to install them is going through the roof, and they're taking more and more from our systems to look as good as games that have been out (or could have been out) for YEARS, while running worse.
We should never excuse bad coding. We should never give them a free pass and let them say "oh well, it's just advancement." I mean, if I'm telling someone to buy a new card I'll tell them to splurge on vRAM, but that doesn't mean I think we SHOULD have to. I have 4GB of vRAM and all, but that doesn't mean I think every new game under the sun is supposed to use up all of it. And they are. Pointlessly. -
Watch Dogs' textures look like crap compared to The Witcher 2, which only uses 1.5GB of VRAM at ultra. I usually buy badly optimized games during sales only, at $5-10 max. -
Robbo99999 Notebook Prophet
I agree with you though when you say "any i7-4xxx CPU should be able to handle a 60Hz screen just fine" - in fact I'd go as far as to say that any i7 CPU from Sandy Bridge onward is perfectly fine for 60fps gaming. If you want to go 120Hz gaming, then get the best CPU you can afford, which will help in a few games to get closer to 120fps, but of course not at the expense of a lower-tier GPU: buy the best GPU you can afford, and only then the best CPU you can after that (120Hz gaming).