They COULD use 6GB of vRAM, but that leaves nothing for the game itself. I was simply saying that the entire 5.5GB (X1) or 6GB (PS4) memory pool has to cover both what we call system RAM usage (i.e. ~2GB on average for most games) and vRAM usage. So if a game on PC uses 2GB of RAM, it should use something similar on a console, and thus never need more than ~4GB of vRAM on PC (as anything beyond that would surpass the consoles' limit). If a game uses 6GB+ of vRAM on PC while using 1.5-2GB of RAM, that's way beyond console limits and FURTHER points to a serious lack of optimization.
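To put numbers on that, here's a quick back-of-the-envelope sketch (purely illustrative: the pool sizes are the approximate figures quoted above and the example footprint is made up, not anything official):

```python
# Rough sketch: does a PC game's combined footprint fit a console's unified pool?
# Pool sizes are the approximate figures quoted above; the example footprint is invented.

CONSOLE_BUDGET_GB = {
    "Xbox One": 5.5,  # ~8GB total minus OS reservation (approximate)
    "PS4": 6.0,       # ~8GB total minus OS reservation (approximate)
}

def fits_console(system_ram_gb, vram_gb, console):
    """On a console both uses come out of one shared pool, so they must fit together."""
    total = system_ram_gb + vram_gb
    budget = CONSOLE_BUDGET_GB[console]
    return total, budget, total <= budget

# Example: a port that uses 2GB of system RAM and 6GB of vRAM on PC
for console in CONSOLE_BUDGET_GB:
    total, budget, ok = fits_console(2.0, 6.0, console)
    print(f"{console}: {total:.1f}GB needed vs {budget:.1f}GB budget -> "
          f"{'fits' if ok else 'over budget'}")
```

Anything that comes out "over budget" is asking for more memory than the console version could possibly be using, which is the whole point.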
And yes, sales work for essentially dead games in that way. I don't blame you for it. But the main point is that we need them to know they're doing things wrong. If they want to get pissy about it because their sales aren't high when their game makes BF4 look as demanding as CoD 4 and their game LOOKS like CoD 4 itself, then that's their own fault. But it's what is happening now, and we need to get rid of that problem lickety split.
-
Nvidia Geforce GTX 980M Mobility Maxwell Specifications and Benchmarks Revealed
Not sure if it's confirmed, but it would be really cool. What do you guys think? -
If you look at the preliminary bench results for the GTX 980M, the notebookcheck specifications seem to be more accurate.
The bench results suggest the 980M is essentially an underclocked desktop GTX 970. -
4710HQ: 3.5GHz with HD 4600 iGPU
4810HQ: 3.6GHz with Iris Pro 5200 iGPU
4980HQ: 4.0GHz with Iris Pro 5200 iGPU
Distinction is really with the dedicated GPU
And HQ = Soldered, MQ = Socketed
I would guess the 4980HQ will definitely throttle except with the most efficient coolers, and soldered CPUs are usually geared more toward thin-and-lights. The 4810MQ will likely throttle as well. But it may be worth it for the Iris Pro, at least for gaming on battery.
-
LOL. It has been like this since around 2004, when consoles started to gain more traction. I noticed how little they care about us when I played GTA IV. One of my favorite games of all time, and nothing on the planet can run it without problems. Then Rockstar never released Red Dead Redemption for PC. Too much effort, because "there is no PC market." There were 7 million people watching a single (!) DOTA 2 tournament and 67 million playing LoL. Definitely no PC market... There are at least a million people on Steam every day. Valve is showing all these morons. And then GTA V was praised for its business practices because it made a billion dollars in a week. I think it cost them over $200 million to make the game. No one mentions how CoD made a billion when the latest game came out, and CoD doesn't take 5 years to make; there is one every year. These guys are morons. They say there are not enough PC copies sold, but they only count physical copies. Valve took a survey and most users do not have more than 4GB of RAM. You don't need it. I am surprised you guys didn't mention that the new consoles use an 8-core processor; I have a feeling games are gonna be optimized around that too.
-
I tested a few older titles with the Aorus X3+ to see if the Iris Pro was any more efficient than the 870M it packs. On average, I was consuming 10 percent less battery with the Iris Pro than with the 870M. Not worth it if you ask me, considering the CPU costs so much more and you have to manually force games onto the integrated graphics when you want the savings.
-
Ok, I haven't been following the news too thoroughly, so I'm a little out of the loop - no new laptops announced yet?
Please don't say they're holding out for Broadwell next year... -
Good to know, thanks.
-
Coming back to 16nm: 16nm FinFET has already been demonstrated by various R&D labs to offer better gate density, lower leakage and better reproducibility. AMD's de facto foundry GlobalFoundries focused heavily on 14nm rather than 16nm because of competition from Intel (14nm Broadwell). They have also started research on 12nm, FYI.
http://globalfoundries.com/docs/def...dries-14nm-collaboration---final.pdf?sfvrsn=2
TSMC, on the other hand, announced 16nm FinFET adoption, which pleased major tech companies. Now that Nvidia's roadmap shows Pascal in 2016 (remember, roadmaps can change when issues come up), it's assumed they will probably go for 16nm. Many companies have shown their dislike of 20nm and are moving away from it.
http://www.extremetech.com/computin...ds-if-you-ask-the-engineers-or-the-economists
http://www.extremetech.com/computin...m-finfet-tapeout-of-big-little-cortex-a57-soc
As I stated above, there is about a 1-year gap between mass production capability at a TSMC fab and the tapeout and production of actual orders. Which means that if TSMC announces 16nm mass production capability in Q4 2015, then assuming no unforeseen hurdles (unlikely - the smaller the process node, the more problems, aka nanoscale behavior limits), 2016 is the likely year. GM204 and GM200 are plenty capable. Also, Nvidia has invested a lot in changing the architecture (Kepler to Maxwell), so IMO it makes no sense to change node within a year (2015) when 16nm is going to be ready in 2016. Nvidia would have already started research on Pascal and initiated trials of its architecture and node shrink. They also need to finalize the chip design before finally submitting orders for mass production (basically too busy to be thinking about 2 node shrinks, i.e. 20nm and 16nm, within a 1-2 yr timeframe). I think they will just go ahead and milk Maxwell 3.0 on 28nm next year until 16nm Pascal matures. No bets, just a gut feeling. -
I expect ~20nm Maxwell in August-September 2015 and Pascal in Oct-Nov 2016 -
What I intended to say in my previous post is that there is no logical sense in doing 20nm and then 16nm all within a 1-2 year timeframe. Not a good idea from a semiconductor company's perspective (technology- and business-wise). -
Sweclockers made a boo boo and changed the specs for GTX 980M down from 2048 cores to 1664.
Shocker -
Most FPS/RPG games, on the other hand, scale up the size of objects so the same amount fit on the screen; they are just in higher detail. In these cases you are absolutely correct: the impact on vRAM usage is extremely minimal. And since these are the types of games that normally use by far the most vRAM, this is normally not an issue. -
lol have you guys seen the VRAM requirements for "Shadow of Mordor"?
6GB+ for Ultra, 3GB+ for High.
Recommended specs:
-
The Evil Within:
I'd say that one should prepare for more games to follow this in the future, now that the PS4/Xbox One consoles use the same architecture as PCs and developers will do these ports over to PC with high requirements for VRAM and specs. -
-
Hi,
I want to buy a new notebook by the 20th of October, and I wanted to know if anyone knows whether the MSI GS70 gets the Nvidia GTX 970M or the 980M.
Are there any other light notebooks (15" and 17") which will get the 900M graphics cards, apart from the MSI GS60?
I know these ones already: MSI GS70, MSI GS60, Asus G750 series and the Gigabyte P25x v2.
Which one do you recommend most?
So far I would prefer the MSI GS70, since it's the lightest and, for me, the cheapest for the hardware it has.
Any other recommendations? -
We can all blame Watch Dogs for this. Thankfully I have 0 interest in either of those games.
-
I have no idea what to pick out of those notebooks. Never used any of them before.
Aorus will also come with GTX 970M SLI btw
MSI will also offer GT60 with GTX 970M
http://www.acadia-info.com/materiel-pc-grossiste/11260-MSI-GT60-2QDDominator-1077XFR
There is also a GT60 version with the GTX 980M coming, but it's not listed yet -
The people complaining about current-gen VRAM requirements are the same ones who berated me for saying that system requirements would rise significantly with the X1/PS4 release.
Learn to forecast; 4GB will be the minimum from here on out. -
-
Yeah. People are quick to bash the consoles but forget that the previous consoles' specs were so low that they were holding overall requirements down. Thanks to the new consoles we will see a rise in overall quality, and everything will be pushed to the limit. The problem is that the requirements will carry a hefty tag from now on.
We already see games requiring 50GB of HDD space, RAM and vRAM requirements getting a huge bump, and now multi-core CPUs are even minimum requirements. -
It's very different to use an asset and run it at a higher resolution than to use very high resolution assets to begin with and then run them at a high resolution.
Using the same assets and running at a higher resolution has a low impact on vRAM, because it only adds extra per-frame information for the image rather than loading huge files into vRAM. It was for this reason that Microsoft created tiled resources, so that you can keep huge assets in RAM while keeping vRAM usage low, since it is otherwise kind of a waste of memory. -
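A rough way to picture the tiled-resources idea (just a conceptual toy in Python, not the actual Direct3D 11.2 API): keep the full texture in system RAM and only make the mip levels the camera actually needs resident in vRAM.

```python
# Conceptual toy only -- not the real tiled resources API.
# Assumes uncompressed RGBA8 textures: width * height * 4 bytes per mip level.

MB = 1024 * 1024

def mip_chain_bytes(width, height, bytes_per_pixel=4):
    """Byte sizes of each mip level, largest first, down to 1x1."""
    sizes = []
    while True:
        sizes.append(width * height * bytes_per_pixel)
        if width == 1 and height == 1:
            break
        width, height = max(width // 2, 1), max(height // 2, 1)
    return sizes

def resident_bytes(mips, skip_top_levels):
    """Only the mips below the cutoff are committed to vRAM; the rest stay in system RAM."""
    return sum(mips[skip_top_levels:])

mips = mip_chain_bytes(4096, 4096)                  # one 4096x4096 texture
full = sum(mips)                                    # whole chain resident in vRAM
streamed = resident_bytes(mips, skip_top_levels=1)  # top mip left in system RAM

print(f"whole mip chain resident: {full / MB:6.1f} MB")
print(f"top mip kept in RAM only: {streamed / MB:6.1f} MB")
```

Dropping just the top mip of a 4096x4096 texture cuts its resident size by roughly three quarters, which is why streaming only what the camera can actually resolve keeps vRAM usage down.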
The problem is that game developers are developing for the unified memory of the consoles... If these games were optimized for the PC, the requirements would go down. They don't want to spend any resources doing that, though, regardless of the pain it dumps on their paying customers.
LAZY developers are why the requirements are going up, and we have the option to choose not to buy their games. -
The Aorus is too expensive for me.
I would prefer a notebook for a maximum of 1600€, which is about 2030$.
I found the Aorus X7 with the Nvidia GTX 765M 2GB in SLI for 1800€, which is way too expensive for me.
Any idea what the prices are going to be for the Nvidia GTX 970M/GTX 980M notebooks?
I need the product shipped to Austria, Europe. -
-
6GB of vRAM on Ultra at FHD means nothing except that the game is HEAVILY UNOPTIMIZED. The game should have a changeable parameter for LOD distance, draw distance, etc. That should be fairly easy to expose in the engine settings.
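As a toy model (every number here is invented, nothing from the actual game), a draw-distance slider pays off quickly because the number of objects in view grows roughly with the square of the distance:

```python
# Toy model with made-up numbers: how a draw-distance setting could trim resident assets.

AVG_ASSET_MB = 12.0               # assumed average texture/mesh footprint per unique object
OBJECTS_AT_FULL_DISTANCE = 400    # assumed object count at 100% draw distance

def resident_asset_gb(draw_distance_scale):
    """Objects in view scale roughly with the area covered, i.e. with distance squared."""
    objects = OBJECTS_AT_FULL_DISTANCE * draw_distance_scale ** 2
    return objects * AVG_ASSET_MB / 1024

for scale in (1.0, 0.75, 0.5):
    print(f"draw distance x{scale:<4}: ~{resident_asset_gb(scale):.1f} GB of assets resident")
```

Halving the draw distance roughly quarters what has to stay resident, which is exactly the kind of knob a heavy game should be exposing.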
Something tells me that in a month this game will become less performance-hungry. -
@LemonLarryLP: You might fit an Asus 751 with a 970M and no SSD into that kind of money... or something with an 880M at a healthy discount.
-
You should see Wolfenstein though; that game runs like complete sh!t if you don't have 3GB+ of VRAM.
What I don't understand is how the hell they are taking up that much VRAM at 1080p. If it were 3K or 4K it would at least be understandable, but at 1080p it's just ridiculous. And what's most interesting is that the amount of VRAM a game requires is usually inversely related to the actual quality of its textures; the world is a funny place.
Wolfenstein has sh!tty graphics regardless, Titanfall is much less impressive than BF4 with the same amount of VRAM, and Watch Dogs runs BETTER with LESS VRAM after the mod.
I think we can draw some conclusions here -
Since people seem to keep getting their numbers wrong, this list should in general be in order from fastest to slowest, though the 4910MQ has an 8MB L3 cache while all the rest only have a 6MB one, which may push it above the 4980HQ in some tasks.
i7-4980HQ 2800 - 4000 MHz (Iris Pro)
i7-4910MQ 2900 - 3900 MHz (8MB L3 cache)
i7-4810MQ 2800 - 3800 MHz
i7-4860HQ 2400 - 3600 MHz (Iris Pro)
i7-4710HQ 2500 - 3500 MHz -
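If you want to double-check the ordering yourself, here's a quick sketch (the clocks and cache sizes are copied straight from the list above, so treat them as that list's figures rather than gospel):

```python
# Quick sanity check of the ordering above: rank by max turbo, then base clock, then L3 cache.

cpus = [
    # (model, base MHz, max turbo MHz, L3 MB, has Iris Pro)
    ("i7-4980HQ", 2800, 4000, 6, True),
    ("i7-4910MQ", 2900, 3900, 8, False),
    ("i7-4810MQ", 2800, 3800, 6, False),
    ("i7-4860HQ", 2400, 3600, 6, True),
    ("i7-4710HQ", 2500, 3500, 6, False),
]

ranked = sorted(cpus, key=lambda c: (c[2], c[1], c[3]), reverse=True)

for model, base, turbo, l3, iris in ranked:
    suffix = " (Iris Pro)" if iris else ""
    print(f"{model}: {base}-{turbo} MHz, {l3}MB L3{suffix}")
```

It reproduces the same order, with the 4910MQ's bigger L3 being the only wildcard that could move it up in cache-heavy tasks.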
Anyway, yes, I know. The mod, as I mentioned earlier in one of these threads today, repacks the game with the same "ultra" textures, but the stutter is gone, because "ultra" is coded to cause stuttering. It's just bad.
Also, 3GB of vRAM at 4K when it takes 2GB at 1080p is something I could get behind. But it doesn't mean 4K needs 4GB+ vRAM. That's STILL bad optimization. If you're interested, read my vRAM information compilation in my sig. You'll probably have MORE hatred for it after you internalize all that information haha. -
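For a rough sense of what resolution alone should cost (an assumed render-target setup; real engines differ), the resolution-dependent buffers only account for a couple hundred MB of the 1080p-to-4K jump:

```python
# Back-of-the-envelope: vRAM taken by resolution-dependent buffers at 1080p vs 4K.
# Assumed setup: 4 G-buffer targets + depth + 2 post-processing targets, all 4 bytes/pixel.
# Textures and meshes don't grow with render resolution, so they are left out on purpose.

MB = 1024 * 1024
FULL_RES_TARGETS = 4 + 1 + 2   # assumed number of full-resolution render targets
BYTES_PER_PIXEL = 4

def buffers_mb(width, height):
    return width * height * BYTES_PER_PIXEL * FULL_RES_TARGETS / MB

for name, w, h in (("1080p", 1920, 1080), ("4K", 3840, 2160)):
    print(f"{name}: ~{buffers_mb(w, h):.0f} MB of resolution-dependent buffers")
```

Under these assumptions the buffers grow by only a couple hundred MB at 4K; shadow maps and extra intermediate targets add more, but a multi-gigabyte jump can't be blamed on resolution alone.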
Also, you should find out the officially possible OC capabilities. If the HQ chips have zero OC potential, then the 4800MQ and up are superior to them all in terms of possible performance, as they can hit 3.9GHz (or more, with BCLK OCing), and that might very well surpass every single HQ chip out there. -
True, I was responding to all the posts with the incorrect max numbers; it's true there is more to it, like you say.
They actually do benchmark in that order in most benchmarks, though. Overclocking is another story; I have no clue how they handle that.
Oh, and the first of those numbers was the base clock, so I did mention those too. -
Give or take. That extra 400MHz on four cores might be better; you're probably right. But I think you could hit a stable 4.0GHz on the 4900MQ, and I have read about bad experiences with overclocking beyond that. I think it was in the NP9377 thread with the 4940; really hard to keep a stable overclock without throttling. But then again, I think it was around 4.5GHz, which is pretty good, but it was not stable. It would be a lot better to just make 6-core mobile CPUs. Qualcomm is releasing octa-core mobile processors next year. Maybe it will be possible with Skylake.
-
The PS4 CPU is actually weaker than the PS3 CPU.
The CPU won't bottleneck us anytime soon unless you are going SLI or using an external monitor at 144Hz; no single mobile GPU is fast enough yet to be bottlenecked by a CPU. -
XD -
GTX 980M / 970M Maxwell (Un)official news
Discussion in 'Gaming (Software and Graphics Cards)' started by HSN21, Sep 18, 2014.