There was a very extensive post on how to determine/calculate the best non-native gaming resolution, but I'm having difficulty finding it. It had something to do with maintaining two non-native pixels per native pixel (pixel doubling). Does anyone know which post I mean?
Hey man, I'm in exactly the same boat (and country) as you are!
-
Yeah, it's running at native resolution with 200% scaling set in Windows. That's pixel doubling. It's the same thing the Retina MacBook Pro does by default (2880x1800 native; pixel doubling results in the same viewable area as 1440x900). There is no such thing as a "best non-native resolution." They all suck. The further you go below native, the worse it gets.
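For anyone who wants to play with the idea, here's a quick Python sketch (my own illustration, not from the post being searched for) that lists the clean integer-scaled resolutions for a given native panel. The function name and the 2880x1800 example are just mine:

```python
# Minimal sketch: integer-scaled resolutions for a native panel. With an
# integer factor f, each displayed pixel maps to an exact f x f block of
# native pixels (f=2 is the "pixel doubling" discussed above), avoiding
# the blur of fractional scaling.

def integer_scaled_resolutions(native_w, native_h, max_factor=4):
    """Return (factor, width, height) for each clean integer divisor."""
    return [(f, native_w // f, native_h // f)
            for f in range(1, max_factor + 1)
            if native_w % f == 0 and native_h % f == 0]

if __name__ == "__main__":
    # Retina MacBook Pro example from the post above: 2880x1800 native.
    for f, w, h in integer_scaled_resolutions(2880, 1800):
        print(f"{f}x{f} native pixels per pixel -> {w}x{h}")
    # f=2 gives 1440x900, the pixel-doubled mode mentioned above.
```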
-
Yeah, I forgot to mention it was using NO LESS THAN 85% OF MY CPU at the point where it was bottlenecking. That's the most I've ever seen a game use, and I was offline on a fresh start (in school), so I had nothing running but Windows Media Player, Origin, Steam and Crysis 3 (no network connection). Oh, and PlayClaw 5 for temp/utilization checking. But yeah, that was it. Game was a monster.
Haven't you seen what I demand in a desktop to get me off of laptops? I want an 8-core from Intel and like four Titan X cards -
I used to get some CPU bottlenecking in BF3, resulting in perceptible microstutter at times, but it's better now. Not sure if it's the switch to Windows 7 or Nvidia's post-R337 drivers that have been responsible.
I have higher total CPU utilization in BC2, probably due to the more advanced physics/destruction, but it doesn't bottleneck my GPU and result in FPS drops, so it plays a lot smoother overall. -
A big part of the reason why I'm starting to appreciate "efficiency" and performance/watt is because of electrical limitations.
My room shares a 15A circuit with the living room, so the most I could safely draw from the wall is probably around 1200-1300W tops without having to worry whether turning on an extra light or two is gonna trip the circuit breaker. So I have to be really judicious about what I put in my build. Yes, there are two 20A circuits, one for the kitchen and one for the laundry room, but I have no interest in setting up camp in either location. -
Is there any major difference between the GTX 970M 3GB VRAM and the GTX 970M 6GB VRAM except the VRAM?
Any performance improvement? -
No, same GPU with the same frequency, just half the memory.
-
Do you think it's worth paying 150 more for the 6GB VRAM edition?
-
Yes, I think so.
-
Hmm... there has been a long debate on this forum about the utility of more VRAM... but there are already some games that use 4GB of VRAM. I think the 6GB version is more future-proof. But first, choose according to your budget; don't put your bank account in danger for a gaming machine.
-
Oh believe me; efficiency is a huge thing for me, especially with heat output and stuff. But what I meant was that the raw power I want/need in my current situation is no less than the i7-5960X. Which I know can draw serious amounts of power and all, but oh well XD.
That being said, that's my entire point of reference for why AMD simply making bigger/hotter/more-power-hungry cards to compete with nVidia is a huge problem, because at some point it just becomes way too inefficient. Unlike nVidia vs. AMD, however, Haswell has no 8-core CPU counterpart to compete with... maybe Broadwell will be cooler/more efficient, or even Skylake, which is when I might be considering a new machine, but if I had to get a machine RIGHT NOW, you know what I'd be looking at. AMD needs to bring some competition to Intel too. Haswell is just not a good architecture as far as I can see: it draws more power and runs a lot hotter for little benefit, excepting a couple of CPU instruction sets that nobody cares about and ~7% better computational ability (nothing to write home about). -
And you can make up 150€ extra pretty quickly if you can afford a 2000€ laptop.
-
150 Euro = $190.50 USD, though. It's not a huge amount, but definitely enough to make me pause and think. To put that in perspective, that's 7.5% of the total cost of the laptop for an extra 3GB of VRAM.
I'd say it really depends on what games you intend to play and how long you want to keep your laptop. If you're the type that buys new every year, then there's absolutely no point wasting that money.
It'll be years before AMD can hope to achieve parity with Intel, so I wouldn't hold my breath. That said, if Intel keeps dragging their feet like this, AMD may make a surprise comeback at some point down the line. Hopefully. Maybe. Wishful thinking, more likely, but eh.
Broadwell may still have heat issues because of the FIVR, which Skylake won't have. Computation-wise, I did a rough comparison using SuperPi (classic single-thread) and wPrime, and it appears Haswell has a +100MHz advantage in single-thread and a +300MHz advantage in multithread compared to Ivy Bridge, at least in those two benchmarks anyway.
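For anyone wondering how a "+MHz equivalent" figure like that can be worked out, here's a rough Python sketch. It assumes performance scales roughly linearly with clock within an architecture, and the benchmark times in it are made-up placeholders chosen to land near the quoted +100/+300MHz figures, not the actual SuperPi/wPrime results:

```python
# Rough sketch: express an IPC gain as an "equivalent clock" advantage,
# assuming performance scales ~linearly with frequency within an arch.
# Scores are 1/time (higher is better). All times below are fabricated
# placeholders, not real SuperPi/wPrime results.

def equivalent_clock_mhz(new_score, old_score, test_clock_mhz):
    """Extra MHz the older arch would need to match the newer one."""
    return test_clock_mhz * (new_score / old_score - 1.0)

TEST_CLOCK = 3500  # hypothetical common test clock in MHz

haswell_1t, ivy_1t = 1 / 9.8, 1 / 10.1   # SuperPi-style single-thread
haswell_nt, ivy_nt = 1 / 5.2, 1 / 5.65   # wPrime-style multithread

print(f"single-thread: +{equivalent_clock_mhz(haswell_1t, ivy_1t, TEST_CLOCK):.0f} MHz")
print(f"multithread:   +{equivalent_clock_mhz(haswell_nt, ivy_nt, TEST_CLOCK):.0f} MHz")
```
-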
Didn't get any response when I wrote this ^^^^. Any thoughts, anyone, about the upcoming DX12 GPU utilization?
-
We don't know enough about DX12 yet. Maybe ask this a year from now.
Anyway, it won't help laptops as much as desktops since laptops aren't as CPU-limited. Battery life yes, but performance not so much. -
Ok, we'll see. But maybe in the future I'll upgrade to something like an "1180M". Then it will be pretty relevant to have DX12 GPU utilization in heavy games, since I will not be able to upgrade my CPU on a GT72 when it becomes a bottleneck.
-
That's assuming you CAN use an 1180M... if by that time you're on Pascal, it will probably need a new connector.
-
I am reloading this page:
http://www.amazon.com/gp/product/B0...rd_p=1688200382&pf_rd_i=507846#productDetails
300 times a day! COME ON AMAZON! FINISH THIS MADNESS! -
Haha owned.
-
Crap... it says ETA early November for the 4K version. I guess I need to learn to read... but 1080p versions are readily available.
@Kaozm - that's me! I need that pin. -
Yeah, Amazon is awesome, especially their very generous return policy. Although I generally go with marketplace sellers due to California's atrocious 9.25% sales tax, which really bites on big-ticket items. Still pales in comparison to the 13% HST in Canada (well, Ontario anyway), but enough to make me wince.
-
Meaker@Sager Company Representative
Or 20% or higher in the EU
-
But the higher taxes are used to fund the excellent and well-developed social infrastructure, and the EU makes the US look like a 3rd world country by comparison.
-
And the EU has better Internet too, amirite? That's all I care about.
-
No, no, the 4K version is not available on Xotic; ETA is early November. BUT I already ordered the 3K version from Xotic. They couldn't get the mSATA drives working and said it would take 2 more weeks to get new mSATA drives. Whatever. Then I canceled that order because the 4K will release soon (this is the one with 6GB VRAM), so back to refreshing.
-
Move to South Korea.
But seriously, no country in the world can compete with them Koreans when it comes to internet (or StarCraft, for that matter... hey, I wonder if there's a correlation). -
I'm waiting to see the thermals on 980M SLI 9377 Clevos before buying one. Any information available yet?
-
Meaker@Sager Company Representative
The thermals should be a bit better than the 780m/880m levels.
-
ThePerfectStorm Notebook Deity
I got a question. I have an 860M laptop, and it serves me decently. I want to buy a high-end 17" single-GPU laptop (I'll only have the money in 2 1/2 months, though), and want to know if I should buy a 980M or wait for 20nm. Questions:
1. What will the performance gains from 20nm be?
2. What will the 20nm top single-GPU's power consumption be?
3. Will 20nm laptops run cooler than 28nm Maxwell?
Thanks. -
1. 20nm likely will not exist; next is probably 16nm Pascal with stacked VRAM. Performance gains are impossible to guess but should be significant. Probably 1.5-3 years out, though.
2. It'll be 16nm Pascal, and no one knows; probably not even Nvidia.
3. No. There will still be cool-running thick laptops and blistering-hot paper-thin laptops. Most laptop makers will cut down thickness by reducing fan sizes until heat is barely managed. They can sell laptops on looks far more easily than on max temperature under load. -
ThePerfectStorm Notebook Deity
I must say I'm shocked. No 20nm Maxwell??
-
Robbo99999 Notebook Prophet
DX12 definitely helps make the CPU more efficient, so you're right up to a point, but your GPU is likely to become obsolete before the CPU does, so it's a moot point unless you buy a laptop where you can upgrade the GPU (which is never possible without some kind of workarounds or compromises, e.g. modified INF drivers, modified BIOS, heatsink tinkering). -
Rumor is that there weren't significant enough gains to justify moving to a new manufacturing process (i.e., little profit), and they would simply move to a new architecture with Pascal later.
-
ThePerfectStorm Notebook Deity
So next year, what will their high-end GPU be? A rebrand?
-
There is! I guess internet speed is good for MMO games, which they are really into (Asians... Koreans... etc.). Heck, they won't even sleep for three days straight just to play these MMOs.
-
We are in the same boat! What were your considerations for going with this model?
I for one was going to get the Clevo P170SM-A with the GTX 980M, because I wanted a 17".
Not going for the GT72 because it has no Optimus. It is said Optimus is bad, but completely rebooting every time you want to switch from the iGPU to the dGPU and vice versa is even worse to me.
Then i saw the 15" gigabyte, and I wanted to give it a chance if the temps are right. It has a 3k matte display, gtx 980m, enough storage options and is slim, sexy and looks 10x better than the clevo. Only downside it that it comes pre configured with (slow) ram, all the msata slots filled with only 2x128 GB sticks and a 1TB HDD. I would like it pre installed with the hdd only, so i can fit the newly released crucial MX100 512GB in it, which can be bought for much cheaper than any 500GB drive and is much faster. That way i would still have the two msata ports and the hot swappable bay free for another 2TB of SSD space in the future, might i need it. Ah well, maybe there will be some resellers that offer more options
I currently own a 17" HP unit, and even though a new 17" would be nice, the idea of a well designed gaming monster that can ACTUALLY FIT IN MY BACKPACK is very alluring to me.
Then again, I will have to see whether the beast will throttle thermally (or due to the small 180W PSU, for that matter) or not.
EDIT: It's probably the screens more than anything that drew me away from 17". 15" gets up to 4K displays, and almost all of them are IPS instead of TN (blergh)!
-
A 230W power supply would pull 1-1.2A at 230V from the wall at most, depending on load and PSU efficiency. (Unless you're American; you get 120V from the wall, right?)
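As a quick sanity check on that estimate, here's a small Python sketch of I = P / (V x efficiency); the 85-90% efficiency range is my assumption, not a measurement:

```python
# Wall current drawn by a PSU: I = P_load / (V_mains * efficiency).
# The 85-90% efficiency range below is an assumed typical value,
# not a measured one.

def wall_current_amps(load_watts, mains_volts, efficiency):
    """Current drawn at the wall while delivering load_watts of output."""
    return load_watts / (mains_volts * efficiency)

for volts in (230, 120):
    for eff in (0.85, 0.90):
        amps = wall_current_amps(230, volts, eff)
        print(f"230W load @ {volts}V, {eff:.0%} efficient -> {amps:.2f}A")
# At 230V this lands around 1.1-1.2A, as estimated above; the same
# load at 120V draws roughly double the current (~2.1-2.3A).
```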
-
21% in Holland.
(+25% if you're importing from outside the EU (and get caught in the process))
-
Yeah, 120V from the wall, so a 15A circuit = 1800W max. Although you probably don't want to go over 95% of that, so real usable power is about 1700W, if not less. That comment was actually referring to desktops with hex/octa-core CPUs and multiple high-end GPUs that can easily draw over 1000W from the wall when overclocked.
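To spell out that circuit math, a tiny Python sketch; the 95% derating is just the rule of thumb from this post, and the shared living-room load is a hypothetical figure:

```python
# Household circuit power budget: P_max = V * I, derated for headroom.
# The 95% derating mirrors the rule of thumb above; the shared
# living-room load is a hypothetical number.

def usable_watts(mains_volts, breaker_amps, derate=0.95):
    """Continuous draw you can reasonably sustain without nuisance trips."""
    return mains_volts * breaker_amps * derate

budget = usable_watts(120, 15)   # ~1710W on a 15A/120V circuit
living_room = 400                # hypothetical draw from shared loads
print(f"usable: {budget:.0f}W, left for the PC: {budget - living_room:.0f}W")
```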
Also, you guys are going to love 980M SLI and DSR:
-
edit: double post
-
Why did the 980M dip below the 880M around the 2:07 mark?
-
No. A full GM204 with 2048 shader cores. Think 680M to 780M, that's what it'll be like.
Very puzzling, isn't it? I wonder if their written companion article can shed light on this strange anomaly. -
Meaker@Sager Company Representative
A background process or differences between the two scenes.
-
ThePerfectStorm Notebook Deity
Ok. So it will end up being a downclocked GTX 980. -
Source?
-
https://www.youtube.com/watch?v=VGSajLEWkuI
And at 2:17 there is a sudden drop in performance. Throttling?