Indeed, we need the option for 3-4K or whatever on laptops. Even Windows tablets sport higher res. Phones are now using ridiculously high-density screens that are beyond silly at this point.
3K should be a standard now. We don't need to game at that res, but it helps with productivity and the overall quality of the screen.
-
I want a 1440p 120Hz monitor with 5ms or better black-to-black response time for laptops. It's not that hard to do.
That being saiddddd, it would be EXCEEDINGLY stupid of nVidia to release a new flagship that doesn't beat the old generation's single-GPU flagship. Even if it costs less, it's pretty much a cheaper sidegrade and will have immature drivers for a couple of months.
If the 980M was 780Ti performance, awesome. Even more so if it was better. GK104's best card (680/770) was clearly proven to work without issue in the mobile market; I think GM104 can do the same, and GM104 should be the GTX 980. Unless mobile gets GM204 first... in which case it will definitely smoke anything previously existing, including desktop hardware (barring supreme gimping of the card releases). But yes, the 980 not beating the 780Ti would be VERY dumb of nVidia, unless there's some 980Ti card out there that's like 1.5x Titan Blacks. -
At least you have a display that can do 1200p, 1440p, 1600p, etc. You can't do any of those with a mediocre 1080p display
It makes no sense to have a GM204 card that doesn't beat the previous architecture.
GTX 680 was like 25-30% faster than GTX 580. -
Well let's just hope our 980M is a downclocked GTX 980, because we already know Gx104 can be handled on mobile =D. THEN we'll get some powah.
-
-
Come on Alienware, make a 20" 16:10 (17" x 10.6") laptop exactly the same size as the Alienware 18 but only 1.5" thick.
That leaves a 13mm bezel on each side and 29mm on the top and bottom.
Give us a 4K OLED at 120Hz with at least 80% sRGB.
And of course, drop in 2x or even 3x 980M SLI; after all, Maxwell runs cooler and draws less power. That should be able to get decent to great framerates at 4K resolution, depending on the game. -
Even 1080p on 17.3" gives me eye strain (I don't have the best vision, admittedly). 3K and 4K on smaller screens is just begging for eye problems unless MS fixes their scaling issues in Windows. Until then, I don't think 3K or 4K laptop panels are worth it.
-
I think 1080p is fine for another generation, for gaming. 2016+ is when I'd expect to see 4K.
-
I thought the much larger L2 cache on Maxwell reduces the memory bus/bandwidth needed? -
The architectures are just very different, so directly comparing core counts and bus width doesn't give an accurate picture of performance.
That's why I based my estimate on the average 10% better FPS in real game benchmarks over the Kepler 860M. Combine that with it using 45W instead of 75W and you get the 1.85x performance per watt I estimated. Of course, the exact efficiency gain depends on the particular game.
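A quick back-of-envelope check of that figure, as a sketch using only the numbers quoted above:

```python
# Perf/W comparison using the post's numbers: ~10% better FPS
# at 45W (Maxwell 860M) vs the 75W Kepler 860M baseline.
kepler_fps, kepler_watts = 1.00, 75
maxwell_fps, maxwell_watts = 1.10, 45

gain = (maxwell_fps / maxwell_watts) / (kepler_fps / kepler_watts)
print(f"{gain:.2f}x perf per watt")  # ~1.83x, roughly the 1.85x estimated
```
-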
Yes, the arrangement of the shader blocks has been changed on Maxwell to save power, but I'm talking about how it compares to Kepler on an individual CUDA core basis. Again, normalizing clock speed, ignoring power and memory bandwidth (where Maxwell is obviously more efficient in both areas), and focusing only on shader performance: it seems to me as if Maxwell and Kepler with equal core counts perform the same, in contrast to Kepler vs. Fermi, where Kepler needed twice the core count to perform the same because it dropped the shader hotclock, which ran at 2x core clock on Fermi.
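To make that normalization concrete, here's a rough sketch; the GTX 580 clocks are reference figures from memory, so treat them as assumptions:

```python
# Fermi shaders ran at a 2x "hotclock"; Kepler shaders run at core clock.
# Treat rough shader throughput as cores x shader clock.
fermi_cores, fermi_shader_mhz = 512, 1544  # GTX 580: 772 MHz core, 2x hotclock
kepler_shader_mhz = 772                    # a Kepler core at the same base clock

# Kepler cores needed to match Fermi's shader throughput:
equivalent_cores = fermi_cores * fermi_shader_mhz / kepler_shader_mhz
print(equivalent_cores)  # 1024.0 -> twice the core count, as described
```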
-
Hmmm... New Maxwell laptop or this?
MSI Unveils the GS30 Shadow Laptop and GamingDock | techPowerUp -
Ha, that's ridiculous. Defeats the whole purpose of having a laptop.
"Let me buy this thing that's supposed to be portable, and then buy a box that makes it unportable. Or wait, maybe I should just get a desktop?" -
I'm more of a gaming-at-home kind of user,
with portability while outside,
because I use the same laptop for work and gaming. -
-
irfan wikaputra Notebook Consultant
however, this rule is derived from a binary perspective of exponential
about AMD, I am not really sure (never had one lol, not that I hate them xD) -
'Tis a great alternative for me. -
Ever hear of an HDMI cable? SLI will offer better scaling and performance than a docked box.
That technology is in its infancy. Assuming it kicks off, maybe one day it will be better. -
Most of the time I'm outside meeting clients.
I'm also getting older. Lol -
Seems like the best combination for you would be a desktop at home for gaming and work, and a laptop specifically for work.
-
irfan wikaputra Notebook Consultant
Definitely, deadsmiley. SLI of GTX x70M is worth it over an x80M, cost/performance ratio wise. The x70M is just a shy shadow behind the top dog -
You can probably build a better desktop for less than that docking station, and likely in a smaller form factor too. I can see the lure of the 1-2 inch docking stations some laptops have, but that thing is bigger than some desktops.
-
Never say never. There are ways it could theoretically be done (not likely with Maxwell, though). For instance, you could have a memory controller that only alters the bits of RAM that need changing, leaving ones with the proper value alone, sort of like how a GIF image will often only save the altered pixels for consecutive frames. I'm not saying this particular method would be worth using, but it's entirely possible that at some point someone will come up with a method to make an equal size/speed bus more efficient.
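A toy illustration of that GIF-style delta idea (purely hypothetical, just to show how skipping unchanged words saves bus traffic):

```python
# Model a framebuffer as a flat list and only "write back" words that changed.
def delta_write(old, new):
    """Return (index, value) pairs for words that actually differ."""
    return [(i, v) for i, (o, v) in enumerate(zip(old, new)) if o != v]

frame_a = [0, 0, 5, 7, 0, 0]
frame_b = [0, 0, 5, 9, 0, 1]
print(delta_write(frame_a, frame_b))  # [(3, 9), (5, 1)] -> 2 of 6 words hit the bus
```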
-
But I still ended up getting a laptop only instead. I love the feeling of working and gaming on the same device.
Or I should just wait for the Aorus X3 Plus with a GTX 970M, for the best of both worlds. Assuming the GTX 970M performs close to the GTX 770. Lol -
irfan wikaputra Notebook Consultant
But just prepare some ice to cool it down the road lol xD
To be honest, I'm almost in the same boat as you.
I'm mobile on the go, need to meet clients, etc.
However, my company gave me a mediocre HP EliteBook,
and I have a P377SM-A sitting on my table, which is still portable enough as long as you're not climbing a mountain with it on your back.
If the company hadn't bought the EliteBook, I would've gotten myself your system, the Clevo W230ST, with maybe a higher-wattage adapter -
That Clevo of yours is already so powerful that I don't think you will like having a W230ST for gaming, even if it is for on the go.
Let us wait and see how the new Maxwell performs on mobile. -
-
EDIT: Forget it. It was an estimate.
-
As you say, memory bandwidth is more about being past a certain threshold, which depends on the particular game. However, that threshold will probably get a lot higher somewhere between 2-5 years from now. Over the next two years, 3K or 4K resolution will likely become the norm. This will not directly change the required bandwidth by much at all. However, having far better resolution will make low-resolution textures and shadowmaps far more obvious, and will let you actually see the added detail on higher-resolution ones. So as 3-4K screens become more common, games will focus on improving texture resolutions. Figure at least 2-3 years of programming for a AAA game, and you are looking at 2-5 years from now for the memory bus to become a major bottleneck.
That said, it's hard to say exactly when or how much larger of a bus you will need, just that you likely will need more in the future.
Of course, that only matters if you plan to use the same laptop for 4+ years. -
Memory bandwidth does need to scale up with core config though for obvious performance reasons.
-
Basically, I'm saying that something like 120GB/s+ is the point where memory bandwidth starts being less important for games, unless you REALLY need to fill/empty that framebuffer quickly (which rarely happens). Since the 750Ti and 860M are both well below that bandwidth (86.4GB/s and 80GB/s respectively), they can suffer more from it being low. The 650Ti Boost is above the limit at 144GB/s, but the 770M is not (96GB/s), which is where the core count vs. clock speed relation comes in.
Using core count x clock speed x shader hotclock multiplier (nonexistent for Kepler), the Maxwell cards should lose. Handily. The 770M ought to be ~19% faster, but it loses in quite a few gaming scenarios and trades blows a lot in others, despite having overall stronger hardware. So core for core, Maxwell should trump Kepler. Above the 120-144GB/s bandwidth threshold, where the higher-end cards sit, we should see better core-to-core, clock-to-clock comparisons.
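For reference, the raw math behind that ~19% figure, as a sketch assuming reference clocks of roughly 811 MHz for the 770M and 1029 MHz for the Maxwell 860M (so the exact ratio may vary a bit):

```python
# Raw "core count x clock" comparison; neither card has a shader hotclock.
gtx_770m = 960 * 811    # Kepler: 960 cores at ~811 MHz
gtx_860m = 640 * 1029   # Maxwell: 640 cores at ~1029 MHz

print(f"770M on paper: ~{gtx_770m / gtx_860m - 1:.0%} faster")
# -> ~18-19%, yet the 860M trades blows with it in real games,
# which is the core-for-core Maxwell advantage being described.
```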
I probably did not clear up one bloody thing there. But I still need lots of sleep. XD -
So you guys think we will see specs of the actual cards before we hit 150 pages??
-
150 pages?! It's only page 48 for... oh nvm, I set it to view 30 posts per page
-
Memory bandwidth is only important up to a certain point, as you say. Beyond that, you need so many other resources... the R9 290X showed that having 300GB/s+ memory bandwidth wasn't enough for 4K, and the same went for the 780 Ti.
So for now, 1080p with these high-end laptop GPUs is enough. And although mobile variants with starved memory bandwidth are available, depending on the game, it hardly makes a difference past 120+GB/s. -
Yep, GCN didn't perform to its full potential during the first year due to drivers. Then Catalyst 12.11 and the awesome Never Settle game bundles came out and it was game over for Nvidia.
-
Anyway, I think Maxwell on its own will be better suited to higher resolutions. The R9 290 and 290X cards are a newer architecture, I think, and AMD designed them to handle higher resolutions better than the Kepler chips. On paper, the 290X doesn't come close enough to a 780Ti (in both memory bandwidth and core performance; both cards assumed stock) to outperform it at higher resolutions. -
Different architectures, no.
Look up GTX 660 and GTX 580.
One is 192-bit, the other is 384-bit. They perform the same. -
Yes that was implied.