I'm a day trader. Keeping money tied up in a stock for a long time for small gains isn't worth it. I simply should have waited instead of putting in a magical number.
-
PrimeTimeAction Notebook Evangelist
-
Kade Storm The Devil's Advocate
Even a single GPU from your system would give it a solid run. In a nutshell: a tragic state of affairs. Last edited: May 15, 2016 -
No, not a run for its money. I'm not that silly. A single 780M can't compete with Polaris 10.
I also think I figured out what Polaris 10's problem is. They're the low-end line. AMD is launching their low-end line first. They're only meant to be better alternatives, maybe a bit stronger than the 380/380X/390/390X cards, but at a much lower TDP.
Vega 11 seems to be what AMD needs to pick up their mid-range GPU slack and to toss into mobile. -
Vega 11 is their big die contender, I *highly* doubt they will put it in a laptop any time soon. They need those chips to get back the real money and that's not in gaming.
-
I thought Vega 10 was their top end stuff?
-
By the time Vega hits mobile, Nvidia will have its Pascal chips locked in with every major gaming laptop manufacturer: MSI, ASUS, Dell, Alienware, Clevo, Acer, Gigabyte, AORUS. Did I miss any?
Why would they even bother offering AMD-based machines that late? Maybe Dell (Alienware) and Clevo will add some of the chips as an option in the configurator, but that's about it.
The war is over without a shot fired.
Well, AMD does still have the Apple MacBook contracts. -
Rumor puts Vega 10 as the Grenada successor and Vega 11 as the Fiji successor.
-
moviemarketing Milk Drinker
Some new details on the GTX 1080 from videocardz.com, half of which is over my head. Hopefully someone smarter can 'splain it to me.
-
I wonder what technique is actually used in the "memory compression", and how much things like color quality suffered because of it.
-
If AMD could get that performance at a significantly better perf/watt ratio, they may be able to get design wins on that basis. I'd take 970M performance at reduced wattage.
Sent from a 128th Legion Stormtrooper 6P -
GDDR5X seems to transfer at 8x the memory clock, as opposed to 4x for GDDR5.
Memory bandwidth efficiency has improved again and is apparently better overall than Maxwell's. How much? We don't know; Nai's benchmark will tell us when the cards launch. But raw memory bandwidth numbers aren't going to tell much of the difference anymore. 320GB/s on Pascal should in theory be effectively close to HBM1's 512GB/s. On Maxwell, 336GB/s was more akin to around 384GB/s on Kepler and AMD. Since the listed improvement is on top of Maxwell's benefits, for Pascal it'd be something like ((320 * 1.15) * <pascal improvement modifier>) = new effective bandwidth. I'd guess closer to 450GB/s or thereabouts. But as @tgipier correctly asked: what's the cost? Are delivered colours going to suffer, or is the compression efficient enough? We'll know when people who aren't afraid to report the truth about nVidia stuff review things properly.
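The back-of-the-envelope maths above can be sketched out quickly; note that both the 1.15 Maxwell compression gain and the Pascal modifier are this post's guesses, not published NVIDIA figures:

```python
# Rough effective-bandwidth estimate following the post's reasoning.
# Both modifiers are speculative guesses, not published NVIDIA figures.

raw_bandwidth = 320.0    # GB/s: GTX 1080's 256-bit bus at 10 Gbps GDDR5X
maxwell_gain = 1.15      # assumed Maxwell-era delta colour compression benefit
pascal_modifier = 1.20   # guessed additional Pascal compression improvement

effective = raw_bandwidth * maxwell_gain * pascal_modifier
print(f"Estimated effective bandwidth: {effective:.0f} GB/s")  # ~442 GB/s
```

With a 1.2 modifier the estimate lands around 442 GB/s, consistent with the ~450 GB/s guess above.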
They're claiming the normal SLI bridges are compatible with Pascal, debunking the idea that the double-slot bridge is the only usable one, and allowing mobile GPUs to use SLI with non-rigid bridges. However, there's a high-bandwidth kind of bridge that they recommend for 1440p 144Hz & above, 4K, and 5K. That's likely the dual-slot bridge, but then that means 2-way is your maximum connection speed. They should have done it XDMA-style. They had long enough and were even working on NVLink; they have no excuse in my mind. They just didn't bother, and compromised by nuking 3-way and above multi-GPU where bandwidth is important. Benchmarkers are going to hate them.
And any kind of calculations that are heavily affected by ROP unit count are still going to be better on the 980Ti/Titan X/etc.
That's basically what I gleaned from it. -
Texture compression hasn't really cost much of anything in image quality; it's been used for years (the S3 Savage video card introduced the S3TC standard that every major video card company ended up licensing from them). It all depends on the compression algorithm itself; it's not like JPEG compression or anything. I had the S3 Savage and I could discern no noticeable difference with S3TC other than a significant performance boost. (There was an ancient benchmark that specifically had the option to use it or not, because a lot of video cards at the time didn't support it and it would cause texture corruption; a few games had the option too.) I wouldn't expect it to be noticed unless you're playing at 640x480 and staring at the pixels.
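For a feel of how S3TC works, here's a minimal sketch (illustrative only, not a production codec): DXT1 stores each 4x4 block as two 16-bit endpoint colours plus sixteen 2-bit palette indices, giving a fixed 6:1 ratio over 24-bit RGB.

```python
# Sketch of the S3TC/DXT1 idea: each 4x4 pixel block becomes two 16-bit
# endpoint colours plus sixteen 2-bit indices into a 4-colour palette.
# Illustrative only - not a production encoder/decoder.

uncompressed = 4 * 4 * 3            # 16 pixels x 24-bit RGB = 48 bytes
compressed = 2 * 2 + (16 * 2) // 8  # two 16-bit colours + sixteen 2-bit indices = 8 bytes
print(uncompressed / compressed)    # 6.0 -> fixed 6:1 compression ratio

def dxt1_palette(c0, c1):
    """Build the 4-colour palette from two (r, g, b) endpoints (opaque mode)."""
    lerp = lambda a, b, t: tuple(round(a[i] + (b[i] - a[i]) * t) for i in range(3))
    return [c0, c1, lerp(c0, c1, 1 / 3), lerp(c0, c1, 2 / 3)]

print(dxt1_palette((255, 0, 0), (0, 0, 255)))
```

Because the palette is interpolated per block rather than quantised globally, smooth textures survive well, which matches the "no noticeable difference" experience described above.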
-
Robbo99999 Notebook Prophet
Thanks for the link, just like you I didn't understand all of it, but I definitely got something from it. NDA lifts tomorrow apparently, so there'll be actual video card reviews out tomorrow! I'll be reading them on Guru3d first, then Tom's Hardware & HardOCP second. -
PrimeTimeAction Notebook Evangelist
What's not to understand? All it's saying is:
Maxwell = bad
Pascal = good.
-
Robbo99999 Notebook Prophet
And on the same theme: Fermi almost completely dead, Kepler just about OK, Maxwell good, Pascal excellent! -
HFM!!! Sorry to go off topic, but I could not find any other way to contact you. I badly need your help, man. I accidentally erased the BIOS on my Razer Blade 2014. Can you please take a dump of your BIOS and send it to me? Please, please. Once again, sorry to everyone else for going off topic. I am stuck with a bricked laptop and I need help.
-
Robbo99999 Notebook Prophet
DOH! @hfm hopefully you know this guy! -
How did you manage to erase the bios?
Sent from a 128th Legion Stormtrooper 6P -
Charles P. Jefferies Lead Moderator Super Moderator
I gave you private message capabilities, you should be able to "Start a conversation" with @hfm.
-
http://www.anandtech.com/show/10326/the-nvidia-geforce-gtx-1080-preview
http://hothardware.com/reviews/nvidia-geforce-gtx-1080-pascal-gpu-review
Here we go... Last edited: May 17, 2016 -
Robbo99999 Notebook Prophet
-
I can't bring myself to pay 100 dollars more to get the card a few weeks sooner.
Does look great so far, though... -
Seems like this is the card for solid 1440 gaming...
But the only important bit for me: still no 4K@60fps on a single GPU. Pass. -
moviemarketing Milk Drinker
I suppose it depends on the game and settings options. There are some super-demanding titles and/or specific demanding graphics options out there, not all of which necessarily look better at 1440p on the highest possible settings compared to 4K on slightly lower settings. And this is at stock clocks, so maybe wait for an improved aftermarket model?
Last edited: May 17, 2016 -
Thanks a lot Charles.
-
Super nice card for sure. Curious to see how AMD compares, but I'll wait this one out at least until the inevitable Ti... the 390X I picked up cheap earlier this year is still running strong.
-
Robbo99999 Notebook Prophet
Well I've read the GTX 1080 Guru3d review, these are the main points I got from it:
1) The HDR feature will be quite exciting once monitors start supporting HDR towards the end of this year.
2) Simultaneous Multi-Projection: warps the image so that multiple monitors can be connected together at various angles without any distortion of the image - at only a very small performance cost.
3) Performance of GTX 1080 is anything from 60-90% greater than the GTX 980 depending on the game, and the GTX 1080 is good for 4K gaming at max settings in almost all games.
4) Overclocking is not working properly at the moment, due to the new GPU Boost 3.0 functionality, and they only managed a 10% performance increase in games from overclocking. They believe this will be improved in the future when overclocking tools properly support the new technology. Also, the overclocking procedure is quite a bit more complicated than before.
That's not a bad jump over the GTX 980, but I was always expecting more from Pascal. They skimped on the number of transistors in the GTX 1080: it only has 38% more transistors than the GTX 980 despite being on a way smaller process node than the 28nm GTX 980, and they made the rest of the performance by increasing the clocks by about 40% over Maxwell. I would have liked to have seen a bigger chip with more transistors, which you'd think would be allowed by the way smaller node, maybe running at slightly lower clocks. That might have allowed for better power/performance efficiency and ultimately greater performance than we're currently seeing with the GTX 1080. Still, it's a good jump over the GTX 980! Last edited: May 17, 2016 -
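As a sanity check on the reasoning above, naively multiplying the two gains (assuming performance scales linearly with both transistor count and clock, which is only a crude approximation):

```python
# Crude scaling check: 38% more transistors x ~40% higher clocks.
# Assumes linear scaling with both, which real GPUs only approximate.
transistor_gain = 1.38
clock_gain = 1.40

combined = transistor_gain * clock_gain
print(f"~{combined - 1:.0%} faster than a GTX 980 at best")  # ~93%
```

That ~93% ceiling lines up with the top of the 60-90% range seen in the Guru3d benchmarks.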
Some new GTX 1070 information - it'll have 1,920 CUDA cores, boost clock of 1.6 GHz and 150 Watt TDP:
http://videocardz.com/60127/nvidia-geforce-gtx-1070-has-1920-cuda-cores
http://www.guru3d.com/news-story/nvidia-geforce-gtx-1070-specifications-surface.html
So, estimating from GTX 1080 numbers, this should put GTX 1070 performance somewhere between the GTX 980 (~114% of it) and the GTX 980 Ti (~90% of it). -
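That estimate can be reproduced from the raw specs; here the GTX 1080's 1733 MHz boost clock is taken from its launch spec sheet, and performance is assumed to scale linearly with cores x clock (a rough approximation at best):

```python
# Naive throughput estimate: CUDA cores x boost clock (GHz),
# assuming linear scaling - a rough approximation, not a benchmark.
gtx1070 = 1920 * 1.600
gtx1080 = 2560 * 1.733

ratio = gtx1070 / gtx1080
print(f"GTX 1070 ~ {ratio:.0%} of a GTX 1080")  # ~69%
```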
Looks like the 1070 is the card for me if those specs are true. I have a G-Sync 1080p 144Hz monitor for my desktop, so the card seems perfect for high-FPS 1080p gaming (judging from the 1080 results). Hopefully the 1080M is the same way: a Clevo 75Hz 1080p G-Sync screen with everything maxed out sounds great. Parity between my desktop and on-the-go machine would be great.
-
Some interesting "performance-per-dollar" charts from PC Perspective review of GTX 1080:
http://www.pcper.com/reviews/Graphi...ition-Review-GP104-Brings-Pascal-Gamers/Sound
GTX 1070 should do even better here. Performance estimated from specs should be about 70% of the GTX 1080, but the price is just 63% (non-Founders editions), so the GTX 1070 should get something like a 1.1 multiplier in those charts. -
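The 1.1 multiplier falls straight out of those two fractions (the ~70% performance figure is the spec-based estimate above; prices are the announced non-Founders MSRPs of $379 and $599):

```python
# Perf-per-dollar multiplier for GTX 1070 relative to GTX 1080.
# 0.70 is the spec-based performance estimate; prices are launch MSRPs.
perf_fraction = 0.70
price_fraction = 379 / 599       # ~0.63

multiplier = perf_fraction / price_fraction
print(round(multiplier, 2))      # 1.11
```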
I would like an apology for all the crap y'all gave me - predicted launch date, predicted Founders Edition, predicted performance...
Nah, jk. I am happy to see the 1080 is as capable as I'd hoped it would be. Now I'm too broke to buy it! What a shame.
Georgel likes this. -
Not really that impressive imo. It does scale well with clock, though. I will reserve my judgement until I get my hands on a G1 overclocked card.
Also, the stupid Boost 3.0 calls for a Pascal BIOS editor. -
You are right on that. If not even the Pascal Titan has HBM2...
Should we start a volta thread yet?
-
Not the performance impression I wanted, and a bit too expensive for my tastes, but it's nonetheless the most powerful GPU right now.
-
Meh, not that impressed. My 980 ti will serve me fine for a while at 3440x1440. The 1070 will probably be the basis for the mobile 1080.
Sent from my VS980 4G using Tapatalk -
I'm going to respond publicly just as a public service announcement: I can't do this for you, as the Windows product key is stored in the BIOS. No one should ever do this on modern systems for that reason.
Unfortunately, you'll have to contact Razer; they might be able to help you. -
Just a few notable points from the Ars review -
Pascal's HDR capabilities are fun to read about and amazing - 12-bit colour, BT.2020 wide colour gamut, SMPTE 2084 (Perceptual Quantization), and HDMI 2.0b 10/12b for 4K HDR. There's also hardware 4K60 10/12-bit HEVC decode, as well as 4K60 10-bit encode, allowing the 1080 to stream 4K HDR video, e.g. 4K Netflix streaming.
Pascal still doesn't have async compute, but it has pixel-level pre-emption, comparatively better than Maxwell's thread-level pre-emption; I guess the higher clock rate does give its pseudo-async a significant edge over Maxwell. There's also new tech called Fast Sync, which is almost like VSync + G-Sync - lol, marketing B$(?) since there's no free sync.
3-way SLI needs new code from devs, and the support is there, with some gimping...
Anything involving SLI with more than two cards under Nvidia's driver is locked out.
There is a way around the lockout, although it is a little convoluted. Aside from having to use older, slower bridges, users will also have to download an app from Nvidia's website that generates a signature for their GPUs. That signature is then used to request a (free) enthusiast key from Nvidia, which users can then download to unlock the three- or four-way SLI function.
There's some mafia tech behind GPU Boost 3.0 that's gimping stable overclocking for now, and Ngreedia's 2.xGHz OC on air @64C doesn't look very viable, since in their review it went up past 75C, though that's still better than an OC'd 980Ti @85C. -
Not yet. I am betting they'll delay Volta to 2019. According to LTT, NVIDIA said they invested just as much in Pascal as it would cost to put someone on Mars. Not sure if that's true but I can definitely see them keeping Pascal around for 3 years. Wonder how Polaris will do...
-
killkenny1 Too weird to live, too rare to die.
Inb4 it's 7GB of fast GDDR5 and 1GB of slow. -
For $379, you can't argue with that. 1,920 Pascal cores will still be nice, performance-wise.
-
They can't afford to delay Volta to 2019. Their Oak Ridge lab contract requires them to deliver around late 2017. We may get GV104 by the end of 2017 if we're lucky.
-
killkenny1 Too weird to live, too rare to die.
Yeah, I'm thinking about waiting to see what AMD has to show, and then maybe selling my 970. -
If we know anything about this market, it's that delays are inevitable. It may also depend on what AMD brings to the table.
Contracts can be amended; new deals can be struck. Last edited: May 17, 2016 -
From what I understand so far XD:
GTX 1080 beats everything so far.
GTX 1070 is at about 980 levels or a bit above.
If the 1080M is based on the 1070, it will be at about desktop GTX 980 levels, which means all laptops shipping with the desktop GTX 980 keep their place for a while.
Am I correct?
-
Meh, I hope next year gets interesting. Desktop got a meh upgrade; mobile is the one that stands to win the most. They had better cram in a full 1070, or else...
-
Nvidia can't afford to do this. They would be losing valuable market share to Xeon Phi, which is the main competitor for GP100.
The whole reason they went for a 610mm2 die on a new process was to race in before Intel can launch their new Xeon Phi. If they delay Volta, Intel will fight back in the HPC market. -
The 1070 should be around 980 Ti/Titan X.
The GTX 1070 is SIGNIFICANTLY cut down from a GTX 1080. It is the most crippled x70 part yet. -
People should stop bringing up the Titan X as relevant to the whole deal. The Titan X is barely faster than a 980 Ti.
Nah, the 1070 should be near the 980 Ti, either above or below it but very close. The 1080 is 30-35% faster.
nVidia had better make a full 1070 into mobile and call it the 1080M.
Pascal: What do we know? Discussion, Latest News & Updates: 1000M Series GPU's
Discussion in 'Gaming (Software and Graphics Cards)' started by J.Dre, Oct 11, 2014.