They've updated to say that Titan V doesn't support either SLI or NVlink: http://www.guru3d.com/news-story/nv...-support-nvlink-and-does-not-support-sli.html
-
For what it's worth, this is what nvidia is saying about Pascal vs Volta so far:
http://www.pcgamer.com/nvidia-releases-volta-based-titan-v-for-pc/
There's also the inclusion of HBM2 on the Titan V, which is the first time HBM2 is appearing on a "consumer" graphics card from nvidia. Perhaps they'll end up putting HBM2 on other Volta cards as well. @Ionising_Radiation -
Annnnnd confirmed. It's basically a non-****ty Pascal. I really hope AMD releases something that doesn't get priced up by miners so that this **** from NVIDIA can finally stop.
-
Did you read that incorrectly?
I'm not sure why you're mad that Volta is very different from Pascal. -
Lol just lol.
Volta I hear is coming soon. Real soon. And I'm sure AMD is ready -
So what was all that ampere stuff that came out?? A cleverly timed misdirection?? Or just a bad rumor??
-
What's wrong with the Titan V? In fact, if it had professional drivers, this price point would be amazing! For deep learning researchers, this is a great price to get the same number of tensor cores as the V100. It also has an incredible amount of double-precision compute that, if supported, makes this card one of the best bargains on the market right now. You get the same DL training performance from a $3k card as from 8x 1080 Ti. 8x 1080 Ti is around 4800 USD last I checked, and takes up way more power and chassis space.
-
yrekabakery Notebook Virtuoso
Yup, Titan V has the best FP64 FLOPS/$ of any Nvidia GPU ever released, by a wide margin. -
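A rough sanity check on that FLOPS/$ claim. This is a sketch using approximate launch-era specs and MSRPs that are assumptions on my part, not figures from this thread (Titan V FP64 assumed at ~6.9 TFLOPS):

```python
# Hypothetical back-of-the-envelope FP64 value comparison.
# All specs and prices below are assumed approximate launch figures:
#   Titan V:     ~6.9 FP64 TFLOPS (1/2 of FP32 rate),  $2,999
#   GTX 1080 Ti: ~0.35 FP64 TFLOPS (1/32 of FP32 rate), $699
#   Titan Xp:    ~0.38 FP64 TFLOPS (1/32 of FP32 rate), $1,200
cards = {
    "Titan V":     (6.9,  2999),
    "GTX 1080 Ti": (0.35,  699),
    "Titan Xp":    (0.38, 1200),
}

for name, (fp64_tflops, price_usd) in cards.items():
    # Convert TFLOPS to GFLOPS, then divide by price to get value per dollar.
    gflops_per_dollar = fp64_tflops * 1000 / price_usd
    print(f"{name}: {gflops_per_dollar:.2f} FP64 GFLOPS/$")
```

On those assumed numbers the Titan V comes out roughly 4-5x ahead of the GTX 1080 Ti in FP64 throughput per dollar, which is the "wide margin" being described.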
Leaked NVIDIA TITAN V Benchmarks Show Volta GPU Demolishing All Competitors
"Unigine's Superposition benchmark also yielded some impressive numbers. At stock speeds, the TITAN V scored 5,222 in the 8K preset, and 9,431 in the 1080p Extreme preset. The latter is particularly interesting—famed overclocker Kingpin had previously taken a GeForce GTX 1080 Ti, stripped off the heatsink and bathed the card in liquid nitrogen (LN2), and overclocked it to 2,581MHz, which resulted in a score 8,642 in the 1080p Extreme preset. The TITAN V scored nearly 800 points higher."
"Of course, NVIDIA is not in a rush to bring Volta to the consumer market, as AMD has not fully caught up with Pascal (Vega comes close). The silver lining to that is it gives NVIDIA time to tweak things and flesh out better drivers for when Volta does infiltrate the mainstream gaming sector. Based on what we've seen here, we can hardly wait." -
Exactly.... people don't realize this card isn't meant for pure gamers.... -
So this "ampere" talk is perhaps the compute nerfed consumer version with the usual 1/32 dp/fp64?
-
yrekabakery Notebook Virtuoso
Yup, if the lack of SLI support wasn't enough of a sign. Gaming cards start from Gx102.
Could be, with GDDR6. -
And from the RED camp... AMD quietly made some Radeon RX 560 graphics cards worse
-
Robbo99999 Notebook Prophet
A pretty cool article by Gamers Nexus on the gaming performance of Titan V (Volta of course):
https://www.gamersnexus.net/guides/...marks-async-future-is-bright-for-volta/page-2
Their main focus is on which types of gaming loads let Volta differentiate itself from Pascal: basically DX12 & Vulkan (the lower-level APIs), where performance improves by about 40% when comparing Titan V to Titan Xp (Pascal). That suggests we're gonna see about a 40% performance improvement with Volta over Pascal, but only in new titles leveraging the new technologies, not in DX11, where Volta has only a small lead - I think less than 20%, if I recall correctly.
So far, not a worthy upgrade for my GTX 1070; I'm likely to wait for a larger performance gap. Possibly whatever comes after Volta is when I'll upgrade (probably the xx70 or xx80 version of that architecture). -
It's definitely not worth the upgrade. Just be glad you bought Pascal. Currently I'm stuck with a 970M... it's not even worth the upgrade for me, as I play almost every game either maxed or on very high, one or two on high settings, and at 1080p. What's wrong with high settings?
-
Robbo99999 Notebook Prophet
Yeah, good point, a lot of times Ultra settings don't look that different to Medium or High settings, yet cost a lot of fps. At the moment my GTX 1070 is pretty much running all the games I play at over 100fps on Ultra settings, so there's room for future games to get more demanding without making my card obsolete, and even more so if I turn down a few of the settings. -
Before I sent it off to Asus for RMA, I frequently played games on High (saw no difference between it and Ultra honestly) and limited my GL702ZC RX 580 frames to 60 because the panel is limited to 60Hz and going over that would result in wasted frames... and it has FreeSync, so, there was no point going over.
60 FPS at 1080p is more than enough for me in terms of smooth gameplay, plus I don't push my GPU needlessly and it should consume less power too (which is a sensible thing to do on a laptop).
High settings plus 60FPS limit (or your panel/monitor refresh rate limit) seems like a good combo.
Not sure if there will be a need for me to upgrade this laptop anytime soon - except maybe the CPU to replace it with a Ryzen refresh and/or Ryzen 2 and 3.
-
You guys are right about settings. And I don't feel the need to upgrade just yet... maybe in a few more years, 2... but dang, this GPU is 4 years old. I could be wrong.
-
Guys... I have some bad info probably (still speculation) http://www.guru3d.com/news-story/nv...on-geforce-gxt-2080-clearing-some-rumors.html
-
Robbo99999 Notebook Prophet
One thing's for sure: if new GPUs come out while GPU prices are still as high as they are now, it's not worth it! I think it's down to the manufacturers to try and stop this ridiculous inflation of GPU prices due to mining demand, as it is damaging their longer-term gaming business - like that article says, they could create dedicated mining cards and somehow gimp mining performance on consumer gaming cards; that would help. -
I don't think there will be a price drop when the new generation is out.
-
Robbo99999 Notebook Prophet
I don't think so either, unless they release mining cards or the mining craze dies down, I don't think there's much sign of that happening. -
Has it been announced yet?
I need more GPU POWAH !!!! -
Robbo99999 Notebook Prophet
Nope, nothing has been announced from NVidia. I don't think anyone really knows what form the next generation of GPUs will take, although I hear a lot about it being Pascal on a die shrink with GDDR6 - mainly because the Titan V (Volta) was not very good at gaming for its size & expense (tensor cores not useful, etc). -
ThePerfectStorm Notebook Deity
I hope for something like Ampere / Turing that is a cooler architecture than Pascal, but then again I think mine is an optimistic viewpoint.
Sent from my iPad using Tapatalk -
Robbo99999 Notebook Prophet
I'd imagine the running temperatures are gonna be related to how far they feel the need to push up the core clocks using greater amounts of voltage. If NVidia are not under pressure to squeeze out every last drop of performance, maybe they will be able to keep the clocks & voltage lower and operate at a more efficient place on the voltage/MHz curve. AMD really messed up that aspect of their Vega cards - they had to push the clocks well beyond their point of efficiency to try and compete on performance. Perhaps the next generation from NVidia will be cooler because they're not under any pressure from AMD at all - just a theory I've concocted right now! Such an approach would then let them re-release rebrands of the same architecture at later dates with higher clocks and more performance - kind of like how they went from 600 series Kepler to 700 series Kepler. -
Only good news for us, is that mining has not adversely affected the pricing of the gaming laptop market. Ampere laptops should follow suit with what we're used to paying.
-
NVIDIA Turing GTX 2080/70 GPUs Allegedly Launching in July for Gamers, Ampere to Succeed Volta in HPC Market at GTC
https://wccftech.com/rumor-nvidia-t...eforce-lineup-ampere-to-succeed-volta-in-hpc/
Fresh off the rumor mill we have another wild one. According to Igor Wallossek of Tom’s Hardware Germany, NVIDIA is allegedly launching its brand new Turing graphics architecture in July for the gaming segment. Igor, who’s the same individual that leaked NVIDIA’s Ampere codename last year, claims that there’s been a switcheroo of sorts.
NVIDIA is now said to be planning to replace its existing Pascal powered GeForce gaming lineup with a brand new graphics architecture code named “Turing” around July. “Ampere” on the other hand is now said to be a Volta successor in the compute, HPC & AI / machine learning markets.
NVIDIA Turing Architecture is Designed For Gaming, Allegedly Launching in July – Ampere to Succeed Volta in Compute Markets by End of March
This switch is somewhat unintuitive, as Alan Turing is famous for his work in the fields of computer science and cryptanalysis, which makes his name more suitable for a graphics architecture targeted at professional applications rather than gaming. Although, this latest update comes straight from the same source that leaked Ampere to begin with, so we're inclined to believe it has some credibility.
English computer scientist, mathematician, logician and cryptanalyst Alan Turing
Ampere, which has reportedly been in production since late January, is expected to be revealed at GTC later this month, between the 26th and 29th. Turing, on the other hand, will only enter production in June, with an announcement expected that month at Computex in Taipei, Taiwan. A hard launch is said to follow a month later in July.
This suggests that NVIDIA’s alleged GeForce GTX 20 series lineup, including the GTX 2080 and 2070 will not actually be released later this month, but rather debut in June and hit shelves in July. Whilst a successor to the Volta V100 accelerator based on the Ampere architecture is what we’ll probably see an announcement for later this month at GTC.
Regardless of codenames, the facts around the hardware specifications of the new GeForce series are still the same. We're still looking at a GP104/GTX 1080-class chip replacement in the summer, built on TSMC's 12nm process technology and featuring 16 Gbps GDDR6 memory from Samsung.
As always, remember to take this rumor, as with any other, with the necessary dose of NaCl. We should get a much better idea of what NVIDIA's GPU roadmap looks like at GTC later this month. It seems the rumors are pouring in and won't stop anytime soon. I'd urge extra caution in the meantime, especially when dealing with conflicting information like what we've been seeing lately. -
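For scale, the 16 Gbps GDDR6 figure in the article above would imply a big bandwidth jump over the GTX 1080's 10 Gbps GDDR5X, assuming the rumored GP104-class part keeps a 256-bit bus (the bus width is my assumption, not something stated in the article):

```python
def bandwidth_gb_s(pin_rate_gbps, bus_width_bits):
    """Peak memory bandwidth = per-pin data rate x bus width / 8 bits per byte."""
    return pin_rate_gbps * bus_width_bits / 8

# Assumed 256-bit bus for both parts (GP104-class configuration).
print(bandwidth_gb_s(16, 256))  # 512.0 GB/s for 16 Gbps GDDR6
print(bandwidth_gb_s(10, 256))  # 320.0 GB/s for the GTX 1080's 10 Gbps GDDR5X
```

Under those assumptions, that's a 60% increase in peak memory bandwidth from the memory change alone.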
Rumor : NVIDIA GTX 2080, 2070 Ampere Cards Launching March 26-29 At GTC 2018
https://wccftech.com/rumor-nvidia-gtx-2080-2070-ampere-cards-launch-march-gtc-2018/
NVIDIA is reportedly readying a brand new lineup of GeForce GTX graphics cards based on its upcoming 12nm Ampere graphics architecture for an official debut at this year’s GTC in late March, a mere month away from today.
Ampere has been a frequent subject of leaks and rumors in the past several months. The code name hasn’t been publicly disclosed by NVIDIA in any of its previous roadmaps, however several reports have emerged alleging that Ampere will in fact power the next generation lineup of GeForce GTX graphics cards this year. In essence NVIDIA is said to be skipping a generation in the gaming market, Volta, in favor of Ampere. Whether Ampere is a stand-alone leapfrog architecture or a tweaked version of Volta that’s tuned specifically for gaming is still unclear.
NVIDIA “Ampere” for Gaming, “Turing” for AI & Compute
Two key features of Ampere include TSMC's new 12nm manufacturing process and Samsung's new 16 Gbps GDDR6 memory. The first Ampere GPU that NVIDIA is expected to announce next month is GA104, a replacement for GP104 (GTX 1080/70) in terms of chip size and market positioning, and a replacement for GP102 (GTX 1080 Ti) in terms of performance.
So, one could reasonably expect a pair of products with pricing similar to the GTX 1080/70 and performance similar to the GTX 1080 Ti/ Titan Xp. These will be the purported GTX 2080/1180 and GTX 2070/1170 cards. According to previous reports, GP102 has already entered end-of-life and is no longer being manufactured, while production of GA104 has been well underway since January.
I would also be remiss if I didn't mention that the naming scheme of the GeForce GTX Ampere series has yet to be confirmed, and that "2080" and "2070" are simply placeholders for now. Whether the company will follow a 20 series or an 11 series naming scheme is yet to be determined. Although, with the series coming out in mere weeks, we shouldn't have to wait too long before we know for sure.
English computer scientist, mathematician, logician and cryptanalyst Alan Turing
On the compute front the company is expected to debut an entirely new graphics architecture called Turing that’s specifically designed and optimized for AI and machine learning applications.
Previously we’ve seen the company use the same architecture with chip specific tweaks and optimizations for different markets e.g. FP64, HBM and other component configurations that would differentiate the gaming products from the compute products.
If what we’ve been hearing is accurate, then 2018 will be the first year where NVIDIA is going to debut two distinctly different graphics architectures for gaming and compute rather than just tweaked chips based on the same architecture.
GTC 2018 will officially kick off on March 26th and end on March 29th; the announcement will likely come during Jensen's keynote speech. Until then, remember to take this rumor, like any other, with a grain of salt.
-
Robbo99999 Notebook Prophet
As you can see from those two articles, the names for the gaming architecture vs the compute/professional architecture have swapped places (Ampere / Turing) - I think that's strange, and it casts doubt on the solidity of these rumours. The launch dates have also changed between the two articles (from the same website). I'm of the opinion that these rumours are not solid and that new cards are not quite as near to launch as the articles suggest (i.e. the March-to-Summer time frame mentioned). Having said that, NVidia have already stopped production of Pascal - I'm sure I read that - and they can't just run out of stock and not sell anything, so that would marry with a March-to-Summer launch of the new architecture. EDIT: just googled it right now: GP102, which is used in the GTX 1080 Ti, has stopped production, so they're probably still making GP104 for the GTX 1080 - although it's strange that they stopped GP102 unless something new is due to come out. Ha, I'm flipping between what to believe, but I'll hedge my bets and say that the new gaming architecture will be launched sometime this year - gotta be at least sometime this year! -
I'll just wait and see, I can't expect what will happen with all of that gray conflicting information.
-
I really hope that the new cards will not cost an arm and a leg, and that they will be something like the typical 1080 being 30% quicker than the current 1080ti
-
Well, actually, they didn't really switch names for the lineups!
Think about it: Turing continues the trend of Pascal, both being famous scientists in their respective fields.
As for Ampere, it sticks with a unit of electricity, making it the successor to Volt(a).
So depending on how you interpret the names, what the rumours suggest can actually make sense!
Sent from my Xiaomi Mi Max 2 (Oxygen) using Tapatalk -
Robbo99999 Notebook Prophet
Ha, yeah, but only if the initial rumours didn't explicitly contain information showing one name to be related to gaming and the other to compute - in which case it was up to 'us' to determine which name went with which market. (I don't know the original source of the rumours.) -
Robbo99999 Notebook Prophet
This is interesting: after reading this article ( http://www.guru3d.com/news-story/nvidia-announces-rtx-technology-a-raytracing-api.html ) it sounds like the next generation of NVidia gaming GPUs is gonna include tensor cores - i.e. will be Volta, like the Titan V. There had been quite a bit of thinking that the gaming versions would not include tensor cores and would be named as a different architecture, but this article explicitly mentions the use of tensor cores to accelerate ray tracing in future games on the next generation of gaming cards (Titan V onwards).
-
It's unlikely those Volta + Tensor GPUs would be bought by gamers; they are $3000 or more.
IDK what a 1060 or 1070 + Tensor GPU would look like, or cost, but it seems unreasonable to ask gamers to purchase special GPU hardware for just a developer-API ray-tracing feature added to a game. That would be like charging extra for HairWorks hardware, which, in a way, Nvidia does.
But the cost increase for Volta + Tensor is insane.
It still seems like Volta + Tensor is firmly in the developers' realm. -
Robbo99999 Notebook Prophet
The article talks about the ray tracing being an end user feature, ie running in real time on gaming GPUs, on the next generation of gaming cards - so they're not gonna be the $3000 cards that you're talking about, just the regular Geforce cards, i.e. the next generation of GTX 1070/1080/1080ti cards.
I kind of agree with your middle paragraph, but it's normal in a way for new hardware and new games to require new hardware for best experience.
The main takeaway from the article, though, is that the next generation of gaming cards will have tensor cores - which heavily suggests Volta rather than the supposedly tensor-less Ampere/Turing proposed for the next architecture. -
Yeah, I know that Nvidia will push for Raytracing + Tensor hardware in games to try to elevate this needless feature to gamer status to lock in buyers, but hopefully it will fail.
It would have to be in every game, and I can't believe it would be feature that would spur gamers to spend $3000 for a GPU.
Screw PC gaming if that's the case, I'll get a damn console and enjoy gaming at 30 FPS
-
And I was done 2 weeks ago... more will follow suit.
Bought an Xbox One S and a killer 4K QLED TV.
So far having more fun... next up... Xbox One X. -
A Console That Can Beat A Gaming PC? - WTF!
Published on Mar 22, 2018
Could A Console Take On The So Called PC Gaming Market?
$60,000 worth of GPUs power Unreal Engine ray tracing demo
We all know this system could easily run Crysis!
By: Anthony Garreffa | Gaming News | Posted: 2 hours ago
https://www.tweaktown.com/news/6129...wer-unreal-engine-ray-tracing-demo/index.html -
cj_miranda23 Notebook Evangelist
So basically what he wants is a console that acts like a PC for $499 max?
Volta: NVIDIA's Next Generation GPU Architecture (2017-2018)
Discussion in 'Gaming (Software and Graphics Cards)' started by J.Dre, Aug 14, 2016.