What's the highest power draw you have been able to generate on the 1080 so far? Have you hit 180W+?
Anyone else?
-
-
Oh I'm well aware of the clock wall, it's just asinine to think these chips are going to be in Clevo machines...
-
mason2smart Notebook Virtuoso
Yea, but it weighs like 5.5 kilos and needs 2 PSUs -
mason2smart Notebook Virtuoso
They could make one 600-watt PSU...
-
mason2smart Notebook Virtuoso
Those 330-watt PSUs are quite old -- companies have been using them for a while, and they could probably be redesigned to be smaller and carry more wattage. -
Not going to happen when it would require a whole new power brick. Much cheaper to use two currently existing bricks. Not to mention that a 600W brick would not fit in any backpack, while 2x330W do if you have a big enough one (my Targus XL can fit two just fine).
-
It's a balancing act to provide that kind of power, yet provide portability and use away from the Hydro-cooler.
Splitting the power across 2 power supplies lets you carry a relatively light 330w power supply in your backpack, and leave the other one plugged in to the Hydro-Cooler.
Away from the cooling you won't want to or be able to run at full OC on just air cooling, so the single 330w PSU will suffice.
Off the Hydro-cooler it's going to run at 980M SLI speeds, like your GT80S, hopefully with some OC capability - a single 330W PSU was enough for my stock vbios/BIOS OC'ing SLI 980Ms.
Hopefully other makers will come out with similar setups to give desktop 1080 SLI performance when at home, and detuned performance on a lighter carry-around PSU when away from home.
There are physical limits for power distribution, we still have huge transformers for power distribution even after 100+ years of optimizations.
I don't think a good, reliable 660W PSU would be small enough to carry around easily in your backpack.
-
I built my PC with a 1000W power supply for single-GPU solutions only, for a good reason.
I am expecting 350-450W load on the CPU and 400-500W load on the GPU. -
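A quick back-of-the-envelope check of the wattage estimates above (a sketch in Python; the ranges are the poster's own guesses, not measured values):

```python
# Rough sanity check of the PSU sizing above. The wattage figures are the
# poster's estimates for a heavily overclocked CPU and GPU, not measurements.
cpu_load = (350, 450)   # expected CPU draw range in watts
gpu_load = (400, 500)   # expected GPU draw range in watts
psu_rating = 1000       # chosen PSU capacity in watts

worst_case = cpu_load[1] + gpu_load[1]   # 950 W before drives, fans, mobo
headroom = psu_rating - worst_case       # 50 W left over

print(f"worst case: {worst_case} W, headroom: {headroom} W")
```

At the top of both ranges the 1000W unit has only about 50W of margin before counting drives, fans, and the motherboard, which is why a smaller supply would be cutting it close.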
Put tapes around 2 330W ones and you're good to go.
-
It looks like a miscommunication, it's really 970M SLI...
"UPDATE: The Aorus X7 Pro actually features dual GeForce GTX 970M in SLI, not dual GTX 1080M, and the 17-inch display runs at 3K 120Hz, not 4K 120Hz. A GIGABYTE representative reached out to us to confirm the specs, and apologies for the mix up--we were going off of reports from another source." -
What on earth are you building that will draw that much wattage with a single GPU?
Can you post your parts list? Plz
Also, your drives & mobo will take up some wattage as well.
I've got a 750W unit in the build below, but that was only in case I'd want to go dual GPU at some point. It only draws 470W at max stock clocks, and even OC'd the wattage wouldn't go up that much.
http://uk.pcpartpicker.com/list/GL7Nnn -
mason2smart Notebook Virtuoso
With so many manufacturers pulling over 330 watts, it would make sense to create a new PSU. The one they are using was probably designed 5 to 6 years ago and could be made more portable. For example, look at the PSU provided with the new EVGA SC17.
-
I am expecting to overclock a GP100 eventually, possibly on water. I know a heavily stressed GM200 can easily hit 400-500W overclocked.
5960Xs are notoriously power-hungry overclocked (expecting 400W), and yes, I know.
http://ca.pcpartpicker.com/list/PvdhPs
You will be surprised how much power GM200 takes with an unlocked vbios. Also remember, 6/8-pin connectors are capable of much more than they are rated for. -
You could always go all out like this
http://pcpartpicker.com/list/YhFD4C
-
I am waiting for LGA 3647 with Skylake-E. Hexa-channel memory/Optane is worth the upgrade. A 12-core LCC die is also very possible, since I know HCC Skylake-E is around 28 cores.
Also 48 native lanes. Perfect for native x16 SLI plus PCIe SSDs. And my 5960X is less than half of what a 6950X costs, for 80% of the performance. -
I was just joking LOL. I would never suggest someone get a machine like that... PCI-E SSD of all things...
-
That was my original plan, except I had to cut budget for the GP100 fund LOLOL. I also plan on significantly expanding my display real estate past the single 34-inch ultrawide.
-
Thinking of leaving the U3415W as my main screen and getting a 100hz+ 1440p gaming monitor.
Or screw that and get another 40-inch 4K monitor. Because TV=monitor. Obviously. -
TVs make horrible monitors... So many apps don't scale. 150% scaling is the only way this thing is usable for text, but it works great for games, which was the main reason I built this desktop anyway, so I guess it's not all doom and gloom.
-
Oh boy, that system looks like what he originally wanted xD
-
It's a 4K monitor at 40 inches, so it's around 110 DPI. Fairly standard.
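The ~110 DPI figure is easy to verify: pixel density is the diagonal resolution in pixels divided by the diagonal size in inches. A quick Python check:

```python
import math

# Pixel density of a 40-inch 4K (3840x2160) panel.
# DPI = diagonal pixel count / diagonal size in inches.
width_px, height_px = 3840, 2160
diagonal_in = 40.0

diagonal_px = math.hypot(width_px, height_px)  # ~4405.8 px diagonal
dpi = diagonal_px / diagonal_in

print(round(dpi, 1))  # ~110.1
```

For comparison, a 27-inch 1440p monitor works out to roughly the same density, which is why text looks "fairly standard" on a 40-inch 4K panel at 100% scaling.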
-
D2 knows about my insane shenanigans. And how my current system is the budget version.
-
mason2smart Notebook Virtuoso
Unless it's a 65-inch 4K 3D Sony smart TV at 120Hz.
This is kind of getting off topic... -
At 1.025V and 2.025GHz, nVidia reports a TDP of 120% at times when running FS 1.1. Haven't actually measured it physically though.
-
It's gonna be hard to run that chip past 250W...
It really looks like Pascal overclocking is extremely sensitive to temperature, not voltage. We already saw this with GM200, where adding voltage doesn't help much, but lowering temps to 30°C or below does help with stable clocks. Hence my interest in hybrid and possible WC blocks later.
LN2 overclocking achieves around 2.4GHz at the 1.25V lock point, I believe, whereas on air/water it's around 2.1GHz. -
Mine is a bit more modest... http://pcpartpicker.com/list/fNBBD8
They didn't have the Maximus VI Formula on the part picker which is what I have, not the Extreme.
Don't know if it's worth going through all that... And man, G-Sync monitors are stupid expensive... -
Fair enough, the 5960X draws ridiculous amounts OC'd. That will be a cool rig.
-
Why do you need a DVD drive?
And a Dell 144Hz monitor???? Dell UltraSharp is the only one I am willing to buy.
I am looking at Korean 1440p panels. Not a huge fan of G-Sync, TBH. Nice, yes, but worth the tax? -
It looks like even with water cooling, updated vbios controls, and careful tuning, 2050MHz is the best you can get, which matches FE air-cooled results.
There have been some recorded maximums over 2100MHz, but they don't hold while gaming.
Most get clocks in the 1950MHz-2050MHz range while gaming. -
I heard of a 2.2GHz upper range. It's very possible.
If the top OC is 2050MHz, the 1080 flat out doesn't interest me, not one bit. -
There seem to be one-offs that reach 2100-2200MHz, but I don't think they are game-stable either. They are just "OMG!! Look at that!!" results, fleeting and not usable.
OCs over 2050MHz don't seem to translate into FPS improvements, as Boost Clock 3.0 takes over and modulates the clock.
It's not a bad OC for a first Pascal board; it should hold us until the 1080 Ti comes along.
-
Hence I am waiting for a Pascal BIOS editor, where you can manually force every boost bin to the clock/voltage you want.
As in, you can disable boost completely and lock in 2.2GHz. I hate boost clocks, and I wish I could lock voltage and clock to be consistent in every scenario, from idle clocks all the way to max overclock. -
A Blu-ray burner would be nice to have.
The Dell was cheaper than the ROG Swift and has better reviews. Maybe I will just skip G-Sync... saves a LOT of money.
I also edited the FE pricing down to the 599.99 that I would be willing to pay.
Think it's worth keeping my current machine and upgrading it like that, though? I'd love to have an NVMe drive, but unfortunately my board only supports M.2 NGFF drives.
-
Skip the Blu-ray, you have a PS4.
Skip G-Sync and look into an ultrawide, imo. Unless you are a pro gamer.
Spend a bit more on a better-cooled 1080.
Honestly not crazy about NVMe. I need reliable SSDs, I don't need the fastest SSD. -
For the new Broadwell-E I would go for a new MB, probably one issued in conjunction with the Broadwell-E release. There are a few at the top end that add PCIe lanes beyond those offered by the new CPUs, supporting simultaneous use of extra PCIe devices while keeping use of SATA III ports.
I haven't been keeping track of the specific models - Titanium comes to mind, as I saw them mentioned during the 1080 GPU coverage videos I have posted. -
You want a Blu-ray 4K reader at least for the new releases, but I have found the BD writers are just too slow for such large amounts of data.
The rest I agree with.
-
PLX chips can be troublesome, imo. And he is still staying on LGA 1150.
Edit: less trouble, more not worth it anymore on HEDT. You have 40 lanes. And multi-GPU beyond 2 cards is dead. -
Ultrawide eh? Hmm... I'll look into it.
The Blu-ray burner is for burning, not for watching. I would of course use my Xbox or PS4 for that. There are times that I miss having an optical drive, as stupid as that sounds in the age of USB sticks that rival SSD speeds, but it is what it is.
I want a hybrid cooling solution, probably from EVGA, if for nothing else but keeping the noise down.
I'm still not sure if it's going to be a 1070 or a 1080 though. Maybe I'll just get the monitor in August when the credit card is paid off, then go from there. The 780 Ti is still serving my needs quite well - actually better than it ever did before - so I suppose I can wait a bit. Not sure about the gaming keyboard and mouse though. I mostly just use the Xbox controller lol.
Better to hold off - AMD might pull something out worth picking up, and even if they don't, it should drop nVidia's prices.
-
So far, it doesn't look like it. Polaris is a disaster.
-
I'm still holding out hope for Vega, but I doubt Vega is going to be out when I'm ready to buy in September.
-
Well, they have only announced the RX 480; there should be a few cards above this, and as I said, it should bring down nVidia's prices.
-
Polaris doesn't overclock well so I doubt we will be seeing much more short of driver optimizations.
nVidia dropping their prices doesn't seem likely either, considering that the 1080 smashes the RX 480 in a benchmark that has been optimized extremely well for AMD's GCN architecture, specifically async compute.
mason2smart Notebook Virtuoso
Talking about async, does Pascal do async compute in hardware, or is it still emulated via software? -
Here's the problem that I have with this. AMD chose to use the benchmark anyway. They could have used another benchmark if they were that worried about it but obviously they chose not to which tells me that in other scenarios, nVidia would have smashed the cards. AOTS has always been an AMD optimized title and while I can see the difference now that its pointed out, that particular performance boost still wouldn't have cost nVidia much in a performance hit.
Short answer - no.
http://www.mobipicker.com/nvidia-gtx-1080-directx-12-benchmarks-async-compute/
nVidia simply didn't have the time to implement what AMD has been doing for years. AMD had a game plan, a very well executed game plan, and both DirectX 12 and Vulkan are built off of AMD's work in async compute. We won't see full async compute ability until Volta. -
Pascal does it at the pixel level now; the previous implementation worked a different way, maybe per frame I guess, but it's still not full HW async. Pascal can handle async with its higher clock speed, and Maxwell could do it too, just not very well. Still, the Fury X can't beat Maxwell in DX11. Polaris aiming at the low end is like surrendering without a war, always meh (they seem happy with the console partnerships). Hope Vega brings some true stuff against the beasts GP100/102, else RIP AMD.
On a side note, nVidia cards pull ahead on tessellation quality in games, while AMD's approach is more aimed at numbers. I prefer nVidia's way - that was a long time back, idk how AMD does it now. -
Mine is about this http://pcpartpicker.com/user/D2ultima/saved/8Pw48d
That's IF I had to build my dream PC. Add a GTX 1070 to that, as well. Replace the Titan X cards with two GP100 pascal top end cards later, but those aren't out yet.
What are you talking about? Maxwell & Pascal can do Async just fine. Maxwell is limited to 31 queues + 1 graphics queue. AMD can do 64 queues on Hawaii and Fiji. Pascal should be better than maxwell, but apparently not much. Even AMD said that 64 queues should be too much for a game.
That's what annoyed me even more, when they used that game to supposedly push their cards' prowess. Because they know that's a special case.
Pascal: What do we know? Discussion, Latest News & Updates: 1000M Series GPU's
Discussion in 'Gaming (Software and Graphics Cards)' started by J.Dre, Oct 11, 2014.