The Notebook Review forums were hosted by TechTarget, who shut them down on January 31, 2022. This static read-only archive was pulled by NBR forum users between January 20 and January 31, 2022, in an effort to make sure that the valuable technical information that had been posted on the forums is preserved. For current discussions, many NBR forum users moved over to NotebookTalk.net after the shutdown.
Problems? See this thread at archive.org.

    AMD Vega is here...

    Discussion in 'Desktop Hardware' started by HTWingNut, Aug 14, 2017.

  1. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    You see, if you had bothered to look at the updates, you'd know that GN, as per usual, are spreading rumours that were disproven by AMD several days later. AMD's supplies ran out quicker than expected, so prices went up. It's not a price hike; it's supply and demand. Original prices will be restored shortly.

    Well, most people who don't care about tweaking don't care about power draw either, wouldn't you agree? The card DOES work out of the box, and fiddling with the voltage is only for people who care about that; the majority of users don't, because 50 W over or under doesn't matter. That's $5-10 yearly at most. Irrelevant.
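    A rough sketch of that math (my own assumptions, not from this thread: about 3 hours of gaming per day and ~$0.12/kWh):

    ```python
    # Yearly cost of an extra 50 W of GPU power draw.
    # Assumptions (not from the thread): 3 h of gaming/day, $0.12 per kWh.
    extra_watts = 50
    hours_per_day = 3
    price_per_kwh = 0.12  # USD

    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    cost_per_year = kwh_per_year * price_per_kwh
    print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year")
    # ~55 kWh/year -> ~$6.57/year, in the same ballpark as the $5-10 figure above
    ```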
     
    hmscott likes this.
  2. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    TL;DR - blame cryptocurrency miners for the prices.
     
    hmscott likes this.
  3. Talon

    Talon Notebook Virtuoso

    Reputations:
    1,482
    Messages:
    3,519
    Likes Received:
    4,695
    Trophy Points:
    331

    Direct from the PC Gamer Vega 64 review.

    "Pascal versus Vega meanwhile isn't even close in performance per watt, using over 100W (more than 60 percent) additional power for roughly the same performance."

    "It's almost laughable how much power you can draw through the RX Vega 64. In Rise of the Tomb Raider, the system used 585W, and I noticed a few peaks in other testing of over 600W. To put that in perspective, you can run a pair of GTX 1080 cards in SLI and still use less power. The RX Vega 64 in my maxed out overclock is drawing well over 350W of power, all on its own.

    These are direct quotes from their review. I thought I'd quote someone else since you said my previous reviewer got it wrong when talking about power consumption. To some, power draw does matter, as they may be building a mini-ITX system, and all that heat has to go somewhere and will cause other components in the system to heat up as well. Just because the core is running cool doesn't mean the rest of the system and the room don't have to absorb all the heat produced by the card's huge power draw.

    Also, GN got a follow-up answer from AMD concerning their statement on prices and additional stock shipping soon.

    They said they cannot confirm the prices of the new stock because they cannot control what retailers do. They could have easily put the rumors to rest by simply stating "our MSRP is xxx and retailers are selling it at whatever they deem the market demands." Instead they danced around the question again.

    Regardless of who is setting the prices or what its actual MSRP is, the bottom line is that it isn't as cheap as they stated, and therefore it is not the "new king under $500".
     
  4. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    To be fair - AMD cannot be held accountable if Amazon decides to sell Vega for $1,000,000.

    Regardless, I am talking strictly Vega 56 vs 1070. Vega 64 is simply NOT good. Vega 56, however, is clearly and objectively superior. Nearly all reviewers came to that conclusion.

    Might I add - PC Gamer also recommended people buy i3s over the G4560 at one point. I mean, I personally don't follow them for hardware, but to each their own.
     
    Vasudev and hmscott like this.
  5. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    The Vega 64 is great for a lot of things, except gaming.
     
  6. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    If you mean frames per watt - yes, it simply cannot compete. If you mean raw performance, it's just as good as a 1080 in gaming.
     
    Vasudev and hmscott like this.
  7. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    Frames per watt exactly.

    Although it doesn't mean much in the grand scheme of things, you would still need an adequate PSU to drive it.


    Oh well. V56 is the way to go for now, if gaming is all one cares about.
     
    Vasudev and don_svetlio like this.
  8. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    The extra draw on an outlet might be an issue, depending on the age of your wiring and what other devices are plugged into the same circuit or outlet.
     
  9. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    Correct me if I'm wrong, but shouldn't most outlets be capable of delivering 1 kW?
     
  10. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    If it's modern and up to code, yes. Also, the total circuit load does have an effect on outlet performance. At my last apartment, when I powered up the rig or turned on a vacuum, the lights would flicker.
     
    don_svetlio and hmscott like this.
  11. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    It depends on the wiring (gauge) and circuit breaker, but with a 15 A breaker it's about 1,500 W continuous usage (1,800 W peak), with a 20 A breaker about 2,000 W continuous (2,400 W peak), and higher-amperage breakers with heavy-gauge wiring can exceed 5,000 W (quick calculation sketch at the end of this post):

    120V (US) (@ 80% max load)
    (50ft run or less)

    Gauge Amps Watts
    #16 9 1080
    #14 12 1440
    #12 16 1920
    #10 24 2880
    #8 32 3840
    #6 40 4800
    #4 48 5760

    Basic wire sizing guide for US 120 and 240 volts
    https://www.icmag.com/modules/Tutorials/ElectricalSafety/1655.htm

    Choosing the Proper Gauge of Wiring for Your Next Project
    http://www.ebay.com/gds/Choosing-th...r-Your-Next-Project-/10000000177634232/g.html
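    For anyone sanity-checking a circuit against a rig's draw, here's a minimal sketch of the 80% continuous-load rule behind the numbers above (assumes US 120 V circuits; the breaker ratings are just examples):

    ```python
    # Continuous vs. peak load for a US 120 V branch circuit,
    # using the standard 80% continuous-load derating.
    VOLTS = 120

    def circuit_capacity(breaker_amps: float) -> tuple[float, float]:
        """Return (continuous_watts, peak_watts) for a given breaker rating."""
        peak = breaker_amps * VOLTS
        return 0.8 * peak, peak

    for amps in (15, 20, 30):
        cont, peak = circuit_capacity(amps)
        print(f"{amps} A breaker: ~{cont:.0f} W continuous / {peak:.0f} W peak")
    # 15 A -> ~1440 W continuous / 1800 W peak
    # 20 A -> ~1920 W continuous / 2400 W peak
    ```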
     
    Last edited: Aug 30, 2017
    don_svetlio likes this.
  12. Vistar Shook

    Vistar Shook Notebook Deity

    Reputations:
    2,761
    Messages:
    1,256
    Likes Received:
    1,362
    Trophy Points:
    181
    Vega might not be that competitive in the gaming market, or efficient for that matter, but after some tweaks it excels at mining, potentially achieving a 43% higher hash rate with a meager 13% increase in power consumption compared to Polaris. Supposedly, anyway, since it is hard to believe that someone managed to get such a high hash rate at such a low power draw from Vega. So although it sucks that mining is putting pressure on prices, at least Vega will be selling relatively well, and perhaps the extra revenue will help AMD stay in the game for the next generations of GPUs.

    https://www.techpowerup.com/236748/rx-vega-achieves-43-mh-s-130-w-in-ethereum-mining
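    Rough math on that claim, using the 43 MH/s at 130 W figure from the linked article and the +43% / +13% deltas quoted above (the Polaris baseline here is back-calculated from those percentages, not a measured number):

    ```python
    # Implied Polaris baseline and per-watt gain, back-calculated from
    # "+43% hash rate for +13% power" and the 43 MH/s @ 130 W Vega figure.
    vega_hashrate = 43.0   # MH/s (from the TechPowerUp article)
    vega_power = 130.0     # W

    polaris_hashrate = vega_hashrate / 1.43   # ~30 MH/s implied
    polaris_power = vega_power / 1.13         # ~115 W implied

    vega_eff = vega_hashrate / vega_power           # ~0.33 MH/s per W
    polaris_eff = polaris_hashrate / polaris_power  # ~0.26 MH/s per W
    print(f"Vega:    {vega_eff:.2f} MH/s/W")
    print(f"Polaris: {polaris_eff:.2f} MH/s/W")
    print(f"Per-watt gain: {vega_eff / polaris_eff - 1:.0%}")  # ~27%
    ```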
     
    Last edited: Sep 4, 2017
    CedricFP and hmscott like this.
  13. CedricFP

    CedricFP Notebook Evangelist

    Reputations:
    49
    Messages:
    517
    Likes Received:
    298
    Trophy Points:
    76
    God, mining is such a waste of power; it's basically putting a price on entropy.

    Slowly but surely we're contributing to the entropic heat death of the universe. Our kids should be so proud.

    It'd be nice if proof of work wasn't the only trusted mechanism (and if proof of stake wasn't a rich-get-richer mechanism) because there has to be a better way than dumping heat constantly into the atmosphere.
     
    hmscott and Vistar Shook like this.
  14. Vistar Shook

    Vistar Shook Notebook Deity

    Reputations:
    2,761
    Messages:
    1,256
    Likes Received:
    1,362
    Trophy Points:
    181
    Perhaps the next step in China will be to harness the heat from the mining farms for something useful, such as heating homes... or cannabis grow rooms... haha.
     
    Last edited: Sep 4, 2017
    CedricFP and hmscott like this.
  15. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    Use it to power a Stirling engine, get a little back. :)
     
    Vistar Shook likes this.
  16. CedricFP

    CedricFP Notebook Evangelist

    Reputations:
    49
    Messages:
    517
    Likes Received:
    298
    Trophy Points:
    76
    Haha, yeah, but we can't centralize the heat generation. Now, what we *could* centralize is the processing power harnessed by miners all over the world.

    Instead of mining completely useless blocks of information, that compute could contribute to something beneficial like protein folding or, heck, even SETI.

    Just about anything actually useful is better than wasting all of this power and GPU "effort" on solving useless bits of information, all in the name of greed.

    The power miners as a whole waste, and the carbon footprint they contribute, is actually jaw-dropping.
     
  17. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Okay, does anyone here know if the Vega Frontier Edition and the RX Vega 64 are the exact same GPU? Here's why I ask...

    My friend said the hashcat developer says the two GPUs are different and that Vega FE is not "true Vega". This does not make any sense to me. They perform the same in games (as long as you use the latest drivers for both and increase the fan curve on the FE).

    I can't find any info on any differences. He said the hashcat developer said this in IRC. Would love to see some evidence either way.
     
  18. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Perhaps it was aliens that introduced crypto-mining to the Earth, to siphon compute cycles away from the Search for Extraterrestrial Intelligence... :D
     
    Vistar Shook likes this.
  19. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    Same hardware, different drivers.
     
    Vistar Shook likes this.
  20. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    Vega FE is basically the beta version. Assuming all features are unlocked, it should be the same, though I'm not certain whether every single feature is available.
     
    Vistar Shook likes this.
  21. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    I've held a similar position for a while, but value is, as always, in perception, and as long as there's that bubble of people who want it to have value and back it up, it's not going to change.
     
  22. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    Apparently Vega FE doesn't have the same instruction sets as RX Vega. RX Vega has more and performs better in certain compute tasks, but I have yet to see AMD or anyone else state that they are different and that Vega FE is missing some features.
     
    Last edited: Sep 6, 2017
    don_svetlio likes this.
  23. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    Also 16GB of HBM2 memory on the FE.
     
  24. ssj92

    ssj92 Neutron Star

    Reputations:
    2,446
    Messages:
    4,446
    Likes Received:
    5,690
    Trophy Points:
    581
    That's just memory; it shouldn't make a difference. I'm more concerned about the actual GPU die being different internally.

    I'm just going to assume they're the same; I don't know why the hashcat developer is claiming they're different GPU chips (my friend told me this).

    It's like saying Vega FE is GP100 and RX Vega is GP102 in nVidia terms.

    I found a good deal on a Vega FE and didn't want to be surprised a month or two down the road to find they're different. That's why I'm questioning all this now. :D
     
    jaug1337 and hmscott like this.
  25. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    Memory which costs almost as much as the GPU itself :D
     
  26. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    For now it is; get a big enough production run and the cost will come down.
     
    Vasudev and hmscott like this.
  27. jaug1337

    jaug1337 de_dust2

    Reputations:
    2,135
    Messages:
    4,862
    Likes Received:
    1,031
    Trophy Points:
    231
    And HBM3 is right around the corner.

    More HBM2 vendors are about to enter the market (supposedly), so the price should be driven down.


    On top of all that, AMD is sucking the HBM2 market so dry that, on top of the already steep prices of the Vega cards, HBM2 prices are also slowly rising!
     
    Vasudev likes this.
  28. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    If that's right and more manufacturers are jumping on board, that's a good sign.
     
    jaug1337 and Vasudev like this.
  29. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,375
    Trophy Points:
    431
    The last time I cared about power consumption outside of laptops was X58; we are way past that era.

    My personal opinion is that people who want something in the higher end (people debate what counts as high end) and don't wish to support Nvidia's practices have an option now.

    Though the miners will likely soak them all up before any target-market consumers can get them.
     
    hmscott and Vasudev like this.
  30. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    It's still a lot of extra heat being dumped into the room. And it probably wouldn't be as big of a deal if the gap with Nvidia weren't as large as it has ever been, historically.
     
    Vasudev likes this.
  31. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,375
    Trophy Points:
    431
    Having had a W3520 @ 4.2 GHz and a tri-SLI GTX 470 system back in the day, I know full well about heat dump. However, I doubt I would be able to perceive the difference if I shut off a GPU, since the issue is circulation.

    I open a door if it gets too warm. Even my laptop can heat up the room after some time.
     
    Vasudev and hmscott like this.
  32. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    I still find it funny how upset people are now but when the 580/480 were using literally 2x the power of their competitors, nobody cared.
     
    Vasudev and hmscott like this.
  33. hmscott

    hmscott Notebook Nobel Laureate

    Reputations:
    7,110
    Messages:
    20,384
    Likes Received:
    25,139
    Trophy Points:
    931
    Are you sure? I thought there was that big hullabaloo with the RX 480 reference edition pulling too much power through the PCIe slot?

    There were people / reviewers that complained about the power draw, but RX Vega 64 really goes further into power extremes than the RX 480 / RX 580 ever went.

    Really, there needs to be an AMD OC update that recommends limiting power to a point of sanity, even if there are small performance gains from going further.

    I'm glad AMD didn't lock it down, but leaving it unlocked is resulting in reviewers and owners blowing the doors off by over-powering these cards.

    Limiting the power to a reasonable level will cost some performance numbers, but I don't think gamers are really going to miss a few FPS if they can cut power usage by 100-200 W. :)
     
    Vasudev likes this.
  34. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    People still bought more 480s than 5870s. Why? Cause Nvidia.
     
    Vasudev and hmscott like this.
  35. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    LOLwut

    5870 (188-228W) vs. 470 (215W)

    6970 (250W) vs 570 (219W)
     
    Vistar Shook and Vasudev like this.
  36. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,375
    Trophy Points:
    431
    The solution is still: open a door or a window.

    If that's not feasible, then perhaps you shouldn't be looking at computers that dump enough heat to turn your room into a temporary sauna.
     
    hmscott likes this.
  37. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    5870s were never 220 W. They averaged 180-190 W under load. Meanwhile, GTX 480s were going as high as 350 W, and if you even touched the voltage they'd skyrocket into 400 W territory. Everyone basically agrees that the GPU power draw record is a tie between the Vega 64 OC and the GTX 480/580.
     
    Vasudev and hmscott like this.
  38. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    It gets just cold enough in the winter here to make me consider it. But not enough for me to actually do it.
     
    Vasudev and hmscott like this.
  39. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,375
    Trophy Points:
    431
    In the winter I would just crack the window a bit; you don't want to generate condensation, depending on your climate...
     
    Vasudev and hmscott like this.
  40. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    228W TGP is what AMD listed for the 5870 2GB and the Eyefinity Edition. In practice they didn't draw much more than the 1GB card.

    The 5870 and 6970 competed in the same price and performance tier as the 470 and 570, not the 480 and 580.

    The 480 didn't draw that much by itself; you're probably talking about total system draw. (Unthrottled FurMark might be a different story, but that's a bad example.)

    Nvidia's listed TGP was on the conservative side for Fermi (especially the 480, which was closer to 300W than its rated 250W), but the difference vs. the AMD competitors wasn't that big (comparison chart not preserved in this archive).


    Edit: Don't forget that the 480's draw was exacerbated by its poor reference cooler, which made it run extremely hot, something Nvidia fixed with the 580.
     
    Last edited: Sep 13, 2017
    Vistar Shook and Vasudev like this.
  41. don_svetlio

    don_svetlio In the Pipe, Five by Five.

    Reputations:
    351
    Messages:
    3,616
    Likes Received:
    1,825
    Trophy Points:
    231
    The 580 vs. the 6970, yes, the 580 was in a different class. The 480, however, was barely 5-10% faster than a 5870 and used a LOT more power; we're talking as much power as the dual-GPU 5970. The original 480s had horrible silicon and horrible binning because of the die size. Fermi 1 was a failure, a colossal one at that. Yet people still bought it. Why? Because branding is far and away more important than the product you are selling. (Supporting charts not preserved in this archive.)
     
    hmscott likes this.
  42. Carrot Top

    Carrot Top Notebook Evangelist

    Reputations:
    74
    Messages:
    319
    Likes Received:
    274
    Trophy Points:
    76
    But my point still stands. If you take away the outlier that was the 480 (the 580 was much better despite being a fully enabled chip) and compare the 5870 vs. the 470 and the 6970 vs. the 570, the difference wasn't as big as it is now. The V64 could be that outlier if it were actually faster than Nvidia's best, as the 480 was, but it's only as good as Nvidia's 3rd/4th best.
     
    Vistar Shook likes this.
  43. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,375
    Trophy Points:
    431
    I remember buying my GTX 470s hella cheap after the 5xx series launched. Though that was back when mGPU setups were actually worth it, unlike today.
     
    hmscott likes this.
  44. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    Probably before they got into short supply; I remember the last few cards from each of those series selling for disgustingly large amounts of money.
     
  45. Reciever

    Reciever D! For Dragon!

    Reputations:
    1,530
    Messages:
    5,350
    Likes Received:
    4,375
    Trophy Points:
    431
    I bought them used :). I tend to never buy new; it's hard for me to justify it.
     
  46. Support.2@XOTIC PC

    Support.2@XOTIC PC Company Representative

    Reputations:
    486
    Messages:
    3,148
    Likes Received:
    3,490
    Trophy Points:
    331
    A good plan, it got ridiculous there at the end.
     