AMD unveils its Vega 10 GPU architecture (tomshardware.com)
139 points by kevlar1818 on July 31, 2017 | 86 comments


One significant advantage AMD might have is that they have chosen to provide full-rate FP16 support:

http://www.tomshardware.com/news/amd-radeon-rx-vega-64-specs...

...unlike nVidia:

http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1...

While that's something that developers have to explicitly support, it will be interesting to see what happens when they do.
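Worth spelling out what "explicitly support" means in practice: code has to opt into half-precision storage and math. A minimal numpy sketch of the storage side (numpy only models memory footprint, not throughput; the doubled FLOP rate of full-rate FP16 comes from the hardware):

```python
import numpy as np

# Casting a weight matrix to half precision halves its memory footprint.
# On full-rate FP16 hardware it can additionally double arithmetic
# throughput over FP32; numpy can't demonstrate that part.
w32 = np.random.rand(256, 256).astype(np.float32)
w16 = w32.astype(np.float16)

print(w32.nbytes)  # 262144
print(w16.nbytes)  # 131072

# The cost: float16 carries only ~3 decimal digits of precision.
max_err = float(np.max(np.abs(w32 - w16.astype(np.float32))))
```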


At 21TFLOPS for 400 USD it has good potential in the machine learning field, yes.

AFAIK AMD is busy pushing out support for MIOpen into Tensorflow etc.


For machine learning, Tensor Cores in the upcoming Volta GPUs seem like a much better idea, delivering 120 TFLOPS. Although the price would surely be a multiple of 400 USD.


> For machine learning, Tensor Cores in the upcoming Volta GPUs seem like a much better idea, delivering 120 TFLOPS.

Yes, but the real-world benchmark numbers nvidia published show a much smaller speedup than one would expect if one looked only at the 120TFLOPS number.

This is because (a) "tensor core" is just a fancy name for "fast 4x4 matmul instruction" -- it's not applicable to everything you might do on the GPU, and (b) memory bandwidth did not increase commensurate with compute speed.
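For the curious, a rough sketch of that primitive in plain numpy rather than real tensor-core code: a fused multiply-accumulate on 4x4 tiles, with FP16 inputs and FP32 accumulation, which is how Volta's hardware performs it:

```python
import numpy as np

# Sketch of the tensor-core primitive: D = A @ B + C on 4x4 tiles.
# Inputs are FP16; the dot products accumulate in FP32 to avoid
# compounding FP16 rounding error.
A = np.full((4, 4), 0.5, dtype=np.float16)
B = np.full((4, 4), 2.0, dtype=np.float16)
C = np.ones((4, 4), dtype=np.float32)

D = A.astype(np.float32) @ B.astype(np.float32) + C
print(D[0, 0])  # 5.0 (four products of 0.5 * 2.0, plus 1.0)
```

Anything that doesn't reduce to batched small matmuls gets no benefit from the instruction, which is point (a) above.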

(I work on the XLA GPU compiler in TensorFlow.)


The Radeon Pro SSG is where it's at:

> "The Radeon™ Pro SSG takes it one step further by having 2TB of solid state graphics memory on board"

Just insane. In a good way.

https://pro.radeon.com/en-us/radeon-pro-wx-9100-and-radeon-p...


"Graphics memory" is probably not the right term here. While the Pro SSG does have 2TB of storage on board, it comes in the form of a NAND-flash PCIe SSD (from Samsung, IIRC). While fast, this SSD provides single-digit GB/s of bandwidth at most, compared to the hundreds of GB/s that DRAM-based GDDR5 or HBM provides. When we say "graphics memory," we're almost always referring to the latter, and we never see more than maybe 32 GB of DRAM-based memory on board.
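A back-of-envelope comparison of the two, with assumed figures: PCIe 3.0 at 8 GT/s per lane with 128b/130b encoding, and Vega 64's quoted 484 GB/s of HBM2 bandwidth:

```python
# PCIe 3.0: 8 GT/s per lane, 128b/130b encoding, 8 bits per byte.
lane_gbs = 8 * (128 / 130) / 8      # ~0.985 GB/s per lane
nvme_x4 = 4 * lane_gbs              # ~3.9 GB/s for an x4 NVMe link
hbm2 = 484                          # GB/s, Vega 64's quoted HBM2 figure

# HBM2 is roughly two orders of magnitude faster than the SSD link.
print(round(nvme_x4, 1), round(hbm2 / nvme_x4))  # 3.9 123
```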


Yeah, that seems like false advertising. I believe that much true HBM/GDDR5 would have to cost tens of thousands of dollars. So I was skeptical. That said: if I could get a card with that much real GDDR5 on it, I'd consider paying whatever it took.


I'm more curious about the latency and how it impacts performance; even with DMA, going over the PCIe bus to an NVMe SSD is a longer trek.


GPUs are generally not latency optimized anyway - if an app is sensitive to tens of microseconds then it belongs on the CPU, broadly speaking.

AMD's advantage here is drop-in throughput, from attaching the SSD directly to the GPU. The onboard SSD thus doesn't waste host PCIe lanes (typically 4), which are getting hard to come by in regular desktop computing systems.


Isn't the "2TB" NVMe SSD (in this case) directly interfaced to the card, so not going over the system PCIe bus?


It is, which is why I'm curious how this improves performance compared to exactly the scenario of hitting NVMe storage over the bus.


Indeed, the non-Pro variant only gets you 512GB of mechanical memory


"having 2TB of solid state graphics memory"

That made me chuckle. What else would it be? Vacuum tube graphics memory?


Dayum. Now if they would just port Thrust to OpenCL I'd be in business.


I am surprised this claim is buried in the article with no highlight whatsoever.


Who is this for then? I should just go Nvidia for my gaming rig?

The RX 480 last year had an amazing price/performance ratio. It was a very exciting product for the mid range gamer. It seems with Vega they are offering slightly worse than Nvidia's high end for slightly less money.

Better performance at the same price would have been compelling. Slightly worse at a bargain price would have been compelling.

Plus, you won't be able to buy these anyway because of mining. This is just sad. Maybe they should just remarket themselves as a mining company.


Slightly worse for slightly cheaper could still be attractive. Factor in the prices of FreeSync displays vs. G-Sync displays, and the price difference to Nvidia gets a lot bigger, in AMD's favor. But it depends on whether the cards will be available and what performance they actually reach.


Seeing as the Vega 64 "starts at" $500 and I was able to pick up a 1080 Ti for about $700, I don't see who the target audience is. If you're part of the market that's willing to spend $500+ just for a new graphics card, then you likely aren't going to get the subpar card just to save a few dollars.


If you already have a G-Sync display, you're kind of locked in. I had super high hopes for Vega (if only to apply downward pressure to NVIDIA's pricing), but ended up upgrading from a 980 to a 1080 Ti a couple of months ago as the launch dragged out.

With the numbers I'm seeing, I'm glad I chose the TI, and didn't wait. I don't see NVIDIA reducing their pricing over this.


Apparently Ethereum mining has become less profitable, due to an "ice age" and will become even less so in about 21 days: https://www.reddit.com/r/EtherMining/comments/6qqte9/difficu...


However there are plenty of other profitable altcoins that can be mined just as well with a GPU.


They may not be that attractive to miners, given the higher TDP than Nvidia's cards. Guess we'll see.


This is the biggest disappointment for me, blower Nvidia cards are basically unobtainium because of miners, I was hoping we'd get a second source for the Ethereum bubble boys.


I recently bought a reference design RX 480 and I can run 1440p at 60fps. Some games even run at 4k with 40-60fps. I paid $150 after rebate.


There are very few titles that will run 4K at 60fps. I had dual RX 480s (8GB variant) in CrossFire and could not get The Witcher 3 to stay over 30fps. Even Overwatch would not stay at 60fps.


Because you didn't tweak the settings. My RX 480 gets around 90 fps in Overwatch at 4K. All you need to do is to disable dynamic reflections and drop effects quality to low. (Keep models/textures at maximum.)


Reviewers: make sure to test mining performance of these cards. Not that we care about that, but to give us an idea of whether it'll be possible to actually buy these things.


AMD should just run their own mining rig at this point. Create a large multisocket board and plug all those GPUs in for a week of 'burn in' mining coins before shipping them out.


You think they don't?


AMD are actually selling a portion of the GPU stock in bundles so that they don't all get snapped up by miners.


But will it work, or will miners gladly pay the extra $100? AMD would love that -- miners won't redeem the games or the discounts, so that extra $100 is pure profit.


Is that how those bundles work? AMD only pays when the game gets activated/redeemed? I always thought they just buy a bunch of game licenses and pay upfront.


Miners might want to sell the games/discounts to recoup their losses. After all, if you get a $50 discount on a $200 monitor then you can just re-sell it new for $200 and get your money back.


Which is US-only. The only reason to buy is the bundle, which is a good deal. Same performance as the 1080 at the same price, with double the TDP, and stock will be limited. There is no reason to buy Vega as a gamer otherwise.


AFAICT the Vega "Frontier" card, which has been out for a little while (at a very high price), has similar mining output to a 1080 Ti.

The Vega Frontier board is clocked faster than the Vega 64 and has twice as much RAM, and more RAM bandwidth.

I'm going to take a guess that these are going to be slightly less attractive to miners than the Nvidia 1070. Wouldn't like to put money on it though.

--edit-- Power consumption on these will be the deciding factor, possibly. Nvidia seem to have a far better handle on that lately.


They do spend a bit of the article explaining AMD's new bundling deal, which looks to be explicitly anti-miner.


Slide deck mentions cryptocurrency specific instructions.


Unlikely considering 275W+ TDP


Yes please, so 1080ti cards get their prices down to normal and I can buy one.


Vote me down, but Amazon.de increased prices by 30% due to Ethereum miners' demand, to > $1050.


That may be, but currently it is at 760€, and a bit cheaper for worse models. https://www.amazon.de/GIGABYTE-GeForce-1080TI-Gaming-GDDR5X/... as example.


That card had a 650€ price mid-June before miners bought up everything, but yes 760€ is already down from the 820€ peak.


Vega is a failure, aka Bulldozer 2.0: same performance as the 1080, a year later, with double the TDP. Only 30% higher than the Fury, which was their high-end card from 2 years ago. This whole product launch was terrible from AMD, from laughing at Nvidia with "Poor Volta" to "blind tests" to no live streaming of the SIGGRAPH event.


That is a very pessimistic view.

The GTX 1080 is $550; Vega 56 is $400 and Vega 64 is $500. With decent cooling, the Vega GPUs can be overclocked for a significant boost. Also, FreeSync monitors are cheaper than G-Sync. Vega 56 + FreeSync will save you around $350. That is a ~25% saving if you are going for a $1.2-1.5k rig.
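A quick sanity check of that saving, taking the GPU prices above and assuming a hypothetical ~$200 G-Sync premium over a comparable FreeSync monitor (the premium varies by model):

```python
# Prices from the comment; the monitor premium is an assumption.
gpu_saving = 550 - 400                # GTX 1080 vs Vega 56
monitor_saving = 200                  # assumed G-Sync vs FreeSync premium
total = gpu_saving + monitor_saving   # 350

rig = 1400                            # midpoint of the $1.2-1.5k range
print(total, round(total / rig * 100))  # 350 25
```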


With decent cooling, the Vega GPUs can be overclocked for a significant boost.

I don't think so. The article touched on the fact that the watercooled model already needs a disproportionate ramp-up of power to ramp up clocks. That says to me that it's eating into the card's headroom and will soon run up against its voltage wall, which suggests they're already pushing the card pretty hard to hit stock performance levels. That doesn't leave a lot on the table for overclockers.


The GTX 1080 is actually $500 MSRP. The Vega overclocking is surely inferior to the GTX OC. The OP is unfortunately right I think, trying to enter the Radeon ecosystem at this moment is foolhardy.


What makes you believe there is any room to overclock the Vega cards? And how much additional cost does "decent cooling" add?


Vega is already drawing near 300W, and is so high up on the voltage/frequency curve that even a measly 5-10% gain in core clock can easily cost over 100W more.

Here, AMD is once again (see last year's Polaris) a victim of the inferior GlobalFoundries 14nm LPP process. TSMC 16nm would have been much better in perf/W, but sadly a very restrictive wafer supply agreement locks AMD to GloFo for the time being.


I would have had the same conclusion if Ryzen were not also on GloFo's 14nm LPP.

It is just sad AMD could not get both Graphics and CPU momentum at the same time.


Sure, for desktops.

Poor thermals mean that this line of GPUs will find its way into exactly 0% of the laptop market, which is unfortunate.


> into exactly 0% of the laptop market

Alienware will find a way, even if it requires a water connection to cool it


Oh man that would be so heavy haha. But I won't rule it out!


Too late, raven ridge with integrated vega taped out a long time ago. One desktop gaming card line-up operating in the wrong region of the efficiency curve doesn't instantly turn an entire architecture into house fire material.


Whatever initial price difference there is will be lost in power consumption over the course of the first year of heavy use, or two years of moderate use.

And after the price parity, you're just stuck with a worse card.


Falsehoods & misrepresentations.

"Same performance as 1080"

Vega 56 is faster than the 1080: 15% faster in SP GFLOPS, 130%(!) faster in DP GFLOPS, and 15000%(!!) faster in half-precision GFLOPS (Nvidia artificially cripples theirs). All comparisons made at base clocks.

"double the TDP"

Merely 17% higher: 210W (Vega 56) vs 180W (1080).

"Only 30% higher than the Fury"

And Nvidia's top 16nm GPU (Titan Xp) is only 39% faster than their top 28nm (TITAN Z) released... 3 years ago. The 28nm→14/16nm transition was hard for everyone.
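The TDP arithmetic above checks out, using the wattages quoted in this comment:

```python
def pct_diff(new, old):
    """Percentage increase of new over old."""
    return (new - old) / old * 100

# Vega 56 at 210 W vs GTX 1080 at 180 W: ~17%, not "double".
print(round(pct_diff(210, 180)))  # 17
```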


http://imgur.com/5A2CFuP

Saw it on /r/AMD. This is supposedly from SIGGRAPH.

If AMD says you can't go higher than 1080 (in a cherrypicked setting, no doubt) then you cannot make the claim that it is faster.


This link shows Vega significantly faster than the 1080. Minimum FPS is 65% higher. A low minimum FPS indicates performance bottlenecks in the 1080.


They could salvage this if they can get tensorflow working.


The blind tests show something useful - consistency of throughput, which people notice, regardless of how benchmarks work.

Benchmarking software should do more to provide a clean and meaningful representation of the distribution of performance - people care more about a card's worst case than its best or average.

Another significant consideration - as alluded to in the TensorFlow comment - is the compute performance.

Although compute performance is not yet that important from a gaming perspective, I would be rather surprised not to see some games start to utilise it for engine functionality, especially after Vulkan and OpenCL merge.

Wait to see the performance of the card as a whole, and consider the implications for the next generation before you declare the card a non-starter and evolutionary dead-end.


1) Blind Test - That's true, but people who buy still check benchmarks. Don't forget AMD did exactly that with Bulldozer, and we know how that ended up... If AMD thought they had a winning horse they would not hide the performance numbers; they would PR that like crazy, like they are doing with Ryzen and Threadripper.

2) I'm all for AMD kicking Nvidia's ass in compute and ending CUDA's reign, but it won't sell gaming cards. Vega FE (a $1000 card) has been out for a month and I haven't seen anyone benchmark it and show that it's better. Even if games use it, there is a heck of a difference between building the model and running inference.


It won't sell gaming cards yet. My argument was that comparing underutilised hardware to that with fundamental unresolvable physical bottlenecks seemed to be a little premature.

The next generation of games could potentially utilise compute functionality, especially if Microsoft and Sony get off their arses and push the capability on consoles.

(That said, porting such software to a PC environment would not be without significant difficulty, given the vastly different memory configuration.)

I'll be hugely disappointed in the industry if we don't see some decent Vulkan-based game engines in the next few years.


Consoles tend to have lower-level hardware access than OpenGL or even DX12 and Vulkan.


Benchmarking software should do more to provide a clean and meaningful representation of the distribution of performance - people care more about a card's worst case than its best or average.

You should check Tech Report's Frame Time Analysis:

http://techreport.com/review/31546/where-minimum-fps-figures...


The single frame that takes the longest to render won't normally be an issue, but a cluster of frames each taking significantly longer than usual will result in a noticeable stutter. The stats should try to take into account not only the low frame rate but whether that rate was held long enough to perturb a human observer.
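A hypothetical sketch of such a stat: flag stutter as a run of consecutive over-budget frames rather than a single outlier. The frame times (in ms) are made-up example data, with a 60 fps budget of ~16.7 ms:

```python
import numpy as np

frames = np.array([16, 17, 16, 40, 16, 35, 36, 38, 34, 16, 17])
budget = 1000 / 60                    # ~16.7 ms per frame at 60 fps

slow = frames > budget * 1.5          # frames well over budget
# Find the longest consecutive run of slow frames.
longest_run, cur = 0, 0
for s in slow:
    cur = cur + 1 if s else 0
    longest_run = max(longest_run, cur)

p99 = np.percentile(frames, 99)       # tail latency, as Tech Report does
print(longest_run)  # 4: a visible stutter; the lone 40 ms spike is not
```

The lone 40 ms frame and the four-frame cluster look identical to a plain minimum-FPS figure, but only the cluster reads as stutter to a human.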


Only AMD can really say whether Vega turns out to be a failure or not. We don't know what their goals or constraints were. In the end, if it has a performance/price advantage over Nvidia's (year-old) offerings, then it will be at least a relative "success" from a consumer pov.

It seems to be bringing some technological advancement too - using GPU memory as a cache seems like the way forward. Naive calculation gives me a couple of MB to store a single 1080x1920 texture. Current game usage is approx 1000 times that. It takes more than one texture to build a scene, but there seem to be gains if more textures are loaded on-demand (or slightly earlier).
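For reference, the naive footprint of a single 1920x1080 texture, assuming uncompressed RGBA8 and, hypothetically, a typical 4:1 block-compressed format:

```python
# 1920 x 1080 pixels, 4 bytes per pixel (RGBA8), uncompressed.
uncompressed = 1920 * 1080 * 4        # 8294400 bytes, ~8.3 MB
# With an assumed 4:1 block compression ratio (e.g. BC3-class formats):
compressed = uncompressed // 4        # 2073600 bytes, ~2.1 MB
print(uncompressed, compressed)  # 8294400 2073600
```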


Almost all textures use some kind of compression, and GPUs support a few schemes.


With the failure of RX Vega to meet performance expectations, I fear Nvidia now has free rein to continue their dominance in the areas of deep learning / machine learning.

I was really quite optimistic, especially with their recent push with the ROCm platform that AMD has been building up. But if the hardware can't back it up, it means little in the end. It seems like we'll have to wait until the next generation to see if any viable contenders will show up. Disappointing to say the least.


It didn't meet expectations of gamers who wanted it to beat the 1080 Ti in games. But the compute performance is really good.


Even in terms of raw compute, the current Pascal iterations like the 1080 Ti, Titan Xp, etc. perform better than the RX Vega.

Add in Nvidia's significantly lower TDP par for par and an established developer environment that is currently unparalleled (much to the frustration of myself and many within the machine learning community, as Nvidia has chosen to keep their software closed source), and the RX Vega release is, to be very blunt, a definitively disappointing result regardless of the distinction between gaming and computation.

I think there is a silver lining somewhere, however, in anticipation for the next generation releases of AMD in a year or two. AMD's investment into their ROCm platform will hopefully continue and iteratively, the open source option will slowly continue to improve. By the time AMD is ready to release their Navi architecture in 2018/2019, perhaps there will be a chance to bring things back towards an equilibrium.

Another contender that might topple Nvidia's current monopoly on deep learning in the nearer future is Google's investment in their TPU architecture. Since Google fully owns the development of TensorFlow and has enormous resources at its disposal, I could easily see the company building out a full end-to-end alternative that takes away CUDA's current stranglehold.

Wherever the competition comes from, I hope it comes sooner rather than later. The closed-source, single-option monopoly that Nvidia has managed to position itself into only serves to hurt the consumer.


I can perfectly understand if you disagree with me and am happy to discuss. But I don't feel like anything in my post was misleading or incorrect, so I'm confused as to why I'm being downvoted?


https://img.purch.com/radeon-rx-vega/o/aHR0cDovL21lZGlhLmJlc...

So basically a $700 card is worse than the 1080. You can buy a 1080 Ti for around ~$900, which would completely destroy this AMD "beast".

Not sure where the hype comes from, or am I missing something?


You're linking to a graph that claims that a $500 card is better than a 1080. 53 fps is bigger than 45.


The GTX 1080's current MSRP is also $500, though, and it's been on the market for over a year already.

Also, those numbers only represent the "minimum FPS range" for select titles. While minimum FPS is important, the picture is incomplete without other figures such as average FPS.

Plus, AMD clearly cherry-picked one of the benchmarks (Deus Ex MD, presumably using DirectX 12) just to lower the GTX 1080's range in that comparison: https://www.pcper.com/files/review/2017-07-30/vega-35.jpg

It's marketing's job to shine the best light possible on their product, of course, but it says something when Vega still doesn't look terribly impressive despite that effort.


And 78 > 76, in favor of Nvidia. But they don't list average or 99%ile performance, and manufacturer-provided benchmarks are notoriously biased, so we'll have to wait and see the independent reviews before anyone can claim performance superiority.


If you have a Micro Center you can get GTX 1080 Ti cards starting around $700 if you can find them in stock. I know I did.

Newegg seems to have similar pricing. https://www.newegg.com/Product/ProductList.aspx?Submit=ENE&I...



The RX Vega 64 is a $500 card.

If a $500 card is roughly the same as a 1080 (currently $550), I think that's fair.

The $700 liquid-cooled Vega should overclock better, but I have my doubts for its efficiency. Overclocking isn't very good for heat and noise and reliability.


The $900 mark was also for a card that is not stock. I was talking about ASUS Strix version which is after-market with 3 coolers.


Water cooling is pretty effective at fixing heat and noise problems, and heat is one of the big factors in reliability too.


I really hope that AMD gets better deep learning support so it pushes NVIDIA's pricing structure down. Also, AMD's processor lineup looks very appealing compared to Intel's (which has had more pressure on their solutions as well). (Moved from dupe thread.)


I'm surprised at the absolute lack of news on how Vega FE performs in deep learning...


Backends for the deep learning frameworks in use are CPU or CUDA. OpenCL support is essentially non-existent. There's no point benchmarking something that isn't going to be used.


Just curious any people own $AMD shares? I was surprised to learn that AMD is the second most owned company on Robinhood trading platform behind Apple. I bought in at $10.47 after last earnings absolutely destroyed it. Just sold right before current earnings at $14.12 and looking to maybe buy back in lower than I sold for. Closed today at $13.61.


That's interesting, didn't know the stock was so popular on Robinhood. I remember around 2 years ago their stock price was in the $2 range and r/wallstreetbets were all talking about going "YOLO" with AMD. Pretty nice comeback for their stock. Lisa Su seems like a great CEO.


Lisa made a good call bringing Jim Keller back to work on Zen, an investment that seems to have paid off extremely well.


I wonder if it would be feasible to couple GPU dies similar to threadripper or EPYC.



