(Leak) AMD Vega 10 and Vega 20 Information Leaked

Subject: Graphics Cards | January 8, 2017 - 03:53 AM |
Tagged: vega 11, vega 10, navi, gpu, amd

During CES, AMD showed off demo machines running Ryzen CPUs and Vega graphics cards, and gave the world a bit of information on the underlying architecture of Vega in an architectural preview that you can read about (or watch) here. AMD's Vega GPU is coming, and it is poised to compete with NVIDIA on the high end (an area that has been left to NVIDIA for a while now) in a big way.

Thanks to Videocardz, we have a bit more info on the products that we might see this year and what we can expect to see in the future. Specifically, the slides suggest that Vega 10 – the first GPU based on the company's new architecture – may be available by the end of the first half of 2017. Following that, a dual-GPU Vega 10 product is slated for release in Q3 or Q4 of 2017, and a refreshed GPU called Vega 20, based on a smaller process node and with more HBM2 memory, is due in the second half of 2018. The leaked slides also suggest that Navi (Vega's successor) might launch as soon as 2019 and will come in two variants called Navi 10 and Navi 11 (with Navi 11 being the smaller / less powerful GPU).

AMD Vega Leaked Info.jpg

The 14nm Vega 10 GPU allegedly offers up 64 NCUs with as much as 12 TFLOPS of single precision and 750 GFLOPS of double precision compute performance. Half precision performance is twice the FP32 rate at 24 TFLOPS (which would be good for things like machine learning). The NCUs allegedly run FP16 at 2x the FP32 rate and DPFP at 1/16 of it. If each NCU has 64 shaders like Polaris 10 and other GCN GPUs, then we are looking at a top-end Vega 10 chip having 4096 shaders, rivaling Fiji. Further, Vega 10 supposedly has a TDP of up to 225 watts.

For comparison, the 28nm, 8.9 billion transistor Fiji-based R9 Fury X ran at 1050 MHz with a TDP of 275 watts and had a rated peak compute of 8.6 TFLOPS. While we do not know the clock speeds of Vega 10, the numbers suggest that AMD has been able to clock the GPU much higher than Fiji while still using less power (and thus putting out less heat). This is possible with the move to the smaller process node, though I do wonder what yields will be like at first for the top end (and highest clocked) versions.
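If you want to sanity check those numbers, GCN shaders do two FP32 operations (one fused multiply-add) per clock, so peak FLOPS is simply shaders × 2 × clock. Here is a quick back-of-the-envelope sketch, assuming the leaked 12 TFLOPS figure and a Fiji-like 4096 shader count:

```python
# Back-of-the-envelope math: peak FP32 FLOPS = shaders x 2 ops (one FMA) x clock.
# The 4096 shader count for Vega 10 is an assumption based on 64 NCUs x 64 shaders.

def implied_clock_ghz(tflops, shaders, flops_per_clock=2):
    """Clock speed (in GHz) implied by a peak FP32 rating."""
    return tflops * 1e12 / (shaders * flops_per_clock) / 1e9

# Fiji (R9 Fury X): 4096 shaders at 1.05 GHz gives its rated 8.6 TFLOPS.
print(f"Fiji FP32 peak: {2 * 4096 * 1.05e9 / 1e12:.1f} TFLOPS")

# Vega 10: 12 TFLOPS over a presumed 4096 shaders implies roughly 1.46 GHz.
print(f"Vega 10 implied clock: {implied_clock_ghz(12, 4096):.2f} GHz")
```

That works out to a nearly 40% clock bump over Fiji if the shader count holds, which lines up with what a 28nm to 14nm move should enable.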

Vega 10 will be paired with two stacks of HBM2 memory on package, offering 16GB of memory and 512 GB/s of memory bandwidth. The increase in memory bandwidth per stack is thanks to the move to HBM2 from HBM (Fiji needed four HBM stacks to hit 512 GB/s and had only 4GB).
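The per-stack math is straightforward, assuming standard first-generation HBM at 128 GB/s per stack and HBM2 running at its full rated 256 GB/s per stack:

```python
# Memory bandwidth = number of stacks x per-stack bandwidth.
# Assumes first-gen HBM at 128 GB/s per stack and HBM2 at its full 256 GB/s.

hbm1_per_stack = 128  # GB/s (Fiji-era HBM)
hbm2_per_stack = 256  # GB/s (HBM2 at full rated speed)

print(f"Fiji:    4 stacks x {hbm1_per_stack} GB/s = {4 * hbm1_per_stack} GB/s")
print(f"Vega 10: 2 stacks x {hbm2_per_stack} GB/s = {2 * hbm2_per_stack} GB/s")
```

Same total bandwidth from half the stacks, with four times the capacity thanks to the taller, denser HBM2 stacks.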

The slide also hints at a "Vega 10 x2" in the second half of the year, which is presumably a dual-GPU product. The slide states that Vega 10 x2 will have four stacks of HBM2 (1TB/s), though it is not clear if they are simply adding the two stacks per GPU to claim the 1TB/s number or if each GPU will get four stacks of its own (the latter is unlikely, as there does not appear to be room on the package for two more stacks per GPU, and I am not sure they could make the package big enough to fit them either). Even if we assume they really mean 2x 512 GB/s per GPU (and maybe specific workloads can get more out of that across both GPUs), the doubling of cores and of potential compute performance will be big. This is going to be a big number crunching and machine learning card, as well as a gaming card of course. Clock speeds will likely have to be much lower compared to the single-GPU Vega 10 (especially with a stated TDP of 300W), and workloads won't scale perfectly, so real compute performance will not be quite 2x, but it should still be a decent per-card boost.

AMD-Vega-GPU.jpg

Raja Koduri holds up a Vega GPU at CES 2017 via eTeknix

Moving into the second half of 2018, the leaked slides suggest that a Vega 20 GPU will be released on a 7nm process node with 64 CUs and paired with four stacks of HBM2 for 16 GB or 32 GB of memory with 1TB/s of bandwidth. Interestingly, the shaders will be set up such that the GPU can still do half precision calculations at twice the single precision rate, but it will not take nearly the double precision hit that Vega 10 does, running at 1/2 the single precision rate rather than 1/16. The GPU(s) will use between 150W and 300W of power, and it seems these are set to be the real professional and workstation workhorses. Applying that 1/2 DPFP rate to Vega 10's 12 TFLOPS of single precision would put Vega 20 at 6 TFLOPS of double precision compute, which is not bad (and it would hopefully be more than this thanks to faster clocks and architectural improvements).
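Putting the leaked rate ratios side by side makes the positioning clear. This little comparison is purely illustrative: it assumes Vega 20 matches Vega 10's 12 TFLOPS of single precision, which faster 7nm clocks could easily change:

```python
# Peak throughput from the leaked rate ratios: FP16 at 2x FP32 on both chips,
# FP64 at 1/16 on Vega 10 and 1/2 on Vega 20. The shared 12 TFLOPS FP32
# baseline is an assumption for illustration only.

fp32 = 12.0  # TFLOPS, leaked Vega 10 single precision peak

for name, fp64_ratio in [("Vega 10", 1 / 16), ("Vega 20", 1 / 2)]:
    print(f"{name}: FP16 {fp32 * 2:.0f} TFLOPS | FP32 {fp32:.0f} TFLOPS | "
          f"FP64 {fp32 * fp64_ratio:.2f} TFLOPS")
```

That 0.75 vs 6 TFLOPS double precision gap is what separates a prosumer compute card from a true HPC part.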

Beyond that, the slides mention Navi's existence and that it will come in Navi 10 and Navi 11 but no other details were shared which makes sense as it is still far off.

You can see the leaked slides here. In all, it is an interesting look at the potential Vega 10 and beyond GPUs, but definitely keep in mind that this is leaked information, and that it allegedly came from an internal presentation that likely showed the graphics processors in their best possible/expected light. It does add a bit more fuel to the fire of excitement for Vega though, and I hope that AMD pulls it off, as my unlocked 6950 is no longer supported and it is only a matter of time before new games perform poorly or not at all!


Source: eTeknix.com
Subject: Motherboards
Manufacturer: AMD

AM4 Edging Closer to Retail

Many of us were thinking that one of the bigger stories around CES would be the unveiling of a goodly chunk of AM4 motherboards.  AM4 has been around for about half a year now, but only in system integrator builds (such as HP's).  These have all been based around the Bristol Ridge APU (essentially an updated Carrizo APU).  These SOCs are not exactly barn burners, but they provide a solid foundation for a low-cost build.  The APUs feature 2 modules/4 cores, a GCN-based GPU, and limited southbridge I/O functionality.

am4_01.jpg

During all this time, the motherboards available from these OEMs have been very basic units not fit for retail.  Users simply could not go out and buy a Bristol Ridge APU and motherboard for themselves off of Newegg, Amazon, or anywhere else.  Now, after much speculation, we finally got to see the first retail-style AM4 boards unveiled at this year’s CES.  AMD showed off around 16 boards based on the previously unseen B350 and X370 chipsets.

AMD has introduced a pretty limited number of chipsets over the years.  Their FM2+ offerings spanned the A series of chipsets, but those added very little in terms of functionality compared to the 900 series that populates the AM3+ world.  The latest offering from AMD was the A88X, released in September 2013.  At one time there was supposed to be a 1000 series of chipsets for AM3+, but those were cancelled and we have had the 900 series (which is identical to the previous 800 series) since 2011.  This has been a pretty stagnant area for AMD and their partners.  Third-party chips have helped shore up the feature divide between AMD and Intel’s regular release of new chipsets and the technologies attached to them.

am4_02.jpg

There are three primary chipsets being released, as well as two physical layer chips that allow the use of the onboard southbridge on Ryzen and Bristol Ridge.  The X370 targets the enthusiast market, the B350 the mainstream, and the A320 the budget segment.  The two options for utilizing the SOC’s southbridge functionality are the X300 and A/B300.

Before we jump into the chipsets, we should take a look at what kind of functionality Ryzen and Bristol Ridge have that can be leveraged by motherboard manufacturers.  Bristol Ridge is a true SOC in that it contains the GPU, CPU, and southbridge functionality needed to stand alone.  Ryzen is different in that it does not have the GPU portion, so it still requires a standalone graphics card to work.  Bristol Ridge is based on the older Carrizo design and does not feature the flexibility in its I/O connections that Ryzen does.

am4_03.png

Bristol Ridge features up to 8 lanes of PCI-E 3.0.  Its I/O includes 2 native SATA 6G ports, plus two more PCI-E lanes that can be used either as general purpose lanes or as a x2 NVMe connection.  That is about as flexible as it gets.  It also natively supports four USB 3.1 Gen 1 ports.  For a chip that was designed as a mobile-focused SOC, it makes sense that it will not max out PCI-E lanes or SATA ports.  It is still enough to satisfy most mobile and SFF builds.

Click here to read more about AMD's AM4 platform!

CES 2017: AMD Vega Running DOOM at 4K

Subject: Graphics Cards | January 6, 2017 - 11:27 PM |
Tagged: Vega, doom, amd

One of the demos that AMD had at CES was their new Vega architecture running DOOM with Vulkan on Ultra settings at 4K resolution. With this configuration, the pre-release card was coasting along at high-60s to low-70s frames per second. Compared to PC Gamer’s benchmarks of the Vulkan patch (ours focused on 1080p), this puts Vega somewhat ahead of the GTX 1080, which averages in the low 60s.

Some of the comments note that, during one of the melee kills, the frame rate stutters a bit, dropping down to about 37 FPS. That’s true, and I included a screenshot of it below, but momentary dips sometimes just happen. It could even be a bug in pre-release drivers for a brand new GPU architecture, after all.

amd-2017-ces-vegadip.png

Yes, the frame rate dipped in the video, but stutters happen. No big deal.

As always, this is a single, vendor-controlled data point. There will be other benchmarks, and NVIDIA has both GP102 and Volta to consider. The GTX 1080 is only ~314 mm2, so there’s a lot more room for enthusiast GPUs to expand on 14nm, but this test suggests Vega will at least surpass it. (When a process node is fully mature, you will typically see low-yield chips up to around 600mm2.)

PC Perspective's CES 2017 coverage is sponsored by NVIDIA.

Follow all of our coverage of the show at http://pcper.com/ces!

Keep your eyes peeled for FreeSync 2

Subject: Graphics Cards, Displays, Shows and Expos | January 6, 2017 - 03:19 PM |
Tagged: freesync 2, amd

So far we have yet to see a FreeSync 2 capable monitor on the floor at CES, but we do know about the technology.  We have seen Ryan's overview of what we know of the new technology and its benefits, and recently The Tech Report also posted their thoughts on it.  For instance, did you know that there are 121 FreeSync displays of various quality from 20 display partners, compared to NVIDIA's eight partners and 18 G-SYNC displays?  The Tech Report is also on the hunt for a FreeSync 2 display at CES; we will let you know once the hunt is successful.

modeswitch.jpg

"AMD has pulled back the curtain on FreeSync 2, the new version of the FreeSync variable refresh rate technology."


CES 2017: MSI Shows Off X370 XPower Gaming Titanium AM4 Motherboard

Subject: Motherboards | January 5, 2017 - 09:59 PM |
Tagged: x370, msi, CES 2017, CES, amd, AM4

MSI is out in full force at CES 2017, and in addition to external GPU docks, laptops, and a wall of Z270 motherboards, the company is using the event to show off its AM4 motherboards for the first time. At the top of the pack is the flagship X370 XPower Gaming Titanium, clad in shimmering white and black and ready to support AMD’s upcoming “Ryzen” processors based on the Zen architecture.

Screenshot (115).png

Image Source: eTeknix (video)

Armed with AMD’s highest-end X370 chipset, the flagship motherboard supports Ryzen CPUs as well as 7th generation APUs. There are four DDR4 memory slots on the board that MSI claims can operate in dual channel mode at up to 2667 MHz when overclocked. The board is also generously provisioned for power: the usual 24-pin ATX and 8-pin EPS connectors are joined by a 4-pin ATX up top and a 6-pin PCI-E connector on the bottom to stabilize overclocks. Further, a 10-phase VRM drives the AM4 socket and memory.

The motherboard further features three PCI-E x16 slots, three PCI-E x1 slots, two M.2 (32 Gb/s) slots, one U.2 (32 Gb/s) port, and six SATA 6 Gbps ports. On the PCI-E front, the first two x16 slots are connected to the processor and run at x8 when both are populated, while the third slot is connected to the chipset and is electrically x4. As for the SATA ports, two are directly connected to the processor and four come from the chipset.

MSI is including its own onboard audio solution, called Audio Boost 4, as well as Gigabit Ethernet, though the Steel Armor shielding covers the chips, so it is hard to say exactly what they are using there. Multi-GPU setups are officially supported up to two-way SLI HB or three-way CrossFireX.

There is a plethora of USB connectivity here, with the X370 XPower Gaming Titanium supporting up to four USB 3.1 ports (one Type-C, three Type-A), eight USB 3.0 ports, and seven USB 2.0 ports. If that is not enough, then you may have an addiction to PC accessories (or just really like those USB missile batteries, hehe).

Naturally, MSI is staying silent on pricing and availability of the motherboard(s), but I hope to see them out within the first half of the year! The design is certainly polarizing, but the features are there. I look forward to our review of this and other AM4 motherboards and to seeing how they fare at overclocking!

PC Perspective's CES 2017 coverage is sponsored by NVIDIA.

Follow all of our coverage of the show at https://pcper.com/ces!

Source: TechPowerUp

High Bandwidth Cache

Apart from AMD’s other new architecture due out in 2017, its Zen CPU design, no other product has had as much build-up and excitement surrounding it as its Vega GPU architecture. After the world learned that Polaris would be a mainstream-only design, released as the Radeon RX 480, the focus for enthusiasts came straight to Vega. It has been on the public-facing roadmaps for years and signifies the company’s return to the world of high-end GPUs, something it has been missing since the release of the Fury X in mid-2015.

slides-2.jpg

Let’s be clear: today does not mark the release of the Vega GPU or products based on Vega. In reality, we don’t even know enough to make highly educated guesses about performance without more details on the specific implementations. That being said, the information released by AMD today is interesting and shows that Vega will be much more than simply an increase in shader count over Polaris. It reminds me a lot of the build-up to the Fiji GPU release, when information and speculation about how HBM would affect power consumption, form factor, and performance flourished. What we can hope for, and what AMD’s goal needs to be, is a cleaner and more consistent product release than the Fury X had.

The Design Goals

AMD began its discussion about Vega last month by talking about the changes in the world of GPUs and how data sets and workloads have evolved over the last decade. No longer are GPUs only worried about games; they must also address professional, enterprise, and scientific workloads. Even more interestingly, just as we have discussed the growing gap between CPU performance and memory bandwidth, AMD posits that the gap between memory capacity and GPU performance is a significant hurdle and a limiter to performance and expansion. Game installs, professional graphics sets, and compute data sets continue to skyrocket. Game installs now regularly exceed 50GB, while compute workloads can exceed petabytes. Even as we have seen GPU memory capacities increase from megabytes to gigabytes, reaching as high as 12GB in high-end consumer products, AMD thinks there should be more.

slides-8.jpg

Coming from a company that chose to release a high-end product limited to 4GB of memory in 2015, it’s a noteworthy statement.

slides-11.jpg

The High Bandwidth Cache

Bold enough to claim a direct nomenclature change, Vega 10 will feature an HBM2-based high bandwidth cache (HBC) along with a new memory hierarchy to call it into play. This HBC will be a collection of memory on the GPU package, just like we saw on Fiji with the first HBM implementation, and will be measured in gigabytes. Why the move to calling it a cache will be covered below. (But can’t we all get behind the removal of the term “frame buffer”?) Interestingly, this HBC doesn’t have to be HBM2; in fact, I was told that you could expect to see other memory systems on lower cost products going forward, so cards that integrate this new memory topology with GDDR5X or some equivalent seem assured.
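To illustrate why “cache” is the right word, here is a deliberately simplified toy model: the on-package memory holds the hot pages of a much larger data set and pages data in from a backing store (system RAM, NVMe, and so on) on demand, evicting the least recently used pages. This is a sketch of the general concept, not AMD’s actual controller logic:

```python
# Toy model of a "high bandwidth cache": fast on-package memory holds the
# hot pages of a larger data set, paging from a backing store on demand.
# Illustrative only -- not AMD's actual HBC controller behavior.
from collections import OrderedDict

class HighBandwidthCacheModel:
    def __init__(self, capacity_pages):
        self.capacity = capacity_pages
        self.resident = OrderedDict()  # page id -> data, kept in LRU order
        self.misses = 0

    def access(self, page):
        if page in self.resident:
            self.resident.move_to_end(page)    # hit: mark as most recently used
            return self.resident[page]
        self.misses += 1                       # miss: page in from backing store
        if len(self.resident) >= self.capacity:
            self.resident.popitem(last=False)  # evict the least recently used page
        self.resident[page] = f"page-{page}"
        return self.resident[page]

cache = HighBandwidthCacheModel(capacity_pages=4)
for p in [0, 1, 2, 3, 0, 4, 0, 5]:             # working set larger than the cache
    cache.access(p)
print(f"{cache.misses} misses over 8 accesses")
```

The pitch is that a data set far larger than the card's memory can behave as if it were resident, as long as the working set at any given moment fits in those gigabytes of HBM2.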

slides-13.jpg

Continue reading our preview of the AMD Vega GPU Architecture!

Meet AMD's new chipset, the X300 and X370

Subject: Motherboards, Shows and Expos | January 4, 2017 - 11:00 PM |
Tagged: x370, x300, ryzen, CES 2017, CES, amd, AM4

Tonight at CES, AMD announced 16 new AM4 motherboards from five manufacturers and new PCs from over a dozen system builders.  The motherboards will all bring support for dual channel DDR4 memory, NVMe, M.2 SATA devices, and USB 3.1 to AMD users.  You can also expect at least 24 lanes of PCIe 3.0, perhaps more if we spot some boards with bridge chips.

The MSI A320M Pro-VD and B350 Tomahawk (not to scale)

msi.png

MSI will be showing off their A320M Pro-VD, X370 XPower Gaming Titanium, B350 Tomahawk and B350M Mortar. 

Gigabyte's GA-X370 Gaming K5, GA-AX370-Gaming 5 and GA-AB350-Gaming 3.

gigabyte.png

The gang should have updates on the full lineup soon including the Gigabyte A320M-HD3 not pictured above.

Biostar's X370 GT7 TOP and B350 GT3 TOP

biostar.png

ASRock's ASRock Fatal1ty X370 Professional Gaming and X370 Taichi.

asrock.png

There are more models on display, and ASUS did announce the B350M-C, though we did not yet get a picture of it.  All motherboards will offer compatibility with at least some existing coolers; we know for a fact that Corsair's Hydro H60, H100i, and H110i, as well as Noctua's NH-U12S, NH-L9x65, and NH-D15, will all be supported.

 

PC Perspective's CES 2017 coverage is sponsored by NVIDIA.

Follow all of our coverage of the show at https://pcper.com/ces!

Source: AMD

AMD FreeSync 2 Brings Latency, LFC and Color Space Requirements

Subject: Graphics Cards, Displays | January 3, 2017 - 09:00 AM |
Tagged: srgb, lfc, hdr10, hdr, freesync 2, freesync, dolby vision, color space, amd

Since the initial FreeSync launch in March of 2015, AMD has quickly expanded the role and impact that the display technology has had on the market. Technologically, AMD added low frame rate compensation (LFC) to mimic the experience of G-Sync displays, effectively removing the bottom limit of the variable refresh rate. LFC is an optional feature that requires a large enough gap between the display’s minimum and maximum refresh rates to be enabled, but the monitors that do integrate it work well. Last year AMD brought FreeSync to HDMI connections as well by overlaying the standard as an extension, which helped to expand the quantity and lower the price of available FreeSync options. Most recently, AMD announced that borderless windowed mode was being added too, another feature-match to what NVIDIA can do with G-Sync.

slides-3.jpg

The biggest feather in the cap for AMD FreeSync is the sheer quantity of displays that exist on the market that support it. As of our briefing in early December, AMD claimed 121 design wins for FreeSync to just 18 for NVIDIA G-Sync. I am not often in the camp of quantity over quality, but the numbers are impressive. The pervasiveness of FreeSync monitors means that at least some of them are going to be very high quality integrations and that prices are going to be lower compared to the green team’s selection.

slides-14.jpg

Today AMD is announcing FreeSync 2, a new, concurrently running program that adds new qualifications for displays covering latency, color space, and LFC. This program will be much more hands-on from AMD, requiring per-product validation and certification, and that will likely come at a cost. (To be clear, AMD hasn’t yet confirmed to me whether that is the case.)

Let’s start with the easy stuff first: latency and LFC. FreeSync 2 will require monitors to support LFC, and thus to have no effective bottom limit to their variable refresh rate. AMD will also impose a maximum allowable latency for FS2, on the order of “a few milliseconds” from frame buffer flip to photon. This can be measured easily enough with some high-speed camera work by both AMD and external parties (like us).
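The concept behind LFC is frame multiplication: when the game’s frame rate falls below the panel’s minimum refresh, the driver scans each frame out multiple times so the effective refresh stays inside the panel’s variable range. A minimal sketch of the idea, not AMD’s actual driver implementation:

```python
# Sketch of the LFC concept: repeat frames so the panel's effective refresh
# stays inside its variable range even when the game runs below the minimum.
# Illustrative only -- not AMD's actual driver logic.
import math

def lfc_refresh_hz(game_fps, panel_min, panel_max):
    """Pick a panel refresh rate for a given game frame rate."""
    if game_fps >= panel_min:
        return min(game_fps, panel_max)        # in range: refresh tracks the game
    repeats = math.ceil(panel_min / game_fps)  # smallest repeat count into range
    return game_fps * repeats                  # valid when panel_max >= ~2x panel_min

# A 40-144 Hz panel at 25 FPS: each frame is scanned out twice, so the panel
# refreshes at 50 Hz while the game still presents 25 unique frames per second.
print(lfc_refresh_hz(25, panel_min=40, panel_max=144))  # -> 50
```

This is also why LFC needs a wide gap between the minimum and maximum refresh rates: the multiplied rate has to land back inside the supported range.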

These are fantastic additions to the FreeSync 2 standard and should drastically increase the quality of panels and product.

slides-17.jpg

The bigger change in FreeSync 2 is the color space. FS2 will require a doubling of the perceivable brightness and a doubling of the viewable color volume relative to the sRGB standard. This means that any monitor carrying the FreeSync 2 brand will have a significantly larger color space and ~400 nits of brightness. Current HDR standards exceed these FreeSync 2 requirements, and there is nothing preventing monitor vendors from going beyond these levels either; the requirements simply set a baseline that users can expect going forward.

slides-16.jpg

In addition to requiring the panel to support a wider color gamut, FS2 will also enable user experience improvements. First, each FS2 monitor must communicate its color space and brightness ranges to the AMD driver through a communication path similar to the one used today for variable refresh rate information. With access to this data, AMD can enable automatic mode switches from SDR to HDR/wide color gamut based on the application. Windows can remain in a basic SDR color space, but games or video applications that support HDR modes can enter them without user intervention.
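Conceptually, the driver-side logic becomes a simple policy decision once the panel’s capabilities are known. A hypothetical sketch (the structure and names here are mine, not AMD’s):

```python
# Sketch of the automatic SDR/HDR mode switch described above. The panel
# reports its capabilities to the driver; the driver picks an output mode
# based on the foreground application. Hypothetical structure, not AMD's code.
from dataclasses import dataclass

@dataclass
class PanelCaps:
    max_nits: float    # peak brightness reported by the monitor
    wide_gamut: bool   # panel supports a wider-than-sRGB color space

def pick_output_mode(app_wants_hdr, panel):
    if app_wants_hdr and panel.wide_gamut:
        return f"HDR/wide gamut at {panel.max_nits:.0f} nits"
    return "SDR (sRGB)"  # the desktop and non-HDR apps stay in the basic space

panel = PanelCaps(max_nits=400.0, wide_gamut=True)
print(pick_output_mode(app_wants_hdr=False, panel=panel))  # desktop -> SDR
print(pick_output_mode(app_wants_hdr=True, panel=panel))   # HDR game -> wide gamut
```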

slides-15.jpg

Color space mapping can take time in low-power monitors, adding potential latency. For movies that might not be an issue, but for enthusiast gamers it definitely is. The solution is to do all the tone mapping BEFORE the image data is sent to the monitor itself. But with varying monitors, varying color space limits, varying integrations of HDR standards, and no operating-system-level support for tone mapping, that is a difficult task.

The solution is for games to map directly to the color space of the display. AMD will foster this through FreeSync 2: a game that integrates support for FS2 will be able to get data from the AMD driver stack about the maximum color space of the attached display. The engine can then do its tone mapping to that color space directly, rather than to some intermediate state, saving on latency and improving the gaming experience. AMD can then automatically switch the monitor to its largest color space as well as its maximum brightness. This does require the game engine or game developer to directly integrate support for the feature, though; it will not be a catch-all solution for AMD Radeon users.
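As a toy example of what “tone mapping directly to the display” means, here is an extended Reinhard curve aimed at a specific display peak, the kind of figure an FS2 game could pull from the driver. The function and the 400-nit panel are illustrative assumptions, not AMD’s actual API:

```python
# Toy tone mapper: map scene luminance straight to the display's reported
# peak (extended Reinhard) instead of to an intermediate standard that the
# monitor would have to remap itself. Names and values are illustrative only.

def tonemap_nits(scene_nits, scene_white, display_peak):
    """Extended Reinhard curve that maps scene_white exactly to display_peak."""
    l = scene_nits / display_peak   # luminance in units of the display's peak
    w = scene_white / display_peak  # brightest scene value in the same units
    return display_peak * l * (1 + l / (w * w)) / (1 + l)

# A 1000-nit specular highlight from the engine, targeted at a 400-nit
# FreeSync 2 baseline panel whose peak the driver reported to the game:
print(f"{tonemap_nits(1000, scene_white=1000, display_peak=400):.0f} nits")  # 400
print(f"{tonemap_nits(100, scene_white=1000, display_peak=400):.0f} nits")   # ~83, mid-tones mostly kept
```

Because the curve already respects the panel’s limits, the monitor can display the result as-is, with no second remapping pass adding latency.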

This combination of latency, LFC, and color space additions makes FreeSync 2 an incredibly interesting standard. Pushing specific standards and requirements on hardware vendors is not something AMD has had the gall to do in the past, and honestly the company has publicly been very against it. But to guarantee the experience for Radeon gamers, AMD and the Radeon Technologies Group appear willing to make some changes.

slides-18.jpg

NVIDIA has yet to make any noise about HDR or color space requirements for future monitors, and while the FreeSync 2 standards shown here don’t quite guarantee HDR10/Dolby Vision quality displays, they do force vendors to pay more attention to what they are building and to create higher quality products for the gaming market.

All GPUs that support FreeSync will support FreeSync 2 and both programs will co-exist. FS2 is currently going to be built on DisplayPort and could find its way into another standard extension (as Adaptive Sync was). Displays are set to be available in the first half of this year.

PC Perspective's CES 2017 coverage is sponsored by NVIDIA.

Follow all of our coverage of the show at http://pcper.com/ces!

Source: AMD

AMD's Drumming Up Excitement for Vega

Subject: Graphics Cards | January 2, 2017 - 02:56 PM |
Tagged: Vega, amd

Just ahead of CES, AMD has published a teaser page with, currently, a single YouTube video and a countdown widget. In the video, a young man walks down the street, tapping on a drum and passing Red Team propaganda posters. It also contains subtle references to Vega on walls and other surfaces, in case the explicit references, including the site’s URL, weren’t explicit enough.

amd-2017-ces-vega.jpg

How subtle, AMD.

Speaking of references to Vega, the countdown widget claims to lead up to the architecture preview. We were expecting AMD to launch their high-end GPU line at CES, and this is the first (official) day of the show. Until it happens, I don’t really know whether it will be a more technical look, or if they will be focusing on the use cases.

The countdown ends at 9am (EST) on January 5th.

Source: AMD

AMD Releases Radeon Software Crimson ReLive 16.12.2

Subject: Graphics Cards | December 20, 2016 - 02:41 PM |
Tagged: graphics drivers, amd

Last week, AMD came down to the PC Perspective offices to show off their new graphics drivers, which introduced optional game capture software (and it doesn’t require a login to operate). This week, they are publishing a new version of it, Radeon Software Crimson ReLive Edition 16.12.2, which fixes a huge list of issues.

amd-2016-crimson-relive-logo.png

While most of these problems were minor, the headlining fix addresses something that could have been quite annoying for FreeSync users (until this update fixed it, of course). It turns out that when using FreeSync with a borderless fullscreen application while another monitor has an active window, such as a video on YouTube, the user would experience performance issues in the FreeSync application (unless all of those other windows were minimized). This sounds like a lot of conditions, but you can imagine how many people have a YouTube or Twitch stream running while playing a semi-casual game. Those types of games also lend themselves well to being run in borderless window mode, so you can easily alt-tab to the other monitors, exacerbating the issue. Regardless, it’s fixed now.

Other fixed issues involve mouse pointer corruption with the RX 480 and multi-GPU issues in Battlefield 1. You can download the new drivers at AMD's website.

Source: AMD