Subject: Graphics Cards | January 5, 2017 - 11:50 AM | Sebastian Peak
Tagged: video card, thunderbolt 3, msi, gus, graphics, external gpu, enclosure, CES 2017, CES
You would need to go all the way back to CES 2012 to see our coverage of the GUS II external graphics enclosure, and now MSI has a new G.U.S. (Graphics Upgrade System) GPU enclosure to show, this time using Thunderbolt 3.
In addition to 40 Gbps Thunderbolt 3 connectivity, the G.U.S. includes a built-in 500W power supply with 80 Plus Gold certification, as well as USB 3.0 Type-C and Type-A ports including a quick-charge port on the front of the unit.
Ryan had a look at the G.U.S. (running an NVIDIA GeForce GTX 1080, no less) at MSI's booth:
Specifications from MSI:
- 1x Thunderbolt 3 (40 Gbps) port to connect to host PCs
- 2x USB 3.0 Type-A (rear)
- 1x USB 3.0 Type-C (rear)
- 1x USB 3.0 Type-A w/QC (front)
- 80 Plus Gold 500W internal PSU
We do not have specifics on pricing or availability for the G.U.S. just yet.
Follow all of our coverage of the show at https://pcper.com/ces!
High Bandwidth Cache
Apart from AMD’s other new architecture due out in 2017, its Zen CPU design, no product has had more build-up and excitement surrounding it than its Vega GPU architecture. After the world learned that Polaris would be a mainstream-only design, released as the Radeon RX 480, the focus for enthusiasts came straight to Vega. It’s been on the public-facing roadmaps for years and signifies the company’s return to the world of high-end GPUs, something it has been missing since the release of the Fury X in mid-2015.
Let’s be clear: today does not mark the release of the Vega GPU or products based on Vega. In reality, we don’t even know enough to make highly educated guesses about the performance without more details on the specific implementations. That being said, the information released by AMD today is interesting and shows that Vega will be much more than simply an increase in shader count over Polaris. It reminds me a lot of the build-up to the Fiji GPU release, when information and speculation about how HBM would affect power consumption, form factor, and performance flourished. What we can hope for, and what AMD’s goal needs to be, is a cleaner and more consistent product release than how the Fury X turned out.
The Design Goals
AMD began its discussion about Vega last month by talking about the changes in the world of GPUs and how data sets and workloads have evolved over the last decade. No longer are GPUs only worried about games; instead they must address professional, enterprise, and scientific workloads. Even more interestingly, just as we have discussed the growing gap between CPU performance and memory bandwidth, AMD posits that the gap between memory capacity and GPU performance is a significant hurdle and limiter to performance and expansion. Game installs, professional graphics sets, and compute data sets continue to skyrocket. Game installs are now regularly over 50GB, while compute workloads can exceed petabytes. Even as we saw GPU memory capacities increase from megabytes to gigabytes, reaching as high as 12GB in high-end consumer products, AMD thinks there should be more.
Coming from a company that chose to release a high-end product limited to 4GB of memory in 2015, it’s a noteworthy statement.
The High Bandwidth Cache
Bold enough to claim a direct nomenclature change, Vega 10 will feature an HBM2-based high bandwidth cache (HBC) along with a new memory hierarchy to call it into play. This HBC will be a collection of memory on the GPU package, just like we saw on Fiji with the first HBM implementation, and will be measured in gigabytes. Why the move to calling it a cache will be covered below. (But can’t we all get behind the removal of the term “frame buffer”?) Interestingly, this HBC doesn’t have to be HBM2; in fact, I was told that you could expect to see other memory systems on lower-cost products going forward. Cards that integrate this new memory topology with GDDR5X or some equivalent seem assured.
Subject: Graphics Cards, Mobile | January 3, 2017 - 07:34 PM | Jeremy Hellstrom
Tagged: nvidia, GTX 1050 Ti, gtx 1050
Keep an eye out for reviews of new gaming laptops containing mobile versions of the GTX 1050 and 1050 Ti. These laptops should also be on sale soon, with a quote from NVIDIA suggesting prices will start at around $700 for a base model.
The price reflects the power of the GPU; you are not going to match a $2000 machine with a GTX 1080 in it, but then again there are many gamers who do not need such a powerful card. If your gaming machine is a current-generation laptop with integrated graphics, this will be a huge improvement, and even a laptop with a discrete mid-range GPU from a previous generation is going to lag behind these new models. Of course, waiting to see what laptops come out of AMD's new Ryzen platform may be worthwhile, but for those hoping to upgrade soon, laptops with these cards installed are going to be worth looking at.
1080p gaming at 60fps will not be a problem, and for strategy games and online multiplayer entertainment such as LOL you should even be able to pump up the graphics settings. The cards will support the various GameWorks enhancements as well as other features such as Ansel.
The above example comes from The Verge, who spotted a 14" Dell laptop with the GTX 1050 already on sale at $800. If you want the best choice you should look to the 15.6" model, which offers the choice of a GTX 1050 or 1050 Ti, an i5-7300HQ or i7-7700HQ and a 512GB PCIe SSD.
Follow all of our coverage of the show at http://pcper.com/ces!
Subject: Graphics Cards, Displays | January 3, 2017 - 09:00 AM | Ryan Shrout
Tagged: srgb, lfc, hdr10, hdr, freesync 2, freesync, dolby vision, color space, amd
Since the initial FreeSync launch in March of 2015, AMD has quickly expanded the role and impact that the display technology has had on the market. Technologically, AMD added low frame rate compensation (LFC) to mimic the experience of G-Sync displays, effectively removing the bottom limit to the variable refresh rate. LFC is an optional feature that requires a large enough gap between the display's minimum and maximum refresh rates to be enabled, but the monitors that do integrate it work well. Last year AMD brought FreeSync to HDMI connections too by overlaying the standard as an extension. This helped to expand the quantity and lower the price of available FreeSync options. Most recently, AMD announced that borderless windowed mode was being added as well, another feature-match to what NVIDIA can do with G-Sync.
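The idea behind LFC can be sketched simply: when the game's frame rate drops below the panel's minimum refresh rate, the driver repeats each frame enough times to land back inside the variable refresh window. This is why a sufficient gap between minimum and maximum refresh (commonly cited as roughly 2x) is required. The logic below is an illustrative sketch, not AMD's actual driver implementation; the 2x ratio and the numbers are assumptions for demonstration:

```python
def lfc_possible(min_hz, max_hz, required_ratio=2.0):
    """LFC generally needs the panel's max refresh to be at least
    ~2x its min, so a repeated frame can always land in the valid range."""
    return max_hz / min_hz >= required_ratio

def effective_refresh(fps, min_hz, max_hz):
    """Multiply a too-low frame rate back into the panel's variable
    refresh window by repeating each frame an integer number of times."""
    if fps >= min_hz:
        return min(fps, max_hz)  # already inside the window
    multiplier = 2
    while fps * multiplier < min_hz:
        multiplier += 1
    return min(fps * multiplier, max_hz)
```

For example, on a hypothetical 48-144Hz panel, a game running at 25fps would have each frame shown twice, driving the panel at an effective 50Hz instead of falling below its minimum.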
The biggest feather in the cap for AMD FreeSync is the sheer quantity of displays that exist on the market that support it. As of our briefing in early December, AMD claimed 121 design wins for FreeSync to just 18 for NVIDIA G-Sync. I am not often in the camp of quantity over quality, but the numbers are impressive. The pervasiveness of FreeSync monitors means that at least some of them are going to be very high quality integrations and that prices are going to be lower compared to the green team’s selection.
Today AMD is announcing FreeSync 2, a new, concurrently running program that adds some new qualifications to displays for latency, color space and LFC. This new program will be much more hands-on from AMD, requiring per-product validation and certification and this will likely come at a cost. (To be clear, AMD hasn’t confirmed if that is the case to me yet.)
Let’s start with the easy stuff first: latency and LFC. FreeSync 2 will require monitors to support LFC and thus to have no effective bottom limit to their variable refresh rate. AMD will also impose a maximum allowable latency for FS2, on the order of “a few milliseconds” from frame buffer flip to photon. This can be easily measured with some high-speed camera work by both AMD and external parties (like us).
These are fantastic additions to the FreeSync 2 standard and should drastically increase the quality of panels and product.
The bigger change to FreeSync 2 is on the color space. FS2 will require a doubling of the perceivable brightness and a doubling of the viewable color volume relative to the sRGB standard. This means that any monitor with the FreeSync 2 brand will have a significantly larger color space and ~400 nits brightness. Current HDR standards exceed these FreeSync 2 requirements, and there is nothing preventing monitor vendors from going beyond them; the FS2 levels simply set a baseline that users should expect going forward.
In addition to requiring the panel to support a wider color gamut, FS2 will also enable user experience improvements. First, each FS2 monitor must communicate its color space and brightness ranges to the AMD driver through a communication path similar to the one used today for variable refresh rate information. By having access to this data, AMD can enable automatic mode switches from SDR to HDR/wide color gamut based on the application. Windows can remain in a basic SDR color space, but games or video applications that support HDR modes can enter that mode without user intervention.
Color space mapping can take time in low power consumption monitors, adding potential latency. For movies that might not be an issue, but for enthusiast gamers it definitely is. The solution is to do all the tone mapping BEFORE the image data is sent to the monitor itself. But with varying monitors, varying color space limits and varying integrations of HDR standards, and no operating system level integration for tone mapping, it’s a difficult task.
The solution is for games to map directly to the color space of the display. AMD will foster this through FreeSync 2 – a game that integrates support for FS2 will be able to get data from the AMD driver stack about the maximum color space of the attached display. The engine can then do its tone mapping to that color space directly, rather than some intermediate state, saving on latency and improving the gaming experience. AMD can then automatically switch the monitor to its largest color space, as well as its maximum brightness. This does require the game engine or game developer to directly integrate support for this feature though – it will not be a catch-all solution for AMD Radeon users.
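In effect, the engine asks the driver for the display's peak capabilities and compresses scene luminance straight into that range, rather than targeting an intermediate standard and letting the monitor re-map it. The sketch below illustrates the idea with a simple Reinhard-style tone mapping operator; the function names and values are hypothetical and are not AMD's actual FS2 interface:

```python
def tone_map_to_display(scene_nits, display_peak_nits):
    """Reinhard-style tone mapping that compresses scene luminance
    directly into the display's reported peak brightness, skipping
    any intermediate color space (and the monitor-side re-mapping
    latency that would come with it)."""
    l = scene_nits / display_peak_nits      # normalize to display peak
    mapped = l / (1.0 + l)                  # compress into [0, 1)
    return mapped * display_peak_nits       # scale back to nits
```

A 4000-nit highlight in the scene would land just under a hypothetical 400-nit display peak instead of clipping, and because the output never exceeds what the panel reported, the monitor has no further mapping work to do.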
This combination of latency, LFC, and color space additions to FreeSync 2 makes it an incredibly interesting standard. Pushing specific standards and requirements on hardware vendors is not something AMD has been willing to do in the past, and honestly the company has publicly been very against it. But to guarantee the experience for Radeon gamers, AMD and the Radeon Technologies Group appear to be willing to make some changes.
NVIDIA has yet to make any noise about HDR or color space requirements for future monitors and while the FreeSync 2 standards shown here don’t quite guarantee HDR10/Dolby Vision quality displays, they do force vendors to pay more attention to what they are building and create higher quality products for the gaming market.
All GPUs that support FreeSync will support FreeSync 2 and both programs will co-exist. FS2 is currently going to be built on DisplayPort and could find its way into another standard extension (as Adaptive Sync was). Displays are set to be available in the first half of this year.
Follow all of our coverage of the show at http://pcper.com/ces!
Subject: Graphics Cards | January 2, 2017 - 02:56 PM | Scott Michaud
Tagged: Vega, amd
Just ahead of CES, AMD has published a teaser page with, currently, a single YouTube video and a countdown widget. In the video, a young man is walking down the street while tapping on a drum and passing by Red Team propaganda posters. It also contains subtle references to Vega on walls and things, in case the explicit references, including the site’s URL, weren’t explicit enough.
How subtle, AMD.
Speaking of references to Vega, the countdown widget claims to lead up to the architecture preview. We were expecting AMD to launch their high-end GPU line at CES, and this is the first (official) day of the show. Until it happens, I don’t really know whether it will be a more technical look, or if they will be focusing on the use cases.
The countdown ends at 9am (EST) on January 5th.
Subject: Graphics Cards | December 22, 2016 - 07:01 AM | Scott Michaud
Tagged: nvidia, graphics drivers
The latest hotfix drivers from NVIDIA, 376.48, address five issues, some of which were long-standing complaints. The headlining fix is apparently a workaround for an issue in Folding@Home, until the application patches the root issue on its end. Prior to this, users needed to remain on 373.06 to successfully complete a Folding@Home run, avoiding all drivers since mid-October.
Other fixes include rendering artifacts in Just Cause 3, flickering and crashes in Battlefield 1, and rendering issues in Wargame: Red Dragon. These drivers, like all hotfix drivers, will not be pushed by GeForce Experience. You will need to download them from NVIDIA’s support page.
Subject: Graphics Cards | December 20, 2016 - 02:41 PM | Scott Michaud
Tagged: graphics drivers, amd
Last week, AMD came down to the PC Perspective offices to show off their new graphics drivers, which introduced optional game capture software (and it doesn’t require a login to operate). This week, they are publishing a new version of it, Radeon Software Crimson ReLive Edition 16.12.2, which fixes a huge list of issues.
While most of these problems were minor, the headlining fix could have been annoying for FreeSync users (until this update fixed it, of course). It turns out that, when using FreeSync with a borderless fullscreen application while another monitor had an active window, such as a YouTube video, the user would experience performance issues in the FreeSync application (unless all of those other windows were minimized). This sounds like a lot of steps, but you can imagine how many people have a YouTube or Twitch stream running while playing a semi-casual game. Those types of games also lend themselves well to being run in borderless window mode, so you can easily alt-tab to the other monitors, exacerbating the issue. Regardless, it’s fixed now.
Other fixed issues involve mouse pointer corruption with an RX 480 and multi-GPU issues in Battlefield 1. You can download them at AMD's website.
Subject: Graphics Cards | December 19, 2016 - 01:36 PM | Jeremy Hellstrom
Tagged: rx 480, asus, ROG STRIX RX 480 O8G GAMING, factory overclocked, DirectCU III, Polaris
ASUS went all out with the design of the ROG STRIX RX 480 O8G GAMING card; almost nothing on this card remains a stock part, from cooling to power regulation. It ships with a GPU clocked at 1310MHz in GAMING mode and 1330MHz in OC mode, and [H]ard|OCP reached a stable 1410MHz GPU and 8.8GHz VRAM when manually overclocking. The card does come at a premium, roughly $50 more than a stock card, which makes it more expensive than NVIDIA's GTX 1060. The bump in frequency helps narrow the performance gap between the two cards, but it doesn't make this RX 480 a clear winner. Check out the full review to see how it performs in the games which matter to you.
"We put the ASUS ROG STRIX RX 480 O8G GAMING video card through the wringer to find out how well this video card performs and overclocks against a highly overclocked MSI GTX 1060 GAMING X video card. Check out the highest overclock and fastest performance we’ve ever achieved to date with an AMD Radeon RX 480."
Here are some more Graphics Card articles from around the web:
- PowerColor Devil Box Redux Review @ OCC
- XFX RX 480 XXX GTR Review @ OCC
- 2016 End-of-Year Open-Source Radeon Benchmarks With Linux 4.9, Mesa 13.1-dev On Many Different GPUs @ Phoronix
- Zotac GTX 1080 AMP! Extreme 8 GB @ techPowerUp
Subject: Graphics Cards | December 12, 2016 - 04:05 PM | Jeremy Hellstrom
Tagged: vega 10, Vega, training, radeon, Polaris, machine learning, instinct, inference, Fiji, deep neural network, amd
Ryan was not the only one at AMD's Radeon Instinct briefing covering their shot across the bow of NVIDIA's HPC products. The Tech Report just released its coverage of the event and the tidbits AMD provided about the MI25, MI8, and MI6; no relation to a certain British governmental department. They focus a bit more on the technologies incorporated into GEMM and point out that AMD's top offering is not matched by an NVIDIA product, as the GP100 GPU does not come as an add-in card. Pop by to see what else they had to say.
"Thus far, Nvidia has enjoyed a dominant position in the burgeoning world of machine learning with its Tesla accelerators and CUDA-powered software platforms. AMD thinks it can fight back with its open-source ROCm HPC platform, the MIOpen software libraries, and Radeon Instinct accelerators. We examine how these new pieces of AMD's machine-learning puzzle fit together."
Here are some more Graphics Card articles from around the web:
- The Complete AMD Radeon Instinct Tech Briefing @ Tech ARP
- Chill With Radeon Software Crimson ReLive Edition @ Techgage
- Radeon Software Crimson ReLive Edition—an overview @ The Tech Report
- AMD Radeon Crimson ReLive Drivers @ techPowerUp
- AMD talk to KitGuru about Crimson ReLive
- We retest Radeon Chill @ The Tech Report
- MSI RX 480 Gaming X 8G Review @ OCC
- NVIDIA GeForce GTX 1080 PCI-Express Scaling @ techPowerUp
Subject: General Tech, Graphics Cards | December 12, 2016 - 01:31 PM | Jeremy Hellstrom
Tagged: gaming, nvidia, geforce, htc vive, VR, game bundle
AMD's RX 480 and Fury X are capable of providing decent performance in VR applications and will save you some money for the VR headset, dongles, and games. However, NVIDIA upped the ante today, giving away three games to anyone who purchases a GTX 1080, 1070, or 1060 along with an HTC Vive.
The giveaway encompasses more than North America; as long as you can purchase the bundle from either Microsoft or NewEgg where you happen to live, you should be able to get your three free games. They are redeemable on Steam and should be available immediately; a peek at Sports Bar VR is below.